I'm trying to set up my cluster with ELK and Kafka inside Docker containers, but Logstash can never consume data from Kafka. The producer runs on my local machine, not inside Docker. I'd appreciate any help.
docker-compose:
zoo1:
  image: confluentinc/cp-zookeeper
  restart: always
  container_name: zoo1
  ports:
    - "2181:2181"
  environment:
    - ZOO_MY_ID=1
    - ZOO_SERVERS=2181
    - ZOOKEEPER_CLIENT_PORT=2181
    - ALLOW_ANONYMOUS_LOGIN=yes

kafka:
  image: confluentinc/cp-kafka
  hostname: kafka
  container_name: kafka
  depends_on:
    - zoo1
  ports:
    - "9092:9092"
  environment:
    KAFKA_BROKER_ID: 1
    KAFKA_ZOOKEEPER_CONNECT: zoo1:2181
    KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
    # KAFKA_LISTENERS: PLAINTEXT_HOST://0.0.0.0:9092, PLAINTEXT://kafka:29092
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9093,PLAINTEXT_HOST://localhost:9092
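I suspect the advertised listeners don't line up with the bound listeners (PLAINTEXT://kafka:9093 is advertised but nothing listens on 9093). A minimal sketch of a consistent pair, assuming Logstash sits on the same Compose network and reaches the broker as kafka:29092 while the host producer keeps using localhost:9092 (the 29092 port is illustrative), would be:

  environment:
    # sketch: each advertised listener needs a matching bound listener on the same port
    KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT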
logstash.conf:

input {
  kafka {
    topics            => ["topic-ex"]
    bootstrap_servers => "localhost:9092"
  }
}

Logstash keeps reporting the error described in this question: https://stackoverflow.com/questions/66828177/bootstrap-broker-localhost9092-id-1-rack-null-disconnected
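Since Logstash runs in a container, "localhost:9092" points at the Logstash container itself rather than the broker. A sketch of the input using the container-network address instead, assuming the kafka:29092 internal listener from the sketch above:

input {
  kafka {
    topics            => ["topic-ex"]
    bootstrap_servers => "kafka:29092"   # Compose service name, reachable from other containers
  }
}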