Hi,
I am trying to build a Kafka-Spark-streaming-Cassandra pipeline. I found a working solution in a Jupyter notebook and am trying to reproduce it as a standalone Python script. However, the Kafka producer terminates with the error below.
Traceback (most recent call last):
  File "kafkaSendData.py", line 13, in <module>
    producer.send('test', message)
  File "kafka\producer\kafka.py", line 504, in send
    self._wait_on_metadata(topic, self.config['max_block_ms'] / 1000.0)
  File "kafka\producer\kafka.py", line 631, in _wait_on_metadata
    "Failed to update metadata after %.1f secs." % max_wait)
kafka.errors.KafkaTimeoutError: KafkaTimeoutError: Failed to update metadata after 60.0 secs.
Below is the Python code snippet.
------------------------------------------------------------
import os
import time
from pyspark import SparkContext
from kafka import KafkaProducer

os.environ['PYSPARK_SUBMIT_ARGS'] = ('--conf spark.ui.port=4041 '
    '--packages org.apache.kafka:kafka_2.11:0.10.0.0,'
    'org.apache.kafka:kafka-clients:0.10.0.0 pyspark-shell')
sc = SparkContext("local[1]", "KafkaSendStream")

# api_version is documented as a tuple of ints (a version string is only
# accepted for backward compatibility)
producer = KafkaProducer(bootstrap_servers='192.168.99.100:9092',
                         api_version=(0, 10, 1))
while True:
    message = time.strftime("%Y-%m-%d %H:%M:%S")
    # kafka-python expects bytes unless a value_serializer is configured
    producer.send('test', message.encode('utf-8'))
    time.sleep(1)
------------------------------------------------------------
Note: Docker is used to start Kafka.
Please advise how to resolve the issue.
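A "Failed to update metadata" timeout usually means the producer never reached the broker at all, which with Docker is often a port-mapping or advertised.listeners problem rather than a code bug. Before digging into Kafka configuration, it may help to confirm basic TCP reachability from the host running the script; this is a minimal stdlib sketch (the helper name `broker_reachable` is mine, and `192.168.99.100:9092` is taken from the snippet above):

```python
import socket

def broker_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Quick check against the docker-machine address used by the producer.
if not broker_reachable('192.168.99.100', 9092, timeout=2.0):
    print('Broker not reachable: check the Docker port mapping '
          'and the advertised.listeners setting of the broker')
```

If this prints the warning, no Kafka client setting on the producer side will help until the broker is actually reachable at that address and port.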
Thanks and regards
Jagannath S Bilgi
