Hi Roman,

Sorry for my silly question.
I am using the Ignite sink connector in the Confluent platform, where I have set the cache name, the Ignite configuration, and the cache-allow-overwrite properties. After deploying the connector, when I put data into the Kafka topic through the console producer, the data is successfully written into Ignite's cache. I printed record.key() and record.value() and got:

record.key() == k1
record.value() == v1

However, when I used the Kafka FileStreamSourceConnector to read a text file and push the data to the topic, I got this error:

Failed to stream a record with null key!

The content of the file I am reading looks like this:

k1,v1
k2,v2
k3,v3

My configuration file is:

bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.storage.StringConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000

Regards,
Austin

--
View this message in context: http://apache-ignite-users.70518.x6.nabble.com/Kafka-Failed-to-stream-a-record-with-null-key-tp8731p8765.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
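For reference, the console-producer run that produced keyed records was presumably started with key parsing enabled, along these lines (the topic name here is an assumption, not from the original message; parse.key and key.separator are standard console-producer properties that split each input line such as "k1,v1" into key "k1" and value "v1"):

    # Hypothetical sketch: topic name "ignite-topic" is assumed.
    # With parse.key=true and key.separator=, the console producer
    # sends each line as a keyed record instead of a null-keyed one.
    kafka-console-producer --broker-list localhost:9092 \
      --topic ignite-topic \
      --property parse.key=true \
      --property key.separator=,

Without parse.key=true, the console producer would also send null keys, so the difference between the two cases most likely comes down to whether the record key is ever set before the data reaches the Ignite sink.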
