SpaceMiao commented on PR #53:
URL: https://github.com/apache/flink-connector-elasticsearch/pull/53#issuecomment-1963974534

   @mtfelisb  Hi, I want to read data from Kafka as byte[] and then write those 
byte[] to Elasticsearch with your `Elasticsearch8AsyncSinkBuilder`. Here is the 
core of my code:
   ```
   KafkaSource<byte[]> bytesKafkaSource = KafkaSource.<byte[]>builder()
       .setValueOnlyDeserializer(new ByteDeserializationSchema())
       // ... other Kafka settings elided ...
       .build();

   StreamExecutionEnvironment env =
       StreamExecutionEnvironment.createLocalEnvironmentWithWebUI();

   env.fromSource(bytesKafkaSource)
       .sinkTo(Elasticsearch8AsyncSinkBuilder.<byte[]>builder()
           .setElementConverter((element, ctx) ->
               new IndexOperation.Builder<>()
                   .index()
                   .document(element)
                   .build())
           .build());
   ```
   I got an unexpected result: only a small part of the data, from the beginning 
of the stream, was written to ES.
   Did you test this case before? I'm not sure whether it's because the Kryo 
serializer can't handle byte[] data.
   Or could you give me some advice about it?
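   One idea I am considering (not sure it is the right fix): since the sink 
passes whatever `document(...)` receives through the Elasticsearch client's JSON 
mapper, a raw byte[] may not serialize the way I expect. A minimal sketch of 
wrapping the bytes in a Base64 string field first — the class name and the 
`payload` field name are just placeholders:
   ```
   import java.util.Arrays;
   import java.util.Base64;
   import java.util.Map;

   public class BinaryDocExample {

       // Wrap a raw byte[] payload in a JSON-friendly map; "payload" is a
       // placeholder field name. Base64 keeps the bytes safe for a JSON mapper.
       public static Map<String, String> toDocument(byte[] payload) {
           return Map.of("payload", Base64.getEncoder().encodeToString(payload));
       }

       public static void main(String[] args) {
           byte[] raw = {1, 2, 3, (byte) 0xFF};
           Map<String, String> doc = toDocument(raw);
           // The document now holds a plain string; decoding restores the bytes.
           byte[] roundTrip = Base64.getDecoder().decode(doc.get("payload"));
           System.out.println(Arrays.equals(raw, roundTrip));  // prints "true"
       }
   }
   ```
   The element converter would then pass `toDocument(element)` instead of the 
raw byte[] to `document(...)`.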


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
