rangareddy commented on issue #8712:
URL: https://github.com/apache/hudi/issues/8712#issuecomment-2522437596

   To write data to S3 using the Hudi Sink Connector, you also need to supply
S3-related parameters (credentials, endpoint, and the S3A filesystem settings),
which are missing from your connector configuration file.
   
   ```json
   {
     "name": "hudi-test-topic",
     "config": {
       "bootstrap.servers": "xx.xx.xx.xx:9092",
       "connector.class": "org.apache.hudi.connect.HoodieSinkConnector",
       "tasks.max": "1",
       "key.converter": "org.apache.kafka.connect.storage.StringConverter",
       "value.converter": "org.apache.kafka.connect.storage.StringConverter",
       "value.converter.schemas.enable": "false",
       "topics": "hudi-test-topic",
       "hoodie.table.name": "test_hudi_table",
       "hoodie.table.type": "MERGE_ON_READ",
       "hoodie.base.path": "s3a://",
       "hoodie.datasource.write.partitionpath.field": "date",
       "hoodie.datasource.write.recordkey.field": "volume",
       "hoodie.kafka.commit.interval.secs": 60
     }
   }
   ```
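   When the base path uses the `s3a://` scheme, the S3 credentials and endpoint are typically supplied through the Hadoop configuration rather than the connector JSON, for example via a `core-site.xml` on the Connect worker's classpath. A minimal sketch, assuming static access keys (all values below are placeholders, not taken from your setup):

   ```xml
   <!-- core-site.xml: Hadoop S3A settings picked up by the Hudi Sink Connector.
        Replace the placeholder values with your own credentials/endpoint. -->
   <configuration>
     <property>
       <name>fs.s3a.impl</name>
       <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
     </property>
     <property>
       <name>fs.s3a.access.key</name>
       <value>YOUR_AWS_ACCESS_KEY</value>
     </property>
     <property>
       <name>fs.s3a.secret.key</name>
       <value>YOUR_AWS_SECRET_KEY</value>
     </property>
     <property>
       <name>fs.s3a.endpoint</name>
       <value>s3.amazonaws.com</value>
     </property>
   </configuration>
   ```

   Note that the `hadoop-aws` and AWS SDK jars must also be on the Connect worker's classpath for the `s3a://` scheme to resolve, and `hoodie.base.path` should point at a full bucket path rather than the bare scheme.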
   
   

