jonaswagner opened a new issue #1107:
URL: https://github.com/apache/camel-kafka-connector/issues/1107


   Hi all,
   
   This issue relates to the conversation in the Camel Zulip chat:
https://camel.zulipchat.com/#narrow/stream/257303-camel-kafka-connector/topic/aws2.20s3.20source.20connector.20ClassCastException.20for.20value.2Econver
   
   As discussed with @luigidemasi, I get a ClassCastException when I try to use the value.converter property for my AWS2 S3 source connector.
   
   This is the config I used:
   
   spec:
     class: org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SourceConnector
     tasksMax: 1
     config:
       camel.source.path.bucketNameOrArn: my-bucket-name
       camel.source.endpoint.autoCreateBucket: false
       camel.source.endpoint.region: my-region
       camel.source.endpoint.deleteAfterRead: true
       camel.source.endpoint.fileName: devl/shared/targetfile.csv
       camel.source.endpoint.accessKey: "*"
       camel.source.endpoint.secretKey: "*"
       camel.source.maxPollDuration: 10000
       camel.source.endpoint.delay: 10000
       camel.source.endpoint.autocloseBody: false
       #key.converter: org.apache.kafka.connect.storage.StringConverter
       #value.converter: org.apache.camel.kafkaconnector.awss3.converters.S3ObjectConverter
       topics: test-name.targetfile.source.s3
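
   The key.converter and value.converter lines are shown commented out in the snippet above; the ClassCastException below appears when I enable the value converter, i.e. with something like:

     config:
       # ... same settings as above ...
       value.converter: org.apache.camel.kafkaconnector.awss3.converters.S3ObjectConverter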
   
   The exception I get is the following:
   
   2021-03-16 07:23:23,385 INFO WorkerSourceTask{id=test-name-source-s3-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask) [task-thread-test-name-source-s3-0]
   2021-03-16 07:23:23,385 INFO WorkerSourceTask{id=test-name-source-s3-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask) [task-thread-test-name-source-s3-0]
   2021-03-16 07:23:23,385 ERROR WorkerSourceTask{id=test-name-source-s3-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask) [task-thread-test-name-source-s3-0]
   org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
       at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
       at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
       at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:295)
       at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:321)
       at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:245)
       at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)
       at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)
       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to com.amazonaws.services.s3.model.S3ObjectInputStream
       at org.apache.camel.kafkaconnector.awss3.converters.S3ObjectConverter.fromConnectData(S3ObjectConverter.java:37)
       at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
       at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:295)
       at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
       at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
       ... 11 more
   2021-03-16 07:23:23,385 ERROR WorkerSourceTask{id=test-name-source-s3-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask) [task-thread-test-name-source-s3-0]
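
   For context, my reading of the stack trace is that the source task hands the record value to the converter as a plain java.lang.String, while S3ObjectConverter casts it to an S3ObjectInputStream. A minimal sketch of that failing cast (illustrative only; the class and method here are made up for the example and are not the actual converter source):

   import com.amazonaws.services.s3.model.S3ObjectInputStream;

   public class ConverterCastSketch {

       // Roughly what S3ObjectConverter.fromConnectData appears to do at line 37 of the
       // stack trace: assume the record value is an S3ObjectInputStream and cast it.
       public static byte[] fromConnectData(Object value) {
           S3ObjectInputStream stream = (S3ObjectInputStream) value; // ClassCastException when value is a String
           return new byte[0]; // placeholder; the real converter would serialize the stream
       }

       public static void main(String[] args) {
           // The aws2-s3 source task seems to produce the object content as a String,
           // so this reproduces the same ClassCastException as in the log above.
           fromConnectData("contents of targetfile.csv");
       }
   }

   It also looks like the converter class comes from the older awss3 (AWS SDK v1) package, while the connector itself is the aws2s3 one, which might be related to the type mismatch.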
   
   My expectation is that I can upload binary files to S3 and have the connector convert them into S3Objects for my Kafka topic.
   
   Thank you for investigating this issue.
   Jonas
   

