arjun180 opened a new issue #1217:
URL: https://github.com/apache/camel-kafka-connector/issues/1217
Hello,
I'm working on getting the aws2-s3 source connector to work, but it's throwing a Java bean binding error on one of the parameters. I used a similar set of parameters when configuring my sink connector, and that works just fine.
Error:
```
Caused by: org.apache.camel.PropertyBindingException: Error binding property
(camel.component.aws2-s3.useDefaultCredentials Provider=true) with name:
useDefaultCredentials Provider on bean:
org.apache.camel.component.aws2.s3.AWS2S3Component@72daf76 with value: true
```
My KafkaConnector YAML file:
```
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: s3-source-connector
  labels:
    strimzi.io/cluster: my-connect-cluster
spec:
  class: org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SourceConnector
  tasksMax: 1
  config:
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.storage.S3ObjectConverter
    topics: my-topic
    camel.source.path.bucketNameOrArn: my-cluster-connect
    camel.source.endpoint.useDefaultCredentials Provider: true
    camel.component.aws2-s3.useDefaultCredentials Provider: true
    camel.component.aws2-s3.fileName: <file-name>
    camel.source.endpoint.region: <region>
```
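One observation, in case it helps with diagnosis: the error names the property as `useDefaultCredentials Provider`, with an embedded space, and the same space appears in my config keys above. The Camel aws2-s3 option is normally written as a single camel-cased word, `useDefaultCredentialsProvider`. A variant of the two affected lines with that spelling (untested on my end, so treat it as a guess) would look like:

```yaml
# Hypothetical variant of the two credentials-provider keys, with the
# embedded space removed (useDefaultCredentialsProvider), which is how
# the Camel aws2-s3 option is usually spelled:
    camel.source.endpoint.useDefaultCredentialsProvider: true
    camel.component.aws2-s3.useDefaultCredentialsProvider: true
```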
When I was working on the sink connector, I had to change the IAM permissions for the EC2 instance hosting the pod and give it access to the S3 bucket (https://github.com/apache/camel-kafka-connector/issues/282). I was wondering if there is something I need to change on the S3 bucket side of things this time?
Appreciate the help
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]