Hi,
I have written a sample Spark job that reads data residing in HBase. I
keep getting the error below; any suggestions on how to resolve it?
Caused by: java.lang.IllegalArgumentException: AWS Access Key ID and Secret
Access Key must be specified by setting the fs.s3.awsAccessKeyId and
fs.s3.awsSecretAccessKey properties (respectively).
at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:74)
conf.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem")
conf.set("fs.s3.awsAccessKeyId", "ddd")
conf.set("fs.s3.awsSecretAccessKey", "dddddd")
conf.set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
conf.set("fs.s3n.awsAccessKeyId", "xxxxxxx")
conf.set("fs.s3n.awsSecretAccessKey", "xxxx")
I tried these settings in both the Spark configuration and the HBase
configuration, but neither resolved the issue.
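For context, here is a minimal sketch of what I am attempting. The names (the app name, the `sc` variable, and the commented-out S3 path) are placeholders; the key point is that the credentials are set on the Hadoop configuration that Spark itself uses for filesystem access, not only on the standalone `conf` object shown above:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Build the Spark context as usual (app name is a placeholder).
val sparkConf = new SparkConf().setAppName("HBaseToS3Sample")
val sc = new SparkContext(sparkConf)

// Set the S3 credentials on Spark's own Hadoop configuration,
// which is the one consulted when a job touches an s3n:// path.
sc.hadoopConfiguration.set("fs.s3n.impl",
  "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "xxxxxxx")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "xxxx")

// Example write to S3 (bucket and path are placeholders):
// someRdd.saveAsTextFile("s3n://my-bucket/output")
```

The same properties can also be passed via `spark.hadoop.`-prefixed keys in the Spark configuration (e.g. `spark.hadoop.fs.s3n.awsAccessKeyId`), which Spark copies onto its Hadoop configuration at startup.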
Thanks,
Asmath