Hi Blaz,

I did, the same result

Thank you,
Konstantin Kudryavtsev

On Wed, Dec 30, 2015 at 12:54 PM, Blaž Šnuderl <snud...@gmail.com> wrote:

> Try setting s3 credentials using keys specified here
> https://github.com/Aloisius/hadoop-s3a/blob/master/README.md
>
> Blaz
> On Dec 30, 2015 6:48 PM, "KOSTIANTYN Kudriavtsev" <
> kudryavtsev.konstan...@gmail.com> wrote:
>
>> Dear Spark community,
>>
>> I faced the following issue when trying to access data on S3a; my code is
>> the following:
>>
>> import org.apache.spark.{SparkConf, SparkContext}
>> import org.apache.spark.sql.SQLContext
>>
>> val sparkConf = new SparkConf()
>>
>> val sc = new SparkContext(sparkConf)
>> sc.hadoopConfiguration.set("fs.s3a.impl", 
>> "org.apache.hadoop.fs.s3a.S3AFileSystem")
>> sc.hadoopConfiguration.set("fs.s3a.access.key", "---")
>> sc.hadoopConfiguration.set("fs.s3a.secret.key", "---")
>>
>> val sqlContext = SQLContext.getOrCreate(sc)
>>
>> val df = sqlContext.read.parquet(...)
>>
>> df.count
>>
>>
>> It results in the following exception and log messages:
>>
>> 15/12/30 17:00:32 DEBUG AWSCredentialsProviderChain: Unable to load 
>> credentials from BasicAWSCredentialsProvider: Access key or secret key is
>> null
>> 15/12/30 17:00:32 DEBUG EC2MetadataClient: Connecting to EC2 instance 
>> metadata service at URL: 
>> http://x.x.x.x/latest/meta-data/iam/security-credentials/
>> 15/12/30 17:00:32 DEBUG AWSCredentialsProviderChain: Unable to load credentials from
>> InstanceProfileCredentialsProvider: The requested metadata is not found at 
>> http://x.x.x.x/latest/meta-data/iam/security-credentials/
>> 15/12/30 17:00:32 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 3)
>> com.amazonaws.AmazonClientException: Unable to load AWS credentials from any 
>> provider in the chain
>>      at 
>> com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:117)
>>      at 
>> com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3521)
>>      at 
>> com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1031)
>>      at 
>> com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:994)
>>      at 
>> org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:297)
>>
>>
>> I run standalone Spark 1.5.2 with Hadoop 2.7.1.
>>
>> Any ideas or workarounds?
>>
>> The AWS credentials are correct for this bucket.
>>
>> Thank you,
>> Konstantin Kudryavtsev
>>
>
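[Editor's note] One possible sketch of the workaround Blaž points at: a common cause of this exception is that `sc.hadoopConfiguration.set(...)` only takes effect on the driver, while the S3A filesystem is also initialized on each executor. Passing the same `fs.s3a.*` keys as `spark.hadoop.*` properties at submit time propagates them to the whole cluster. This assumes Spark's `spark.hadoop.*` pass-through (properties with that prefix are copied into the Hadoop configuration); the key names come from the hadoop-s3a README linked above, and the application jar name is a placeholder.

```shell
# Sketch: pass the S3a credentials through spark-submit so they are copied
# into the Hadoop Configuration on the driver AND the executors.
# (spark.hadoop.<key> properties are forwarded into hadoopConfiguration.)
spark-submit \
  --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
  --conf spark.hadoop.fs.s3a.access.key=YOUR_ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=YOUR_SECRET_KEY \
  your-app.jar  # placeholder for the actual application jar

# Alternative: put the same fs.s3a.* properties in core-site.xml on every
# node, so the credentials are picked up without touching Spark code at all.
```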
