Sankar Mittapally created SPARK-17889:
-----------------------------------------
Summary: How to update sc.hadoopConfiguration.set properties in
standalone spark cluster for SparkR
Key: SPARK-17889
URL: https://issues.apache.org/jira/browse/SPARK-17889
Project: Spark
Issue Type: Request
Components: SparkR
Affects Versions: 2.0.0
Reporter: Sankar Mittapally
Hello,
I have downloaded the jars below:
aws-java-sdk-1.7.4.jar
hadoop-aws-2.7.1.jar
for Spark 2.0.0 and Hadoop 2.7.1.
I want to access an S3 bucket from SparkR, but I am not sure how to set
these properties:
sc.hadoopConfiguration.set("fs.s3n.impl","org.apache.hadoop.fs.s3native.NativeS3FileSystem")
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", accessKey)
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", secretKey)
Please let me know which file I need to update, or whether I need to pass
these settings at runtime.
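For context: Spark forwards any configuration entry prefixed with
spark.hadoop. into Hadoop's Configuration, so one way these properties
could be supplied is at launch time rather than in a file. The sketch
below is an assumption, not a confirmed answer; the jar paths and the
use of the bin/sparkR launcher are placeholders, and the keys are the
fs.s3n ones from this report:

```shell
# Hypothetical launch of the SparkR shell against a standalone cluster.
# The spark.hadoop. prefix hands each property to Hadoop's Configuration,
# which is the runtime equivalent of sc.hadoopConfiguration.set(...).
# Jar locations and the master URL are assumptions for illustration.
./bin/sparkR \
  --master spark://master-host:7077 \
  --jars /path/to/aws-java-sdk-1.7.4.jar,/path/to/hadoop-aws-2.7.1.jar \
  --conf spark.hadoop.fs.s3n.impl=org.apache.hadoop.fs.s3native.NativeS3FileSystem \
  --conf "spark.hadoop.fs.s3n.awsAccessKeyId=${AWS_ACCESS_KEY_ID}" \
  --conf "spark.hadoop.fs.s3n.awsSecretAccessKey=${AWS_SECRET_ACCESS_KEY}"
```

The same keys could instead go into conf/spark-defaults.conf with the
spark.hadoop. prefix, which avoids putting credentials on the command line.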
Thanks in advance,
Sankar
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]