I have sent the message there as well, but I thought I would post it here
too because I'm actually setting this up through the hadoopConf.
On Wed, Mar 8, 2017 at 6:49 PM, Ravi Prakash wrote:
> Sorry to hear about your travails.
>
> I think you might be better off asking the spark community:
Sorry to hear about your travails.
I think you might be better off asking the spark community:
http://spark.apache.org/community.html
On Wed, Mar 8, 2017 at 3:22 AM, Jonhy Stack wrote:
> Hi,
>
> I'm trying to read an S3 bucket from Spark, and up until today Spark has
> always
Hi,
I'm trying to read an S3 bucket from Spark, and up until today Spark has
always complained that the request returns a 403.
hadoopConf = spark_context._jsc.hadoopConfiguration()
hadoopConf.set("fs.s3a.access.key", "ACCESSKEY")
hadoopConf.set("fs.s3a.secret.key", "SECRETKEY")
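For reference, the same credentials can also be supplied up front through
Spark's own configuration using the spark.hadoop.* prefix (a sketch, e.g. in
spark-defaults.conf; ACCESSKEY/SECRETKEY are placeholders, not real values):

spark.hadoop.fs.s3a.access.key ACCESSKEY
spark.hadoop.fs.s3a.secret.key SECRETKEY

Properties with that prefix are copied into the Hadoop Configuration, so this
is equivalent to the hadoopConf.set calls above but avoids putting keys in
application code.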