You should be able to use s3a (on newer Hadoop versions); I believe it will try V4 signing, or at least has a setting for it.
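As a rough sketch, an s3a setup looks like the s3n one below but with the s3a property names (this assumes the hadoop-aws module and its AWS SDK dependency are on the classpath; the endpoint value is just an example region):

```scala
// Sketch: s3a credentials on newer Hadoop versions
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", "key")
hadoopConf.set("fs.s3a.secret.key", "secret")
// Regions that only accept V4 signing need the regional endpoint set:
hadoopConf.set("fs.s3a.endpoint", "s3.eu-central-1.amazonaws.com")

// Then read with an s3a:// URI instead of s3:// or s3n://
val rdd = sc.textFile("s3a://my-bucket/my-file.txt")
```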
On Tue, Jun 30, 2015 at 8:31 PM, Exie <tfind...@prodevelop.com.au> wrote:
> Not sure if this helps, but the options I set are slightly different:
>
> val hadoopConf = sc.hadoopConfiguration
> hadoopConf.set("fs.s3n.awsAccessKeyId", "key")
> hadoopConf.set("fs.s3n.awsSecretAccessKey", "secret")
>
> Try setting them to s3n as opposed to just s3.
>
> Good luck!
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/s3-bucket-access-read-file-tp23536p23560.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.