Hi Pierre,
One way is to regenerate your credentials until AWS produces a secret key without a slash character in it. Another way, which I've been using, is to pass the credentials outside the S3 file path by setting them on the Hadoop configuration (where sc is the SparkContext).
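The code that went with this suggestion did not survive the archive. A minimal sketch of that approach, assuming the s3n filesystem and its standard Hadoop configuration keys (`fs.s3n.awsAccessKeyId` / `fs.s3n.awsSecretAccessKey`); the key names are the stock Hadoop ones, not taken from the original message:

```scala
// Set the credentials on the SparkContext's shared Hadoop configuration
// instead of embedding them in the URL, so a slash in the secret key
// never needs to appear (URL-encoded or otherwise) in the path.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY_ID1")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY1")

// The path itself no longer carries credentials:
val c1 = sc.textFile("s3n://bucket1/file.csv").count()
```

Note that these properties are global to the SparkContext, so to read several buckets with different credentials you would reset them between reads rather than set them once.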
in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Access-several-s3-buckets-with-credentials-containing-tp23171.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
On 5 Jun 2015, at 08:03, Pierre B pierre.borckm...@realimpactanalytics.com
wrote:
Hi list!
My problem is quite simple.
I need to access several S3 buckets, using different credentials:
```
val c1 =
  sc.textFile("s3n://[ACCESS_KEY_ID1:SECRET_ACCESS_KEY1]@bucket1/file.csv").count
val c2