Hi,

does anybody know if (and how) it's possible to get a (dev-local) Spark
installation to talk to fakes3 for s3[n|a]:// URLs?

I have managed to connect to AWS S3 from my local installation by adding
hadoop-aws and aws-java-sdk to the jars and using s3:// URLs as arguments
to SparkContext#textFile(), but I'm at a loss as to how to get it to work
against a local fakes3.
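
For what it's worth, my current (non-working) attempt is roughly the
following spark-defaults.conf fragment -- the endpoint, the port (4567,
which I believe is what fakes3 commonly listens on), and the dummy
credentials are all assumptions on my part, not something I've verified:

```
# Sketch only: point the s3a connector at a local fakes3 instead of AWS.
# Requires a Hadoop version whose s3a client supports a custom endpoint.
spark.hadoop.fs.s3a.endpoint                http://localhost:4567
# fakes3 serves buckets in the path, not as virtual hosts:
spark.hadoop.fs.s3a.path.style.access       true
# Plain HTTP, no TLS on the local fake:
spark.hadoop.fs.s3a.connection.ssl.enabled  false
# fakes3 doesn't check credentials, but the client wants some set:
spark.hadoop.fs.s3a.access.key              dummy
spark.hadoop.fs.s3a.secret.key              dummy
```

With that in place I would expect s3a://some-bucket/some-key URLs passed
to textFile() to be resolved against the local server, but I may well be
missing something.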

The only reference I've found so far is this issue, where somebody seems
to have gotten close, but unfortunately no longer remembers the details:

https://github.com/jubos/fake-s3/issues/108

Thanks and best regards,
Patrick
