Hi, I am kind of in your situation now while trying to read from S3. Were you able to find a workaround in the end?
Thanks, Erisa

On Thu, Nov 12, 2015 at 12:00 PM, aecc <alessandroa...@gmail.com> wrote:
> Some other stats:
>
> The number of files I have in the folder is 48.
> The number of partitions used when reading the data is 7,315.
> The maximum size of a single file is 14 GB.
> The total size of the folder is around 270 GB.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-task-hangs-infinitely-when-accessing-S3-from-AWS-tp25289p25367.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.