Hi there, 
My application is quite simple: it reads huge files from HDFS with
textFile(), maps the records to tuples, runs a reduceByKey(), and finally
calls saveAsTextFile().
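
To make it concrete, the job looks roughly like this (the input path, the
tab-separated parsing, and the key choice are just placeholders, not my real
code; sc is the usual SparkContext from the driver):

    // rough shape of the job, with placeholder path and parsing
    val lines = sc.textFile("hdfs:///path/to/input")   // read the huge input
    val pairs = lines.map { line =>
      val fields = line.split("\t")                    // assume tab-separated records
      (fields(0), 1L)                                  // key -> count
    }
    val counts = pairs.reduceByKey(_ + _)              // 2nd stage: the shuffle happens here
    counts.saveAsTextFile("hdfs:///path/to/output")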

The problem is that with large inputs (2.5 TB), when the application enters
the second stage (the reduceByKey), it fails with a FileNotFoundException
while trying to fetch the temporary shuffle files. I also see a timeout
(120 s) error before that exception. There is no other exception or error
(no OOM, no "too many open files", etc.).

I have done a lot of Google searching and tried increasing the executor
memory, repartitioning the RDD into more splits, etc., but in vain (a rough
sketch of what I tried is below).
I also found another post here:
http://permalink.gmane.org/gmane.comp.lang.scala.spark.user/5449
which describes exactly the same problem as mine.
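
For reference, this is approximately what I tried; the memory size and the
split count below are only illustrative values, not my actual settings:

    // 1) resubmit with more executor memory (example value only)
    //    spark-submit --executor-memory 16g ...
    // 2) repartition the RDD into more splits before the shuffle
    val pairs = sc.textFile("hdfs:///path/to/input")
      .map(line => (line.split("\t")(0), 1L))
      .repartition(4000)                 // example split count
    val counts = pairs.reduceByKey(_ + _)
    counts.saveAsTextFile("hdfs:///path/to/output")

Neither change made a difference; the job still fails at the same point.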

Any ideas? Thanks so much for the help.



