I've eliminated the fetch failures by passing these parameters to spark-submit, running Spark 1.2.0 (I don't know which of them actually fixed the problem):

        --conf spark.shuffle.compress=false \
        --conf spark.file.transferTo=false \
        --conf spark.shuffle.manager=hash \
        --conf spark.akka.frameSize=50 \
        --conf spark.core.connection.ack.wait.timeout=600
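
For reference, the full command looks roughly like the sketch below; only the five --conf flags come from my setup, while the master, class name, and jar are placeholders. As I understand them: spark.shuffle.compress=false disables compression of shuffle output, spark.file.transferTo=false avoids the NIO transferTo path when copying shuffle files, spark.shuffle.manager=hash reverts from 1.2.0's new sort-based shuffle to the old hash-based one, spark.akka.frameSize is in MB, and spark.core.connection.ack.wait.timeout is in seconds.

        spark-submit \
          --master yarn-cluster \
          --class com.example.MyJob \
          --conf spark.shuffle.compress=false \
          --conf spark.file.transferTo=false \
          --conf spark.shuffle.manager=hash \
          --conf spark.akka.frameSize=50 \
          --conf spark.core.connection.ack.wait.timeout=600 \
          my-job.jar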

...but, like you, I'm still unable to finish a job: now I'm hitting OOMs. Still trying, but at least the fetch failures are gone.

bye

On 23/12/2014 21:10, Chen Song wrote:
I tried both 1.1.1 and 1.2.0 (built against cdh5.1.0 and hadoop2.3), but I am still seeing FetchFailedException.

On Mon, Dec 22, 2014 at 8:27 AM, steghe <stefano.ghe...@icteam.it <mailto:stefano.ghe...@icteam.it>> wrote:

    Which version of Spark are you running?

    It could be related to this
    https://issues.apache.org/jira/browse/SPARK-3633

    fixed in 1.1.1 and 1.2.0
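
    If you're not sure which version you're actually running, you can
    check from the command line or from the shell, e.g.:

        spark-submit --version
        # or, inside spark-shell:  scala> sc.version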










--
Chen Song



--
____________________________________________________________
Stefano Ghezzi                     ICTeam S.p.A
Project Manager - PMP
tel     035 4232129                fax 035 4522034
email   stefano.ghe...@icteam.it   url http://www.icteam.com
mobile  335 7308587
____________________________________________________________
