Marcelo Vanzin resolved SPARK-5925.
    Resolution: Won't Fix

I don't think this can be fixed in Spark at all. There's no way to know 
beforehand how many jobs, tasks, or stages an app will run. Imagine a 
long-running spark-shell where the user is running a lot of small jobs... 
what's the progress of the overall app?

There's just a mismatch between the YARN API and how Spark works. The YARN API 
makes a lot of sense for MapReduce apps; it doesn't make sense for Spark. The 
only way around it would be for Spark to expose its own API for applications 
to report progress and then proxy that information to YARN, but I don't see 
that happening.
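The point can be made concrete with a small sketch. `estimate_progress` below is a hypothetical helper, not Spark code: for a batch job with a fixed task count, the fraction of finished tasks is a usable progress proxy, but for an open-ended app such as a spark-shell the denominator keeps growing as new jobs are submitted, so the reported fraction is essentially meaningless.

```python
def estimate_progress(completed_tasks: int, known_total_tasks: int) -> float:
    """Return a progress fraction in [0.0, 1.0].

    Hypothetical proxy for app progress: completed tasks over the tasks
    known so far. Returns 0.0 when no tasks are known yet, and clamps at
    1.0 in case the completed count briefly overshoots a stale total.
    """
    if known_total_tasks <= 0:
        return 0.0
    return min(1.0, completed_tasks / known_total_tasks)


# For a fixed-size batch job this behaves sensibly:
print(estimate_progress(5, 10))   # halfway through a 10-task job

# For an interactive session, each new job inflates the total, so the
# value can move backwards -- which is exactly the mismatch with YARN's
# single monotonic progress bar.
print(estimate_progress(10, 10))  # looks "done"...
print(estimate_progress(10, 50))  # ...until the user submits more work
```

The backwards jump in the last two calls is why a single scalar progress value, as YARN expects, cannot faithfully describe a long-running Spark app.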

> YARN - Spark progress bar gets stuck at 10% but shows 100% after finishing
> ---------------------------------------------------------------------------
>                 Key: SPARK-5925
>                 URL: https://issues.apache.org/jira/browse/SPARK-5925
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 1.2.1
>            Reporter: Laszlo Fesus
>            Priority: Minor
> I set up a YARN cluster (CDH5) with Spark (1.2.1), and also started the Spark 
> History Server. I am now able to click "more details" in YARN's web 
> interface and get redirected to the appropriate Spark logs, both during job 
> execution and after the job has finished. 
> My only concern is that while a Spark job is executing (in either 
> yarn-client or yarn-cluster mode), the progress bar gets stuck at 10% and 
> doesn't increase the way it does for MapReduce jobs. After the job finishes, 
> it correctly shows 100%, but we lose the real-time tracking capability of 
> the status bar. 
> I also tested YARN's RESTful web interface, and it likewise reports 10% 
> during (YARN) Spark job execution, and again works correctly after the job 
> finishes. (I suppose for the time being I should have a look at Spark Job 
> Server and see if it's possible to track the job via its RESTful web 
> interface.)
> Did anyone else experience this behaviour? Thanks in advance.
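The REST behaviour described above can be checked against the ResourceManager's application report, which exposes a `progress` field for each app (typically at `http://<rm-host>:8088/ws/v1/cluster/apps/<app-id>`). A minimal sketch of extracting it is shown below; the sample payload is made up for illustration, with only the relevant fields included, and `app_progress` is a hypothetical helper rather than part of any YARN client library.

```python
import json

# Made-up excerpt of a ResourceManager REST response for one application;
# real responses contain many more fields.
sample = (
    '{"app": {"id": "application_1424000000000_0001",'
    ' "state": "RUNNING", "progress": 10.0}}'
)

def app_progress(body: str) -> float:
    """Extract the progress percentage YARN reports for an application."""
    return json.loads(body)["app"]["progress"]

print(app_progress(sample))  # 10.0 -- the stuck value described in the report
```

Polling this endpoint during a Spark run reproduces the behaviour in the report: the value stays at 10.0 until the application finishes, then jumps to 100.0.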

This message was sent by Atlassian JIRA
