Not possible as of today: Spark's scheduler submits a stage only after all of its parent stages have completed, and a repartition introduces a shuffle boundary between the two stages. See
https://issues.apache.org/jira/browse/SPARK-2387
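To make the scheduling rule concrete, here is a toy model in plain Python (an illustration only, not Spark's actual DAGScheduler code; all names are made up for this sketch). It shows why the repartition stage cannot begin until every task of the map stage has finished:

```python
# Toy model of Spark's stage-scheduling rule: a child stage is submitted
# only after ALL tasks of its parent stages have finished. Illustrative
# sketch only -- not Spark's real implementation.

class Stage:
    def __init__(self, name, num_tasks, parents=()):
        self.name = name
        self.num_tasks = num_tasks
        self.parents = list(parents)

def run_job(final_stage):
    """Run stages in dependency order; return an ordered event log."""
    events = []

    def submit(stage):
        for parent in stage.parents:        # parents must fully finish first
            submit(parent)
        events.append((stage.name, "start"))
        for _ in range(stage.num_tasks):    # every task of this stage runs
            pass                            # (task body omitted in this toy)
        events.append((stage.name, "finish"))

    submit(final_stage)
    return events

# Modeling rdd.map(function, 100).repartition(30): the map stage has 100
# tasks; repartition(30) creates a shuffle boundary, so its stage depends
# on the map stage.
map_stage = Stage("map", num_tasks=100)
repartition_stage = Stage("repartition", num_tasks=30, parents=[map_stage])

log = run_job(repartition_stage)
# The repartition stage starts only after the map stage finishes:
assert log.index(("map", "finish")) < log.index(("repartition", "start"))
```

So even if 10 of the 100 map tasks have finished, the shuffle-fetch side of the repartition stage is not started early; the parent stage must complete first.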

Hemant Bhanawat
https://www.linkedin.com/in/hemant-bhanawat-92a3811
www.snappydata.io

On Thu, Feb 18, 2016 at 1:19 PM, Shushant Arora <shushantaror...@gmail.com>
wrote:

> Can two stages of a single job run in parallel in Spark?
>
> E.g., one stage is a map transformation and the other is a repartition of
> the mapped RDD:
>
> rdd.map(function,100).repartition(30);
>
> Can it happen that, while the map transformation is running its 100 tasks
> and a few of them (say 10) have finished, Spark starts the repartition
> stage, which begins copying data from the mapped stage's nodes in parallel?
>
> Thanks
>
