Re: Spark execution plan

2014-07-23 Thread Luis Guerra
> it seems union should work for this scenario
>
> in part C, try to use: output_a union output_b

Re: Spark execution plan

2014-07-23 Thread chutium
it seems union should work for this scenario

in part C, try to use: output_a union output_b
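For illustration, a minimal sketch of that suggestion using Spark's Scala API, assuming output_a and output_b are RDDs of the same element type; the sample data and object name are placeholders, not taken from the original thread:

import org.apache.spark.{SparkConf, SparkContext}

object UnionSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("union-sketch").setMaster("local[*]"))

    // Stand-ins for the outputs of the two self-sufficient parts A and B
    val output_a = sc.parallelize(Seq(1, 2, 3)).map(_ * 2)
    val output_b = sc.parallelize(Seq(10, 20, 30)).filter(_ > 10)

    // Part C: union is a lazy transformation, so no job runs here
    val output_c = output_a.union(output_b)

    // An action triggers execution of the whole DAG (A, B and C together)
    println(output_c.collect().mkString(", "))

    sc.stop()
  }
}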

Spark execution plan

2014-07-23 Thread Luis Guerra
Hi all,

I was wondering how Spark may deal with an execution plan. Using Pig and its DAG execution as an example, I would like to manage a similar solution with Spark. For instance, suppose my code has 3 different "parts", with A and B being self-sufficient parts:

Part A:
...
var output_a

Part
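As a hedged sketch only, the snippet below shows how a job split into self-sufficient parts builds up a lazy lineage (Spark's DAG of RDD dependencies) that is only executed once an action is called; the file names, operations, and object name are illustrative assumptions, not taken from the original message:

import org.apache.spark.{SparkConf, SparkContext}

object PlanSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("plan-sketch").setMaster("local[*]"))

    // Part A: a self-sufficient pipeline (illustrative input path)
    val output_a = sc.textFile("part_a_input.txt").map(_.toUpperCase)

    // Part B: another self-sufficient pipeline (illustrative input path)
    val output_b = sc.textFile("part_b_input.txt").filter(_.nonEmpty)

    // Part C: combines the results of A and B
    val output_c = output_a.union(output_b)

    // Everything above is a lazy transformation: Spark has only recorded the
    // lineage so far. toDebugString prints that lineage without running anything.
    println(output_c.toDebugString)

    // The DAG covering A, B and C is scheduled and executed here, at the action.
    println(output_c.count())

    sc.stop()
  }
}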