Hi all,

I was wondering how Spark builds and runs its execution plan. Taking Pig and
its DAG execution as an example, I would like to achieve a similar solution
with Spark.

For instance, suppose my code has 3 different "parts", where A and B are
self-sufficient (independent of each other):

Part A:
......
.....
.....
var output_a
Part B:
.....
......
.....
var output_b
Part C:
....
...using output_a and output_b
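
To make the sketch concrete, here is roughly what I mean in Spark code,
assuming a SparkContext sc (e.g. in spark-shell). The input paths and the
transformations are just placeholders I made up for illustration:

// Part A: self-sufficient pipeline (path and transformations are placeholders)
val output_a = sc.textFile("hdfs:///input_a")
  .filter(_.nonEmpty)
  .map(_.toUpperCase)

// Part B: self-sufficient pipeline, independent of A
val output_b = sc.textFile("hdfs:///input_b")
  .filter(_.nonEmpty)
  .map(_.toLowerCase)

// Part C: depends on both A and B
val output_c = output_a.union(output_b)
output_c.saveAsTextFile("hdfs:///output_c")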

What would the execution plan look like in Spark? Could parts A and B somehow
be executed in parallel?

Related to this, are there thread/concurrency facilities in Scala (e.g.
Futures)? Could that be a solution for this scenario?
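
For example, something like this is what I have in mind, building on the
sketch above. It is a rough, untested idea using scala.concurrent.Future; as
far as I understand, SparkContext is thread-safe for job submission, so the
two actions could run as concurrent jobs:

import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

// submit the two independent actions from separate threads, so the
// scheduler can run the jobs for parts A and B concurrently
val future_a = Future { output_a.count() }  // action triggering part A
val future_b = Future { output_b.count() }  // action triggering part B

// wait for both jobs before continuing with part C
val count_a = Await.result(future_a, Duration.Inf)
val count_b = Await.result(future_b, Duration.Inf)

Is this a reasonable approach, or does Spark offer a better mechanism?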

Regards
