Hi,

I have a question about yarn-cluster mode and
spark.driver.allowMultipleContexts for the use case below.

I have a long-running backend server that creates a short-lived
Spark job in response to each user request. Given that, by default,
multiple SparkContexts cannot be created in the same JVM, it looks
like I have two choices:

1) enable spark.driver.allowMultipleContexts

2) run my jobs in yarn-cluster mode instead of yarn-client

For 1), I cannot find any official documentation, so it looks like
it's not encouraged, is it?
For 2), I want to make sure yarn-cluster mode will NOT hit this
limitation (a single SparkContext per JVM). Apparently I will have to
do something on the driver side to push the result set back to my
application.

Thanks

-- 
--Anfernee
