From: Jason Nerothin
Reply: Jason Nerothin
Date: March 28, 2019 at 2:53:05 PM
To: Pat Ferrel
Cc: Felix Cheung, Marcelo Vanzin, user
Subject: Re: spark.submit.deployMode: cluster

Check out the Spark Jobs API... it sits behind a REST service...

On Thu, Mar 28, 2019 at 12:29 Pat Ferrel wrote:
> ;-)
>
> Great idea. Can you suggest a project?
>
> Apache PredictionIO uses spark-submit (very ugly) and Apache Mahout only
> […]ps since most uses are as a lib.
>
>
> From: Felix Cheung
> Reply: Felix Cheung
> Date: March 28, 2019 at 9:42:31 AM
> To: Pat Ferrel, Marcelo Vanzin
> Cc: user
> Subject: Re: spark.submit.deployMode: cluster
> If anyone wants to improve docs please create a PR.
>
> lol
>
> But seriously you might want to explore other projects that manage job
> submission on top of spark instead of rolling your own with spark-submit.
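One example of the kind of project Felix may mean (my assumption, not named in the thread) is Apache Livy, which manages Spark job submission behind a REST endpoint. A rough sketch of a batch submission — the host, jar path, and class name below are hypothetical:

```shell
# Submit a Spark batch job through Livy's REST API (default port 8998).
# "file" and "className" point at your application jar and main class;
# the values here are placeholders.
curl -X POST http://livy-host:8998/batches \
  -H 'Content-Type: application/json' \
  -d '{
        "file": "hdfs:///jobs/my-job.jar",
        "className": "com.example.MyJob"
      }'
```

Livy then tracks the batch and exposes its state over the same REST API, which avoids shelling out to spark-submit from your own server.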
From: Pat Ferrel
To: Marcelo Vanzin
Cc: user
Subject: Re: spark.submit.deployMode: cluster
Ahh, thank you indeed!

It would have saved us a lot of time if this had been documented. I know, OSS
so contributions are welcome… I can also imagine your next comment: "If anyone
wants to improve docs see the Apache […]"
From: Marcelo Vanzin
Reply: Marcelo Vanzin
Date: March 26, 2019 at 1:59:36 PM
To: Pat Ferrel
Cc: user
Subject: Re: spark.submit.deployMode: cluster
If you're not using spark-submit, then that option does nothing.

If by "context creation API" you mean "new SparkContext()" or an
equivalent […]
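To make the distinction concrete, here is a sketch (master URL, class name, and jar are made up): deploy mode is a spark-submit concern, expressed on the spark-submit command line, which is why setting it on a context you create in-process has no effect.

```shell
# --deploy-mode only has meaning when the job goes through spark-submit.
# Setting spark.submit.deployMode on a SparkContext you construct
# yourself inside your own server process does nothing.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyJob \
  my-job.jar
```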
I have a server that starts a Spark job using the context creation API. It
DOES NOT use spark-submit.

I set spark.submit.deployMode = “cluster”

In the GUI I see 2 workers with 2 executors. The link for running
application “name” goes back to my server, the machine that launched the
job.