Hello!  We patched Livy with this functionality -- to bypass the global
config and permit the "occasional" job to run in local mode.  We used an
attribute in the REST URL query string to control it.
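As a rough sketch of what a client request against such a patch might look
like (the `master` query-string attribute name here is purely illustrative;
our patch's actual attribute may be named differently):

```python
import json
from urllib.request import Request

# Hypothetical query-string attribute ("master") on a patched Livy
# /sessions endpoint that overrides the globally configured cluster
# manager for just this session.
livy_url = "http://livy-host:8998/sessions?master=local"

# Standard Livy session-creation payload.
payload = {
    "kind": "pyspark",
    "name": "occasional-local-job",
}

# Build (but don't send) the POST request, to show its shape.
req = Request(
    livy_url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.full_url)
```

With stock Livy, the session runs on whatever cluster manager the global
config names; the point of the patch is to let this one request opt into
`local` instead.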

It isn't very difficult to do, but (at the time) my impression was that
this isn't a particularly _secure_ thing to allow.  It's pretty easy to
overwhelm Livy with many spark-local jobs, and if it's sitting on an edge
node serving many users... well, you probably don't want it to be easily
killed!  Our use case was particular in that we had a bit more control
over when and how jobs are scheduled before they arrive at Livy.

Let me know if you think this is something that might be interesting to
push back into the community!

Best regards, Ryan



On Sat, Feb 1, 2020 at 9:13 AM Saisai Shao <sai.sai.s...@gmail.com> wrote:

> I don't think current Livy supports such behavior; the cluster manager
> specified in the conf file is a global configuration which affects all
> created sessions.
>
> Thanks
> Saisai
>
> Ravi Shankar <r...@unifisoftware.com> wrote on Tue, Jan 28, 2020 at 4:02 AM:
>
>> Hey guys,
>> Is there a way to start different kinds of Spark sessions using the same
>> Livy server? For instance, I want my application to tell Livy whether the
>> new session being requested should be started as a "local" session or a
>> "yarn" session on the cluster.
>>
>> Ravi
>>
>
