You can do the same thing in Zeppelin: either set it on the interpreter
setting page or use the inline generic configuration.

http://zeppelin.apache.org/docs/0.8.2/usage/interpreter/overview.html#generic-confinterpreter
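
For example, a generic configuration paragraph can set any Spark property
before the interpreter starts, which is the Zeppelin equivalent of passing
--conf flags to spark-submit. A minimal sketch based on the page above
(the property values here are just placeholders):

    %spark.conf

    spark.executor.memory 4g
    spark.cores.max 10

Note that a %spark.conf paragraph only takes effect if it runs before the
Spark interpreter process is launched, so restart the interpreter first if
it is already running.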


On Tue, Nov 5, 2019 at 10:05 AM mhd wrk <mhdwrkoff...@gmail.com> wrote:

> As explained here
> <https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties>,
> in Spark we can use "--conf" to load properties dynamically at submit time
> instead of hard-coding them in a configuration file. How can we
> achieve this via Zeppelin?
>
> On Mon, Nov 4, 2019 at 5:19 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>> This is really a Spark question; you'd better ask it in the Spark
>> community. If Spark supports it, Zeppelin won't stop you from doing it.
>>
>> On Tue, Nov 5, 2019 at 1:02 AM mhd wrk <mhdwrkoff...@gmail.com> wrote:
>>
>>> For example, passing a delegation token and/or an access token, created
>>> per user authentication, to the driver.
>>>
>>> On Fri, Nov 1, 2019 at 11:05 PM Jeff Zhang <zjf...@gmail.com> wrote:
>>>
>>>> Could you be more specific about what kind of data you want to pass to
>>>> the Spark job: to the driver or to the executors? Giving an example
>>>> would be very helpful.
>>>>
>>>> On Sat, Nov 2, 2019 at 2:38 AM mhd wrk <mhdwrkoff...@gmail.com> wrote:
>>>>
>>>>> What's the best way in Zeppelin to pass extra data (e.g. data stored
>>>>> in the current user's session) to a Spark job?
>>>>>
>>>>
>>>>
>>>> --
>>>> Best Regards
>>>>
>>>> Jeff Zhang
>>>>
>>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>

-- 
Best Regards

Jeff Zhang
