There's no way to communicate between the Spark and sh interpreters. It
needs to be implemented, but it isn't yet. But I agree that it would be
helpful in some cases. Can you create an issue?
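In the meantime, the substitution can be done by hand from a Python
paragraph instead of %sh. A minimal sketch (the file paths are made up, and
the z.get calls are only shown in a comment, since ZeppelinContext is
available only inside a running notebook):

```python
import shlex
import subprocess

# In a real %pyspark paragraph these would come from ZeppelinContext,
# e.g. localfile = z.get('localfile'), after a z.put in a Scala paragraph.
localfile = '/tmp/data with spaces.csv'
hdfsfile = '/user/ruslan/data.csv'

# Build the command the way Jupyter expands {localfile} {hdfsfile},
# quoting each value so spaces and shell metacharacters stay safe.
cmd = 'hadoop fs -put {} {}'.format(shlex.quote(localfile),
                                    shlex.quote(hdfsfile))

# subprocess.check_call(cmd, shell=True)  # would run it on a real cluster
```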

On Thu, Jan 12, 2017 at 3:32 PM, Ruslan Dautkhanov <dautkha...@gmail.com>
wrote:

> It's possible to exchange variables between Scala and Spark
> through z.put and z.get.
>
> How to pass a variable to %sh?
>
> In Jupyter it's possible to do for example as
>
>>   ! hadoop fs -put {localfile} {hdfsfile}
>
>
> where localfile and hdfsfile are Python variables.
>
> Can't find any references for something similar in Shell Interpreter
> https://zeppelin.apache.org/docs/0.7.0-SNAPSHOT/interpreter/shell.html
>
> In many notebooks we have to pass small variables
> from Zeppelin notes to external scripts as parameters.
>
> It would be awesome to have something like
>
> %sh
>> /path/to/script --param8={var1} --param9={var2}
>
>
> where var1 and var2 would implicitly be fetched as z.get('var1')
> and z.get('var2') respectively.
>
> Other thoughts?
>
>
> Thank you,
> Ruslan Dautkhanov
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
