Perfect! It worked for me as well after I whitelisted the directory:

livy.file.local-dir-whitelist = ~/.livy-sessions/

(this setting doesn't seem to be in the documentation)
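For reference, a minimal sketch of the relevant livy.conf entries for a standalone setup — the master URL and the whitelisted path here are assumptions for illustration, not confirmed values from this thread:

```
# conf/livy.conf — sketch, assuming a standalone Spark master at spark://master:7077
livy.spark.master = spark://master:7077

# Allow Livy to serve jars/files from a local directory instead of HDFS
# (path is an example; use the directory where your session files live)
livy.file.local-dir-whitelist = ~/.livy-sessions/
```

With livy.spark.master pointing at the standalone master, neither YARN, Mesos, nor HDFS is required.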

On Sat, Dec 2, 2017 at 7:09 AM, Stefan Miklosovic <[email protected]>
wrote:

> I am using a Spark master instance and two slaves, and Livy points to that
> master, so when I submit the jar, the job is started on the master and
> distributed to the slaves. I am not using HDFS or Hadoop.
>
>
> On Friday, December 1, 2017, kant kodali <[email protected]> wrote:
>
>> Hi All,
>>
>> I am wondering how to start the Livy server using Spark standalone mode.
>> Currently I don't use YARN or Mesos, and I don't plan to use them anytime
>> soon, so I am wondering if it is possible to start the Livy server in
>> Spark standalone mode, and if so, what do I need to do? I also don't use
>> HDFS or Hadoop. I just run Spark applications in standalone mode with a
>> local file system.
>>
>> What should the following be set to in my case?
>>
>> export SPARK_HOME=/usr/lib/spark
>>
>> export HADOOP_CONF_DIR=/etc/hadoop/conf
>>
>>
>>
>
> --
> Stefan Miklosovic
>
