Re: Cannot configure driver memory size

2015-11-16 Thread Mina Lee
Hi Egor, we recommend setting SPARK_HOME, because many Spark configurations are handled better that way. SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh takes effect only when SPARK_HOME is set. Since you are getting a JVM error, if you have any JVM-related configuration in your conf/zeppelin-env.sh, can you …
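For anyone hitting the same issue, a minimal conf/zeppelin-env.sh sketch that sets both variables is below. The Spark install path and the 4g figure are placeholders, not values from this thread:

  # Point Zeppelin at an existing Spark install; SPARK_SUBMIT_OPTIONS
  # is only passed to spark-submit when SPARK_HOME is set.
  export SPARK_HOME=/usr/local/spark   # placeholder path

  # Extra flags handed to spark-submit when the Spark interpreter
  # starts; --driver-memory sizes the driver JVM.
  export SPARK_SUBMIT_OPTIONS="--driver-memory 4g"   # 4g is an example value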

Re: Using Elasticsearch Connector in Zeppelin Notebook

2015-11-16 Thread SiS
Hi, thanks a lot for your help. It worked just as you described. But I have another question: would it be possible to define the dependency centrally, so that it is not necessary to insert %dep in every notebook, and especially not to restart the interpreter every time I start the …
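For context, the per-notebook approach being asked about looks roughly like this; z.load() with a Maven-style coordinate is the dep interpreter's loading call, and the coordinate shown is the one from the reply below:

  %dep
  z.reset()  // clear previously loaded artifacts; must run before the Spark interpreter starts
  z.load("org.elasticsearch::elasticsearch-spark:2.2.0.BUILD-SNAPSHOT")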

Re: Using Elasticsearch Connector in Zeppelin Notebook

2015-11-16 Thread Josef A. Habdank
You have to edit conf/zeppelin-env.sh and add the following line:

  export SPARK_SUBMIT_OPTIONS="--packages org.elasticsearch::elasticsearch-spark:2.2.0.BUILD-SNAPSHOT"

Then restart Zeppelin:

  zeppelin-daemon.sh stop
  zeppelin-daemon.sh start

From now on, every time Zeppelin starts it will load the package …
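Since SPARK_SUBMIT_OPTIONS is just a string of spark-submit flags, several options can be combined in the one export. For example (the --driver-memory value here is a placeholder, echoing the driver-memory thread above):

  # Load the Elasticsearch connector and size the driver JVM in one go.
  export SPARK_SUBMIT_OPTIONS="--packages org.elasticsearch::elasticsearch-spark:2.2.0.BUILD-SNAPSHOT --driver-memory 4g"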

Re: why Zeppelin SparkInterpreter uses FIFOScheduler

2015-11-16 Thread Rohit Agarwal
Hey Pranav, did you make any progress on this? -- Rohit On Sunday, August 16, 2015, moon soo Lee wrote: > Pranav, the proposal looks awesome! > > I have a question and some feedback. > > You said you tested 1, 2, and 3. To create a SparkIMain per notebook, you need > the notebook id. Did you get …