Same problem here.

David

On Fri, Oct 23, 2015 at 12:58 PM, Rok Roskar <rokros...@gmail.com> wrote:

> I'm trying to control the amount of memory used by the Spark driver and
> executors. In a normal setting I would specify these values via the
> spark.driver.memory and spark.executor.memory configuration options.
> However, setting these in the interpreter properties in Zeppelin doesn't
> seem to do anything. Because Zeppelin also sets its own extra Java flags,
> I couldn't simply add an -Xmx flag to spark.driver.extraJavaOptions.
> The only alternative I could think of was to set the _JAVA_OPTIONS
> environment variable, which did the trick, but that isn't how I would
> want to change the memory settings. What am I missing here? How should
> this be done? Thanks,
>
> Rok
>
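For what it's worth, a possible workaround is to pass the memory flags through the environment rather than the interpreter properties, since the interpreter JVM is typically already running by the time those properties are read. The following is only a minimal sketch, assuming Zeppelin launches the Spark interpreter via spark-submit from an external SPARK_HOME; the variable names come from the conf/zeppelin-env.sh template and may differ between Zeppelin versions:

  # conf/zeppelin-env.sh -- assumed setup; adjust paths and sizes as needed
  export SPARK_HOME=/path/to/spark            # Spark installation Zeppelin should use
  export SPARK_SUBMIT_OPTIONS="--driver-memory 4g --executor-memory 8g"

  # Alternatively, raise the heap of the interpreter process itself;
  # ZEPPELIN_INTP_MEM takes plain JVM options:
  export ZEPPELIN_INTP_MEM="-Xmx4g"

After editing zeppelin-env.sh, Zeppelin (or at least the Spark interpreter) has to be restarted for the new values to be picked up.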
