Patrik Iselind created ZEPPELIN-4826:
----------------------------------------
Summary: Zeppelin doesn't allow my Spark environment to use all my
computer's resources
Key: ZEPPELIN-4826
URL: https://issues.apache.org/jira/browse/ZEPPELIN-4826
Project: Zeppelin
Issue Type: Bug
Affects Versions: 0.9.0
Environment: I'm running Zeppelin 0.9.0 in a docker image
Reporter: Patrik Iselind
As a user, I cannot give my Spark executor (local[*]) as much RAM as I want.
This means I'm not using all of my available RAM, which limits how many
executors I can have working on the problem.
Changing spark.driver.memory and spark.executor.memory doesn't seem to make any
difference. Something else limits how much RAM my executors can get, but I
haven't been able to identify what.
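For reference, one common way to pass these settings to Zeppelin's Spark
interpreter is via SPARK_SUBMIT_OPTIONS in zeppelin-env.sh. This is a sketch
under the assumption that the default conf/ layout is used; the 8g values are
illustrative, not a confirmed fix:

```shell
# conf/zeppelin-env.sh -- illustrative values only.
# Note: in local[*] mode the driver and executors run in a single JVM,
# so spark.driver.memory governs the total heap; spark.executor.memory
# has little effect there. Both must be set before the interpreter
# JVM starts, which is why changing them in a running notebook may
# appear to make no difference.
export SPARK_SUBMIT_OPTIONS="--driver-memory 8g --conf spark.executor.memory=8g"
```

After editing the file, the Spark interpreter has to be restarted for the new
JVM options to take effect.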
Could Spark be sharing a JVM with Zeppelin?
--
This message was sent by Atlassian Jira
(v8.3.4#803005)