Chris Cannon created ZEPPELIN-1443:
--------------------------------------

             Summary: Spark Interpreter Should Respect Settings in spark-defaults.conf 
                 Key: ZEPPELIN-1443
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1443
             Project: Zeppelin
          Issue Type: Improvement
          Components: pySpark
    Affects Versions: 0.6.1
            Reporter: Chris Cannon
             Fix For: 0.6.2


Currently, Zeppelin's Spark interpreter does not respect the settings in 
spark-defaults.conf. This is very frustrating because every time I create a new 
cluster I have to manually copy the settings from spark-defaults.conf into the 
interpreter settings page (i.e., `/interpreter`) so that Zeppelin can fully 
utilize the Spark cluster with critical settings such as 
spark.dynamicAllocation.enabled and spark.executor.cores. Please have the 
Spark interpreter respect what's in spark-defaults.conf instead of using its 
own default values.
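
For reference, a typical spark-defaults.conf carries entries like the following (the values below are illustrative, not taken from any particular cluster), each of which currently has to be re-entered by hand on the `/interpreter` page:

```properties
# Illustrative spark-defaults.conf entries (example values only)
spark.dynamicAllocation.enabled    true
spark.shuffle.service.enabled      true
spark.executor.cores               4
spark.executor.memory              8g
```

If the interpreter read this file at startup (and let explicit interpreter-page settings override it), new clusters would work out of the box.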



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
