[ 
https://issues.apache.org/jira/browse/LIVY-772?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Keerthimanu Gattu updated LIVY-772:
-----------------------------------
    Description: 
Using PySpark in Hue, when I change spark.driver.memory in 
spark-defaults.conf to 2048M, Livy still defaults to 1G.

Edit: Further findings: it is picking up a few configurations from 
/usr/lib/hue/desktop/libs/notebook/src/notebook/connectors/spark_shell.py. Not 
sure why it isn't falling back to spark-defaults.conf.
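For context, a minimal sketch of the behaviour in question (this is not Hue's actual connector code; the host/port and the 1g value are assumptions): Livy's POST /sessions request accepts a driverMemory field, and when a client such as spark_shell.py sends it explicitly, that value takes precedence over spark.driver.memory in the server-side spark-defaults.conf.

```python
# Minimal sketch, NOT Hue's spark_shell.py: create a PySpark session via
# Livy's REST API. Because "driverMemory" is sent in the request body,
# Livy uses it for the driver and spark.driver.memory from
# spark-defaults.conf is not consulted.
import json
import requests

LIVY_URL = "http://localhost:8998"  # assumption: default Livy host/port

payload = {
    "kind": "pyspark",
    # Client-side default; overrides the value in spark-defaults.conf.
    "driverMemory": "1g",
}

resp = requests.post(
    f"{LIVY_URL}/sessions",
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
print(resp.status_code, resp.json())
```

If the connector sends a value like this on every session it creates, the 2048M setting in spark-defaults.conf would never take effect for the driver.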

  was:Using PySpark in Hue, when I change spark.driver.memory in 
spark-defaults.conf to 2048M, Livy still defaults to 1G.


> change in spark-defaults.conf not picked up by livy
> ---------------------------------------------------
>
>                 Key: LIVY-772
>                 URL: https://issues.apache.org/jira/browse/LIVY-772
>             Project: Livy
>          Issue Type: Question
>    Affects Versions: 0.6.0
>         Environment: EMR
>            Reporter: Keerthimanu Gattu
>            Priority: Major
>
> Using PySpark in Hue, when I change spark.driver.memory in 
> spark-defaults.conf to 2048M, Livy still defaults to 1G.
> Edit: Further findings: it is picking up a few configurations from 
> /usr/lib/hue/desktop/libs/notebook/src/notebook/connectors/spark_shell.py. 
> Not sure why it isn't falling back to spark-defaults.conf.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)