[ https://issues.apache.org/jira/browse/SPARK-10375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-10375.
------------------------------------
    Resolution: Not A Problem

You can't set the driver memory after the driver has already started. If you want to set it, you need to set it either in your config file or on the spark-submit command line ({{--driver-memory 1g}} or {{--conf spark.driver.memory=1g}}).

The UI discrepancy is unfortunate but not easy (nor important enough) to fix, at least at the moment. It affects quite a lot of properties that can't really change after the context is initialized.

> Setting the driver memory with SparkConf().set("spark.driver.memory","1g") does not work
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-10375
>                 URL: https://issues.apache.org/jira/browse/SPARK-10375
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.3.0
>         Environment: Running with yarn
>            Reporter: Thomas
>            Priority: Minor
>
> When running pyspark 1.3.0 with yarn, the following code has no effect:
> pyspark.SparkConf().set("spark.driver.memory","1g")
> The Environment tab in yarn shows that the driver has 1g; however, the Executors tab only shows 512 M (the default value) for the driver memory.
> This issue goes away when the driver memory is specified via the command line (i.e. --driver-memory 1g).
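For illustration, a minimal sketch contrasting the ineffective in-code setting with the spark-submit approaches described in the resolution above. The script name (wordcount.py) and app name are placeholders, not taken from the report:

    # Works: fix the driver memory before the driver JVM starts, either on the
    # spark-submit command line (these two forms are equivalent):
    #
    #   spark-submit --master yarn --driver-memory 1g wordcount.py
    #   spark-submit --master yarn --conf spark.driver.memory=1g wordcount.py
    #
    # or persistently in conf/spark-defaults.conf:
    #
    #   spark.driver.memory  1g

    from pyspark import SparkConf, SparkContext

    # Does NOT work for the driver in this scenario: by the time this Python
    # code runs under spark-submit, the driver JVM has already been launched
    # with the default 512 MB, so the value shows up in the Environment tab
    # but does not change the driver's actual memory.
    conf = SparkConf().setAppName("example").set("spark.driver.memory", "1g")
    sc = SparkContext(conf=conf)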