On Thu, Feb 18, 2016 at 10:26 AM, wgtmac wrote:
> In the code, I did the following:
> val sc = new SparkContext(new
> SparkConf().setAppName("test").set("spark.driver.memory", "4g"))
You can't set the driver memory like this, in any deploy mode. When
that code runs, the driver JVM is already running, so its heap size is
already fixed and the setting has no effect. You need to set the driver
memory on the spark-submit command line instead (with --driver-memory,
or --conf spark.driver.memory=4g), so it is applied before the driver
starts.
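A minimal sketch of a submit command that sets the driver memory before the driver JVM launches; the main class name and jar path here are placeholders, not from the original message:

```shell
# Driver memory must be known before the driver JVM starts, so it has
# to be passed to spark-submit rather than set in SparkConf inside the app.
# com.example.Test and test.jar are hypothetical placeholders.
spark-submit \
  --master yarn-cluster \
  --driver-memory 4g \
  --class com.example.Test \
  test.jar
```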
Hi
I'm using Spark 1.5.1, and I encountered a problem using SparkConf to set
spark.driver.memory in yarn-cluster mode.
Example 1:
In the code, I did the following:
val sc = new SparkContext(new
SparkConf().setAppName("test").set("spark.driver.memory", "4g"))
And used the following command to submit