I'm interested to see if anyone knows of a way to set custom job/stage
names for a Spark application.
I believe I can use *sparkContext.setCallSite(String)* to update job/stage
names, but it does not let me name each stage individually: setting this
value applies the same text to all job and stage names submitted after the
call.
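For what it's worth, a minimal sketch of that behavior (the app name, master, and stage labels below are illustrative, not from the original thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CallSiteDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical local setup, for illustration only.
    val sc = new SparkContext(
      new SparkConf().setAppName("callsite-demo").setMaster("local[*]"))
    val rdd = sc.parallelize(1 to 100)

    // Every job/stage submitted after this call is labeled
    // "phase 1: count" in the UI -- there is no per-stage override.
    sc.setCallSite("phase 1: count")
    rdd.count()

    sc.setCallSite("phase 2: sum")
    rdd.map(_ * 2).sum()

    sc.clearCallSite() // revert to the default call-site naming
    sc.stop()
  }
}
```

The granularity is per call, not per stage: all stages of any job triggered while a given call site is set share the same label.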
Try increasing executor memory (--conf spark.executor.memory=3g or
--executor-memory 3g). Here is something I noted from your logs:
15/09/29 06:32:03 WARN MemoryStore: Failed to reserve initial memory
threshold of 1024.0 KB for computing block rdd_2_1813 in memory.
15/09/29 06:32:03 WARN
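For example (the memory value is just the one suggested above; the main class and jar names are placeholders):

```shell
# Hypothetical submit command; either flag form sets the executor heap.
spark-submit \
  --conf spark.executor.memory=3g \
  --class com.example.MyApp \
  myapp.jar

# Equivalent using the dedicated flag:
spark-submit \
  --executor-memory 3g \
  --class com.example.MyApp \
  myapp.jar
```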
--conf is used to pass any Spark configuration property whose name starts
with "spark.".
You can also use "--driver-java-options" to pass any system properties you
would like to the driver program.
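A sketch combining the two (the property names after -D, the serializer choice, and the class/jar names are hypothetical examples, not from the thread):

```shell
# --conf passes Spark configuration (keys starting with "spark."),
# --driver-java-options passes JVM system properties to the driver.
spark-submit \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --driver-java-options "-Dmy.app.env=staging -Dmy.app.debug=true" \
  --class com.example.MyApp \
  myapp.jar
```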
On Tue, Sep 29, 2015 at 2:30 PM swetha wrote:
>
> Hi,
>
> How to set System
SPARK-5869 appears to have the same exception and was fixed in 1.3.0. I
double-checked the CDH package to see if it carried the patch:
https://github.com/cloudera/spark/blob/cdh5.4.4-release/core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala#L161
In my case, my YARN application fails