No, I did not. I thought Spark would take care of that itself since I had
put in the arguments.
On Thu, Sep 7, 2017 at 9:26 PM, Lukas Bradley wrote:
> Did you also increase the size of the heap of the Java app that is
> starting Spark?
>
> https://alvinalexander.com/blog/post/java/java-xmx-xms-memory-heap-size-control
Did you also increase the size of the heap of the Java app that is starting
Spark?
https://alvinalexander.com/blog/post/java/java-xmx-xms-memory-heap-size-control
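Since the driver runs inside the launching JVM when Spark is embedded, the heap has to be raised on the `java` command line itself. A minimal sketch of such a launch command, assuming the application is packaged as a runnable jar ("my-app.jar" is a placeholder, and 4 GB is just an example value):

```shell
# Launch the embedding application with a 1 GB initial and 4 GB maximum heap.
# Spark running in local/embedded mode shares this heap.
# "my-app.jar" is a placeholder for the actual application jar.
java -Xms1g -Xmx4g -jar my-app.jar
```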
On Thu, Sep 7, 2017 at 12:16 PM, Imran Rajjad wrote:
> I am getting an Out of Memory error while running a connectedComponents job
> on a graph with around 12000 vertices and 134600 edges.
I am getting an Out of Memory error while running a connectedComponents job
on a graph with around 12000 vertices and 134600 edges.
I am running Spark in embedded mode in a standalone Java application and
have tried to increase the memory, but it seems that it's not taking any
effect:
sparkConf = new SparkConf()
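In local/embedded mode the driver (and the executor threads) live inside the application's own JVM, so a value like `spark.driver.memory` set on a `SparkConf` after startup cannot enlarge the heap: the ceiling is whatever `-Xmx` the process was launched with. A stand-alone sketch of checking that ceiling (plain JDK, no Spark dependency; the class name is illustrative):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap this JVM will ever grow to, as fixed by -Xmx at launch.
        // In embedded/local-mode Spark this same limit bounds the driver,
        // no matter what spark.driver.memory is later set to in a SparkConf.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap (MiB): " + maxBytes / (1024 * 1024));
    }
}
```

If this prints far less than the memory configured in the `SparkConf`, the OOM is expected until the launch command itself passes a larger `-Xmx`.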