Hi All,
It worked OK after adding the options below in VM options:
-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea
Thanks
Sri
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-tp25893p25920.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
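If the same heap options also need to apply when the tests run during `mvn install` (the failure described further down in this thread), one option is passing them to the forked test JVM via the surefire plugin's `argLine`. A sketch of a possible `pom.xml` fragment (whether your build runs the Scala tests through surefire, and the plugin setup itself, are assumptions; the scalatest maven plugin has a similar `argLine` setting):

```xml
<!-- pom.xml: pass the same heap options to the test JVMs forked by surefire -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea</argLine>
  </configuration>
</plugin>
```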
Hi Mark,
I made the changes to the VM options in the Edit Configurations section in
IntelliJ for both the main method and the Scala test case class, which worked
OK when I executed them individually, but the test case fails when I run maven
install to create the jar file.
Can I add the VM options via the Spark conf set in Scala instead?
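On the SparkConf question: the heap size can't be raised from inside an already-running JVM, so `-Xmx` has to remain a VM (or forked test JVM) option. That said, the 1.6 memory manager reads `spark.testing.memory` in place of the JVM max heap when it is set, so a workaround some people use in local-mode tests looks like the following. This is a sketch, not an official recommendation; `spark.testing.memory` is an internal, test-oriented setting, and the supported fix is a larger heap:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: raise the memory that the 1.6 UnifiedMemoryManager "sees" in
// local mode. spark.testing.memory is internal; prefer a larger -Xmx.
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("MyApp") // app name is a placeholder
  .set("spark.testing.memory", (512L * 1024 * 1024).toString) // bytes

val sc = new SparkContext(conf)
```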
It's not a bug, but a larger heap is required with the new
UnifiedMemoryManager:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L172
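For reference, the check in the 1.6 source amounts to roughly the following (a simplified sketch with the constants from the 1.6 code; the exact line linked above may have moved in master):

```scala
// Simplified sketch of the UnifiedMemoryManager sanity check in Spark 1.6.
// systemMemory is Runtime.getRuntime.maxMemory (i.e. the -Xmx value),
// unless overridden by spark.testing.memory.
val reservedMemory  = 300L * 1024 * 1024      // RESERVED_SYSTEM_MEMORY_BYTES
val minSystemMemory = reservedMemory * 3 / 2  // 450 MB minimum heap

def checkHeap(systemMemory: Long): Unit =
  require(systemMemory >= minSystemMemory,
    s"System memory $systemMemory must be at least $minSystemMemory. " +
      "Please use a larger heap size.")
```

So with a small default heap the check fails on startup, which is why bumping `-Xmx` to something comfortably above ~450 MB (e.g. `-Xmx512m`) makes the error go away.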
On Wed, Jan 6, 2016 at 6:35 AM, kali.tumm...@gmail.com <
kali.tumm...@gmail.com> wrote:
> Hi All
Hi All,
I am running my app in IntelliJ IDEA (locally) with master local[*]. The code
worked OK with Spark 1.5, but after I upgraded to 1.6 I am having the below issue.
Is this a bug in 1.6? When I change back to 1.5 it works OK without any error.
Do I need to pass executor memory while running in local in