(-incubator, +user) It's not Spark running out of memory, but SBT, so those env variables have no effect. They're options to Spark at runtime anyway, not at compile time, and you're intending to compile, I take it.
SBT is a memory hog, and Spark is a big build. You will probably need to give it more than the default amount of memory to compile. But really, you should use Maven to build if you can. See notes on how to do that, including giving it more memory, at http://spark.apache.org/docs/latest/building-with-maven.html

On Thu, Aug 7, 2014 at 4:45 PM, Rasika Pohankar <rasikapohan...@gmail.com> wrote:
> Hello,
>
> I am trying to build Apache Spark version 1.0.1 on Ubuntu 12.04 LTS. After
> unzipping the file and running sbt/sbt assembly I get the following error:
>
> rasika@rasikap:~/spark-1.0.1$ sbt/sbt package
> Error occurred during initialization of VM
> Could not reserve enough space for object heap
> Error: Could not create the Java Virtual Machine.
> Error: A fatal exception has occurred. Program will exit.
> rasika@rasikap:~/spark-1.0.1$
>
> I changed SPARK_DAEMON_JAVA_OPTS to
> SPARK_DAEMON_JAVA_OPTS="-Xms1024m -Xmx2048m" in the Spark configuration
> file (spark-env.sh), but it still gives the same error.
>
> Spark version: 1.0.1
> Scala: 2.10.4
> Ubuntu: 12.04 LTS
> Java: 1.7.0_65
>
> How to solve the error? Please help.
>
> Thank you.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/JVM-Error-while-building-spark-tp11665.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------
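The advice above (raise the build JVM's memory rather than Spark's runtime options, and prefer Maven) can be sketched as follows. The MAVEN_OPTS values are the ones the building-with-maven page linked above recommends for Spark 1.x; the sbt variant is an assumption about how the sbt launcher picks up JVM flags, so check the sbt/sbt script in your checkout:

```shell
# A minimal sketch, assuming a Spark 1.x source tree and the memory
# settings recommended by the building-with-maven documentation.

# Maven build: give the build JVM extra heap and permgen before compiling.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"
# Then, from the Spark source root:
#   mvn -DskipTests clean package

# If you stay with sbt, pass heap options to the launcher JVM instead.
# (The variable name here is an assumption; verify against sbt/sbt.)
#   SBT_OPTS="-Xmx2g -XX:MaxPermSize=512m" sbt/sbt assembly

# Show what the build JVM will receive:
echo "$MAVEN_OPTS"
```

Note that SPARK_DAEMON_JAVA_OPTS in spark-env.sh only affects Spark's own daemons at runtime, which is why changing it had no effect on the build.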