I'm trying to configure the driver memory size. So far I have tried the following:

   - export JAVA_INTP_OPTS="-Xmx10g"

   - export SPARK_SUBMIT_OPTIONS="--driver-memory 10g --executor-memory 10g"

   - " -Xmx30000m \

   - Changing SparkInterpreter.java (see the sketch after this list):

   conf.set("spark.executor.memory", "10g");
   conf.set("spark.executor.cores", "2");
   conf.set("spark.driver.memory", "10g");
   conf.set("spark.shuffle.io.numConnectionsPerPeer", "5");
   conf.set("spark.sql.autoBroadcastJoinThreshold", "200483647");
   conf.set("spark.network.timeout", "400s");
   conf.set("spark.driver.maxResultSize", "3g");
   conf.set("spark.sql.hive.convertMetastoreParquet", "false");
   conf.set("spark.kryoserializer.buffer.max", "200m");
   conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
   conf.set("spark.dynamicAllocation.enabled", "true");
   conf.set("spark.shuffle.service.enabled", "true");
   conf.set("spark.dynamicAllocation.minExecutors", "1");
   conf.set("spark.dynamicAllocation.maxExecutors", "30");
   conf.set("spark.dynamicAllocation.executorIdleTimeout", "60s");
   // conf.set("spark.sql.hive.metastore.version", "1.1.0");
   conf.set("spark.dynamicAllocation.cachedExecutorIdleTimeout", "100s");

   - I tried setting SPARK_HOME, but then the interpreter didn't even start;
   it failed with "Incompatible minimum and maximum heap sizes specified".


No matter what I do, I get this in the logs: "INFO [2015-11-11 14:55:24,453]
({sparkDriver-akka.actor.default-dispatcher-14} Logging.scala[logInfo]:59)
- Registering block manager 192.168.12.121:45057 with 530.0 MB RAM,
BlockManagerId(driver, 192.168.12.121, 45057)", and the same 530 MB on my Spark UI.
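
For what it's worth, here is how I check what the driver actually received
(again a standalone sketch; the class name is mine, while getConf(),
toDebugString() and Runtime.maxMemory() are standard Spark/JDK calls).
530.0 MB looks suspiciously like the storage fraction of a default ~1 GB
driver heap (roughly 981 MB * 0.6 * 0.9), which makes me think my settings
never reach the driver JVM at all.

   import org.apache.spark.SparkConf;
   import org.apache.spark.api.java.JavaSparkContext;

   // Dump the effective Spark conf and the real JVM heap, to see which of
   // the settings above actually made it through to the driver.
   public class EffectiveConfCheck {
       public static void main(String[] args) {
           JavaSparkContext sc = new JavaSparkContext(
                   new SparkConf().setMaster("local[*]").setAppName("conf-check"));
           System.out.println(sc.getConf().toDebugString());   // every spark.* key the driver sees
           System.out.println("Actual max heap (MB): "
                   + Runtime.getRuntime().maxMemory() / (1024 * 1024));
           sc.stop();
       }
   }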

Has anyone faced this problem, or does anyone know what to do?

-- 

Sincerely yours,
Egor Pakhomov
