I am not familiar with the CDH distributions. However, from the exception it
looks like you are setting SPARK_JAVA_OPTS and, at the same time, specifying
the java options individually for the driver and executor.

Check for the spark-env.sh file in your Spark config directory; you could
comment out or remove the SPARK_JAVA_OPTS entry there and add its value to
the required driver and executor java options instead.
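
For example, here is a minimal sketch of that change, assuming the files live
in your Spark config directory (e.g. /etc/spark/conf — that exact path is a
guess; the -XX value is the one reported in your exception):

    # In spark-env.sh, comment out the deprecated variable:
    # export SPARK_JAVA_OPTS="-XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh"

    # In spark-defaults.conf, carry the same option over instead:
    spark.driver.extraJavaOptions    -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
    spark.executor.extraJavaOptions  -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh

Note that any --conf "spark.driver.extraJavaOptions=..." you pass on the
spark-submit command line overrides the entry from spark-defaults.conf, so
you would append your -Dlog4j.configuration flag to that same entry rather
than set it separately.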

On Thu, Jul 21, 2016 at 12:10 PM, SamyaMaiti <samya.maiti2...@gmail.com>
wrote:

> Hi Team,
>
> I am using *CDH 5.7.1* with Spark *1.6.0*.
>
> I have a Spark Streaming application that reads from Kafka & does some
> processing.
>
> The issue is that while starting the application in CLUSTER mode, I want
> to pass a custom log4j.properties file to both the driver & executor.
>
> *I have the below command:*
>
> spark-submit \
> --class xyx.search.spark.Boot \
> --conf "spark.cores.max=6" \
> --conf "spark.eventLog.enabled=true" \
> --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \
> --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Executor.properties" \
> --deploy-mode "cluster" \
> /some/path/search-spark-service-1.0.0.jar \
> /some/path/conf/
>
>
> *But it gives the below exception:*
>
> SPARK_JAVA_OPTS was detected (set to
> '-XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh ').
> This is deprecated in Spark 1.0+.
>
> Please instead use:
>  - ./spark-submit with conf/spark-defaults.conf to set defaults for an
> application
>  - ./spark-submit with --driver-java-options to set -X options for a driver
>  - spark.executor.extraJavaOptions to set -X options for executors
>  - SPARK_DAEMON_JAVA_OPTS to set java options for standalone daemons (master or worker)
>
> 2016-07-21 12:59:41 ERROR SparkContext:95 - Error initializing SparkContext.
> org.apache.spark.SparkException: Found both spark.executor.extraJavaOptions and SPARK_JAVA_OPTS. Use only the former.
>         at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:470)
>         at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:468)
>         at scala.collection.immutable.List.foreach(List.scala:318)
>         at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:468)
>         at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:454)
>
>
> *Please note that the same works on CDH 5.4 with Spark 1.3.0.*
>
> Regards,
> Sam
>


-- 
-Dhruve Ashar
