The error message is clear: you are setting both spark.driver.extraJavaOptions and SPARK_JAVA_OPTS. Please check spark-defaults.conf and the interpreter settings.
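
A minimal sketch of what to look for, assuming the default locations under $SPARK_HOME/conf and $ZEPPELIN_HOME/conf (the -D value below is a hypothetical placeholder): keep driver JVM options only in spark.driver.extraJavaOptions and remove the deprecated SPARK_JAVA_OPTS from the env scripts.

# conf/spark-defaults.conf -- keep driver JVM options here
spark.driver.extraJavaOptions   -Dmy.option=value

# conf/spark-env.sh and conf/zeppelin-env.sh -- remove or comment out the deprecated variable
# export SPARK_JAVA_OPTS="-Dmy.option=value"

Also check the Spark interpreter properties in the Zeppelin interpreter settings for the same keys; JVM options should live only in spark.driver.extraJavaOptions / spark.executor.extraJavaOptions.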



Best Regards,
Jeff Zhang


From: kant kodali <kanth...@gmail.com>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Sunday, April 23, 2017 at 1:58 PM
To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Subject: Re: java.lang.NullPointerException

FYI: I am using Spark Standalone mode

On Sat, Apr 22, 2017 at 10:57 PM, kant kodali <kanth...@gmail.com> wrote:

Hi All,

I get the stack trace below when I use Zeppelin. If I don't use Zeppelin, all my client jobs run fine. I am using Spark 2.1.0.

I am not sure why Zeppelin is unable to create a SparkContext, yet it then says it created a SparkSession, so this doesn't seem to make a lot of sense. Any idea?
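
Looking at the trace, getOrCreate seems to fail inside SparkConf.validateSettings before any SparkContext exists. Here is a minimal sketch that triggers the same exception outside Zeppelin, assuming SPARK_JAVA_OPTS is exported in the environment that launches the driver (e.g. via spark-env.sh); the master URL and the -D option are hypothetical placeholders:

import org.apache.spark.sql.SparkSession

object ConflictingOptionsRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("spark://your-master:7077")                           // hypothetical standalone master
      .appName("extraJavaOptions-vs-SPARK_JAVA_OPTS")
      .config("spark.driver.extraJavaOptions", "-Dmy.option=value") // hypothetical option
      .getOrCreate() // SparkConf.validateSettings throws: "Found both
                     // spark.driver.extraJavaOptions and SPARK_JAVA_OPTS. Use only the former."
    spark.stop()     // not reached while both settings are present
  }
}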

Thanks!


Caused by: org.apache.spark.SparkException: Found both spark.driver.extraJavaOptions and SPARK_JAVA_OPTS. Use only the former.
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:521)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:519)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:519)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:505)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:505)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2258)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
    ... 20 more

 INFO [2017-04-23 05:17:08,467] ({pool-2-thread-2} SparkInterpreter.java[createSparkSession]:372) - Created Spark session
ERROR [2017-04-23 05:17:08,467] ({pool-2-thread-2} Job.java[run]:181) - Job failed
java.lang.NullPointerException
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:391)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:380)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:828)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

 INFO [2017-04-23 05:17:08,475] ({pool-2-thread-2} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1492924623730 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1233713905
