Could you check the log again? There should be another exception above the one 
you pasted. Most likely the SparkContext failed to be created.
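
(Once the root cause is fixed, a quick way to confirm the SparkContext really 
comes up is a trivial %spark paragraph like the sketch below. This is only a 
minimal sanity check and assumes nothing beyond the sc variable that Zeppelin 
injects into %spark.)

// minimal %spark paragraph: touches only the injected SparkContext (sc)
val rdd = sc.parallelize(1 to 100)   // tiny in-memory RDD
println("sum = " + rdd.sum())        // prints 5050.0 once sc is healthy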



Best Regards,
Jeff Zhang


From: Terry Healy <the...@bnl.gov>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Friday, October 6, 2017 at 10:35 PM
To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Subject: Trying to get 0.7.3 running with Spark

Using Zeppelin 0.7.3, Spark 2.1.0-mapr-1703 / Scala 2.11.8

I had previously run the demo and successfully set up the MongoDB and JDBC 
interpreters for Impala under v0.7.2. Since I upgraded to 0.7.3, everything 
broke. I am down to a complete re-install (several, in fact) and get a 
response like the one below for almost everything I try. (Focusing just on 
%spark for now.) I apparently have something very basic wrong, but I'll be 
damned if I can find it. The same example works fine in spark-shell.
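
(If it helps to compare environments, below is a minimal check I can run in 
spark-shell, and in a %spark paragraph once the interpreter opens at all. It 
is just a sketch and uses nothing Zeppelin-specific, only standard Spark and 
Scala calls.)

// report the Scala and Spark builds each environment actually sees
println("Scala: " + scala.util.Properties.versionString)  // e.g. "version 2.11.8"
println("Spark: " + org.apache.spark.SPARK_VERSION)       // Spark build version string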

Any suggestions for a new guy very much appreciated.

I found [ZEPPELIN-2475] <https://issues.apache.org/jira/browse/ZEPPELIN-2475> 
and [ZEPPELIN-1560] <https://issues.apache.org/jira/browse/ZEPPELIN-1560>, 
which seem to be the same or a similar issue, but I did not understand what to 
change where.

This is from "Zeppelin Tutorial/Basic Features (Spark)".

java.lang.NullPointerException
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
