I'm glad to tell you that I resolved the issue.

It boiled down to this error message in the logs:

java.util.NoSuchElementException: next on empty iterator
 at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
 at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
 at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:64)
 at scala.collection.IterableLike$class.head(IterableLike.scala:91)
 at scala.collection.mutable.ArrayOps$ofRef.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:108)
 at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:120)
 at scala.collection.mutable.ArrayOps$ofRef.head(ArrayOps.scala:108)
 at scala.collection.TraversableLike$class.last(TraversableLike.scala:455)
 at scala.collection.mutable.ArrayOps$ofRef.scala$collection$IndexedSeqOptimized$$super$last(ArrayOps.scala:108)
 at scala.collection.IndexedSeqOptimized$class.last(IndexedSeqOptimized.scala:126)
 at scala.collection.mutable.ArrayOps$ofRef.last(ArrayOps.scala:108)
 at org.apache.spark.executor.Executor$$anonfun$createClassLoader$1.apply(Executor.scala:338)
 at org.apache.spark.executor.Executor$$anonfun$createClassLoader$1.apply(Executor.scala:337)
 at scala.collection.immutable.List.foreach(List.scala:318)
 at org.apache.spark.executor.Executor.createClassLoader(Executor.scala:337)
 at org.apache.spark.executor.Executor.<init>(Executor.scala:93)
 at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:58)
 at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:125)
 at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
 at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:343)
 at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:145)
 at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:469)
 at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
 at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
 at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
 at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:300)
 at org.apache.zeppelin.scheduler.Job.run(Job.java:169)
 at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:134)
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
 at java.util.concurrent.FutureTask.run(FutureTask.java:262)
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)

and I noticed that SPARK_HOME was not set in the conf/zeppelin-env.sh file.
After adding 'export SPARK_HOME=/usr/spark' to that file, Spark works fine
in Zeppelin.
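For anyone who hits the same trace: the relevant part of my
conf/zeppelin-env.sh now looks roughly like the sketch below. The
/usr/spark path is specific to my vagrant box, so adjust it to wherever
your Spark installation lives.

```shell
# conf/zeppelin-env.sh -- sourced by Zeppelin's startup scripts.
# Setting SPARK_HOME tells Zeppelin to use this local Spark
# installation rather than trying to locate one on its own.
# NOTE: /usr/spark is the path inside my vagrant box, not a default.
export SPARK_HOME=/usr/spark
```

After editing the file, restart Zeppelin (bin/zeppelin-daemon.sh restart) so the new environment is picked up.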

Bye,

Manuel



On Fr, 2016-03-04 at 10:25 +0100, Manuel Schölling wrote:
> Hi,
> 
> I am pretty new to Zeppelin and I'm trying to make it work using a
> vagrant setup [1,2].
> 
> However, when I try to execute some Java code from the tutorial in
> Zeppelin, the interface switches to 'Running' but stays at 0%.
> 
> When looking into the log files I see this NullPointerException:
>         
>         ERROR [2016-03-04 09:10:20,001] ({DefaultQuartzScheduler_Worker-3} JobRunShell.java[run]:211) - Job note.2AXWE6MRR threw an unhandled Exception: 
>         java.lang.NullPointerException
>               at org.apache.zeppelin.notebook.Note.runAll(Note.java:351)
>               at org.apache.zeppelin.notebook.Notebook$CronJob.execute(Notebook.java:417)
>               at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>               at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>         ERROR [2016-03-04 09:10:20,001] ({DefaultQuartzScheduler_Worker-3} QuartzScheduler.java[schedulerError]:2425) - Job (note.2AXWE6MRR threw an exception.
>         org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.NullPointerException]
>               at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>               at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>         Caused by: java.lang.NullPointerException
>               at org.apache.zeppelin.notebook.Note.runAll(Note.java:351)
>               at org.apache.zeppelin.notebook.Notebook$CronJob.execute(Notebook.java:417)
>               at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>               ... 1 more
> 
> I already updated to Spark v1.6.0, Hadoop v2.6, and Zeppelin v0.5.6 but
> the problem still exists.
> 
> The line [3] that triggers this problem is this:
> 
>   public void runAll() {
>     synchronized (paragraphs) {
>       for (Paragraph p : paragraphs) {
>         p.setNoteReplLoader(replLoader);
>         p.setListener(jobListenerFactory.getParagraphJobListener(this));
>         Interpreter intp = replLoader.get(p.getRequiredReplName());
>         intp.getScheduler().submit(p); // <<-- !! NullPointerException HERE !!
>       }
>     }
>   }
> 
> Any ideas how to fix this?
> 
> Thanks,
> 
> Manuel
> 
> 
> [1] https://github.com/arjones/vagrant-spark-zeppelin
> [2] http://arjon.es/2015/08/23/vagrant-spark-zeppelin-a-toolbox-to-the-data-analyst/
> [3] https://github.com/apache/incubator-zeppelin/blob/branch-0.5.6/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java#L351
> 
> 

-- 
Manuel Schölling
Deutsches Zentrum für Neurodegenerative Erkrankungen e.V.
Image and Data Analysis Facility

email: manuel.schoell...@dzne.de
phone: +49 228 43302 573

Room C.EG.4.08
Ludwig-Erhard-Allee 2
53175 Bonn


