Sending this to the dev mailing list since my attempt to send it to the users
mailing list failed with the following error.
--------
Hi. This is the qmail-send program at apache.org.
I'm afraid I wasn't able to deliver your message to the following addresses.
This is a permanent error; I've given up. Sorry it didn't work out.
<[email protected]>:
Must be sent from an @apache.org address or a subscriber address or an
address in LDAP.
--------
-Sachin
---------- Forwarded message ----------
From: Sachin Jain <[email protected]>
Date: Sat, Sep 3, 2016 at 2:43 PM
Subject: Unable to run spark in zeppelin
To: [email protected]
Hi,
I have done a fresh setup of Zeppelin, and when I try to run Spark
code (with Scala), I get the following error in the logs.
Please find the attached log file, which contains the error. I am also attaching my
notebook screenshot and the interpreter binding, in case they are helpful.
Use case: Until now I have run my code in the Spark shell, but I want to move to a
notebook experience where I can save my code snippets and run them later.
INFO [2016-09-03 14:26:16,774] ({pool-2-thread-5} Logging.scala[logInfo]:58) -
MemoryStore started with capacity 5.5 GB
INFO [2016-09-03 14:26:16,777] ({pool-2-thread-5} Logging.scala[logInfo]:58) -
Registering OutputCommitCoordinator
ERROR [2016-09-03 14:26:16,783] ({pool-2-thread-5} Job.java[run]:189) - Job
failed
java.lang.NoSuchMethodError: scala.runtime.VolatileByteRef.create(B)Lscala/runtime/VolatileByteRef;
at scala.xml.MetaData$.iterate$1(MetaData.scala:39)
at scala.xml.MetaData$.normalize(MetaData.scala:45)
at scala.xml.Elem.<init>(Elem.scala:99)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:57)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:55)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:55)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:57)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:195)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:146)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:473)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_1(SparkInterpreter.java:440)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:354)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:137)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
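One observation on the error above: `scala.runtime.VolatileByteRef.create(B)` only exists in Scala 2.11 and later, so this NoSuchMethodError usually points to a Scala binary-version mismatch, for example Zeppelin built against Scala 2.11 while a Scala 2.10 `scala-library` jar (the default for Spark 1.6.x binaries) is on the interpreter classpath. One way to spot the mismatch is to compare the `scala-library` jar versions on the Zeppelin and Spark classpaths. A minimal sketch (the jar filenames below are hypothetical examples, not taken from this setup):

```python
import re

def scala_binary_version(jar_name):
    """Extract the Scala binary version (e.g. '2.10' or '2.11') from a
    scala-library jar filename; returns None for other jars."""
    m = re.match(r"scala-library-(\d+\.\d+)\.\d+\.jar", jar_name)
    return m.group(1) if m else None

# Hypothetical jar names illustrating the suspected mismatch.
zeppelin_jar = "scala-library-2.11.7.jar"   # e.g. found under Zeppelin's lib/
spark_jar = "scala-library-2.10.5.jar"      # e.g. found under Spark's lib/

zv = scala_binary_version(zeppelin_jar)
sv = scala_binary_version(spark_jar)
if zv != sv:
    print("Scala binary version mismatch: %s vs %s" % (zv, sv))
```

If the two binary versions differ, rebuilding Zeppelin for the matching Scala profile (or using a Spark build for the same Scala version) should resolve it.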
INFO [2016-09-03 14:26:16,784] ({pool-2-thread-5}
SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1472892976287
finished by scheduler org.apache.zeppelin.spark.SparkInterpreter809859938
INFO [2016-09-03 14:29:40,349] ({pool-2-thread-10}
SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1472893180349
started by scheduler org.apache.zeppelin.spark.SparkInterpreter809859938
INFO [2016-09-03 14:29:40,370] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Changing view acls to: sachinjain
INFO [2016-09-03 14:29:40,370] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Changing modify acls to: sachinjain
INFO [2016-09-03 14:29:40,370] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(sachinjain); users with modify permissions: Set(sachinjain)
INFO [2016-09-03 14:29:40,373] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Starting HTTP Server
INFO [2016-09-03 14:29:40,375] ({pool-2-thread-10} Server.java[doStart]:272) -
jetty-8.y.z-SNAPSHOT
INFO [2016-09-03 14:29:40,376] ({pool-2-thread-10}
AbstractConnector.java[doStart]:338) - Started [email protected]:59946
INFO [2016-09-03 14:29:40,376] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Successfully started service 'HTTP class server' on port 59946.
INFO [2016-09-03 14:29:40,752] ({pool-2-thread-10}
SparkInterpreter.java[createSparkContext_1]:367) - ------ Create new
SparkContext local[*] -------
WARN [2016-09-03 14:29:40,753] ({pool-2-thread-10}
Logging.scala[logWarning]:70) - Another SparkContext is being constructed (or
threw an exception in its constructor). This may indicate an error, since only
one SparkContext may be running in this JVM (see SPARK-2243). The other
SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_1(SparkInterpreter.java:440)
org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:354)
org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:137)
org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
org.apache.zeppelin.scheduler.Job.run(Job.java:176)
org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
java.util.concurrent.FutureTask.run(FutureTask.java:262)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:745)
INFO [2016-09-03 14:29:40,754] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Running Spark version 1.6.1
INFO [2016-09-03 14:29:40,755] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Changing view acls to: sachinjain
INFO [2016-09-03 14:29:40,755] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Changing modify acls to: sachinjain
INFO [2016-09-03 14:29:40,755] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(sachinjain); users with modify permissions: Set(sachinjain)
INFO [2016-09-03 14:29:40,770] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Successfully started service 'sparkDriver' on port 59948.
INFO [2016-09-03 14:29:40,814]
({sparkDriverActorSystem-akka.actor.default-dispatcher-2}
Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
INFO [2016-09-03 14:29:40,817]
({sparkDriverActorSystem-akka.actor.default-dispatcher-5}
Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
INFO [2016-09-03 14:29:40,823]
({sparkDriverActorSystem-akka.actor.default-dispatcher-5}
Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on addresses
:[akka.tcp://[email protected]:59949]
INFO [2016-09-03 14:29:40,824] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Successfully started service 'sparkDriverActorSystem' on port 59949.
INFO [2016-09-03 14:29:40,825] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Registering MapOutputTracker
INFO [2016-09-03 14:29:40,826] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Registering BlockManagerMaster
INFO [2016-09-03 14:29:40,827] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Created local directory at
/private/var/folders/lk/zy2910y56s17f28nghgcjnp40000gn/T/blockmgr-545bad67-4029-44b7-9850-a36adcf4219b
INFO [2016-09-03 14:29:40,828] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- MemoryStore started with capacity 5.5 GB
INFO [2016-09-03 14:29:40,829] ({pool-2-thread-10} Logging.scala[logInfo]:58)
- Registering OutputCommitCoordinator
ERROR [2016-09-03 14:29:40,830] ({pool-2-thread-10} Job.java[run]:189) - Job
failed
java.lang.NoSuchMethodError: scala.runtime.VolatileByteRef.create(B)Lscala/runtime/VolatileByteRef;
at scala.xml.MetaData$.iterate$1(MetaData.scala:39)
at scala.xml.MetaData$.normalize(MetaData.scala:45)
at scala.xml.Elem.<init>(Elem.scala:99)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:57)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:55)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:55)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:57)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:195)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:146)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:473)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_1(SparkInterpreter.java:440)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:354)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:137)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
INFO [2016-09-03 14:29:40,832] ({pool-2-thread-10}
SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1472893180349
finished by scheduler org.apache.zeppelin.spark.SparkInterpreter809859938
INFO [2016-09-03 14:29:57,954] ({pool-2-thread-7}
SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1472893197953
started by scheduler org.apache.zeppelin.spark.SparkInterpreter809859938
INFO [2016-09-03 14:29:57,973] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Changing view acls to: sachinjain
INFO [2016-09-03 14:29:57,973] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Changing modify acls to: sachinjain
INFO [2016-09-03 14:29:57,973] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(sachinjain); users with modify permissions: Set(sachinjain)
INFO [2016-09-03 14:29:57,976] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Starting HTTP Server
INFO [2016-09-03 14:29:57,979] ({pool-2-thread-7} Server.java[doStart]:272) -
jetty-8.y.z-SNAPSHOT
INFO [2016-09-03 14:29:57,983] ({pool-2-thread-7}
AbstractConnector.java[doStart]:338) - Started [email protected]:59950
INFO [2016-09-03 14:29:57,983] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Successfully started service 'HTTP class server' on port 59950.
INFO [2016-09-03 14:29:58,353] ({pool-2-thread-7}
SparkInterpreter.java[createSparkContext_1]:367) - ------ Create new
SparkContext local[*] -------
WARN [2016-09-03 14:29:58,355] ({pool-2-thread-7}
Logging.scala[logWarning]:70) - Another SparkContext is being constructed (or
threw an exception in its constructor). This may indicate an error, since only
one SparkContext may be running in this JVM (see SPARK-2243). The other
SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_1(SparkInterpreter.java:440)
org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:354)
org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:137)
org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
org.apache.zeppelin.scheduler.Job.run(Job.java:176)
org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
java.util.concurrent.FutureTask.run(FutureTask.java:262)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:745)
INFO [2016-09-03 14:29:58,355] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Running Spark version 1.6.1
INFO [2016-09-03 14:29:58,356] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Changing view acls to: sachinjain
INFO [2016-09-03 14:29:58,356] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Changing modify acls to: sachinjain
INFO [2016-09-03 14:29:58,357] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(sachinjain); users with modify permissions: Set(sachinjain)
INFO [2016-09-03 14:29:58,369] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Successfully started service 'sparkDriver' on port 59951.
INFO [2016-09-03 14:29:58,417]
({sparkDriverActorSystem-akka.actor.default-dispatcher-3}
Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
INFO [2016-09-03 14:29:58,422]
({sparkDriverActorSystem-akka.actor.default-dispatcher-5}
Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
INFO [2016-09-03 14:29:58,427]
({sparkDriverActorSystem-akka.actor.default-dispatcher-5}
Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on addresses
:[akka.tcp://[email protected]:59952]
INFO [2016-09-03 14:29:58,427] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Successfully started service 'sparkDriverActorSystem' on port 59952.
INFO [2016-09-03 14:29:58,428] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Registering MapOutputTracker
INFO [2016-09-03 14:29:58,430] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Registering BlockManagerMaster
INFO [2016-09-03 14:29:58,431] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Created local directory at
/private/var/folders/lk/zy2910y56s17f28nghgcjnp40000gn/T/blockmgr-0dfa9e96-2a99-4648-9cf1-2345414a16de
INFO [2016-09-03 14:29:58,431] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
MemoryStore started with capacity 5.5 GB
INFO [2016-09-03 14:29:58,433] ({pool-2-thread-7} Logging.scala[logInfo]:58) -
Registering OutputCommitCoordinator
ERROR [2016-09-03 14:29:58,434] ({pool-2-thread-7} Job.java[run]:189) - Job
failed
java.lang.NoSuchMethodError: scala.runtime.VolatileByteRef.create(B)Lscala/runtime/VolatileByteRef;
at scala.xml.MetaData$.iterate$1(MetaData.scala:39)
at scala.xml.MetaData$.normalize(MetaData.scala:45)
at scala.xml.Elem.<init>(Elem.scala:99)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:57)
at org.apache.spark.ui.jobs.StagePage$$anonfun$26.apply(StagePage.scala:55)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:55)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:57)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:195)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:146)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:473)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_1(SparkInterpreter.java:440)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:354)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:137)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
INFO [2016-09-03 14:29:58,435] ({pool-2-thread-7}
SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1472893197953
finished by scheduler org.apache.zeppelin.spark.SparkInterpreter809859938