[
https://issues.apache.org/jira/browse/TOREE-406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Raffaele Saggino updated TOREE-406:
-----------------------------------
Description:
Hello, I've been working on running Toree on Windows but have encountered
some problems.
I created a run.cmd (a Windows counterpart to the shipped run.sh) with the following code:
{code:none}
@echo off
rem Resolve the Toree installation directory (the parent of this script's folder).
pushd "%~dp0\..\"
set PROG_HOME=%cd%
popd

if not defined SPARK_HOME (
  echo "SPARK_HOME must be set to the location of a Spark distribution!"
  exit /b 1
)

echo "Starting Spark Kernel with SPARK_HOME=%SPARK_HOME%"

rem Locate the Toree assembly jar under lib\.
pushd "%PROG_HOME%\lib"
for /f %%i in ('dir /B toree-assembly-*.jar') do set KERNEL_ASSEMBLY=%%i
popd

set PYTHONHASHSEED=0
set SPARK_SUBMIT_OPTS=-Dscala.usejavacp=true
set TOREE_ASSEMBLY=%PROG_HOME%\lib\%KERNEL_ASSEMBLY%

rem Fall back to the options passed through kernel.json's env block.
if not defined SPARK_OPTS (
  if defined __TOREE_SPARK_OPTS__ (
    set SPARK_OPTS=%__TOREE_SPARK_OPTS__%
  )
)
if not defined TOREE_OPTS (
  if defined __TOREE_OPTS__ (
    set TOREE_OPTS=%__TOREE_OPTS__%
  )
)

"%SPARK_HOME%\bin\spark-submit" %SPARK_OPTS% --class org.apache.toree.Main %TOREE_ASSEMBLY% %TOREE_OPTS% %*
{code}
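As a quick check outside Jupyter, the script can also be run by hand against a connection file that Jupyter has already written (the paths below are illustrative placeholders, not my actual setup):
{code:none}
REM Hypothetical manual test: point SPARK_HOME at a local Spark distribution and
REM start the kernel with an existing Jupyter connection file.
set SPARK_HOME=C:\spark-2.1.0-bin-hadoop2.7
C:\path\to\toree\bin\run.cmd --profile C:\Users\me\AppData\Roaming\jupyter\runtime\kernel-test.json
{code}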
After also editing kernel.json accordingly (pointing its argv at run.cmd instead of run.sh), the kernel starts.
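For reference, the edited kernel.json looks roughly like this (the paths are placeholders rather than my exact ones):
{code:none}
{
  "display_name": "Apache Toree - Scala",
  "language": "scala",
  "argv": [
    "C:\\path\\to\\toree\\bin\\run.cmd",
    "--profile",
    "{connection_file}"
  ],
  "env": {
    "DEFAULT_INTERPRETER": "Scala",
    "__TOREE_SPARK_OPTS__": "",
    "__TOREE_OPTS__": "",
    "SPARK_HOME": "C:\\spark-2.1.0-bin-hadoop2.7",
    "PYTHON_EXEC": "python"
  }
}
{code}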
I ran a few tests in Jupyter with the Spark Scala kernel: Scala itself works, but there is no Spark context; the same happens with the PySpark kernel.
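The tests were along these lines in a Scala notebook cell (a minimal sketch, not my exact code):
{code:none}
// Plain Scala evaluates fine:
println(1 + 1)

// The Spark side does not; there is no usable context:
println(sc)                      // expected something like org.apache.spark.SparkContext@...
sc.parallelize(1 to 10).sum()    // expected 55.0
{code}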
This is the log:
{code:none}
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
17/04/26 12:09:30 [INFO] o.a.t.Main$$anon$1 - Kernel version:
0.2.0.dev1-incubating-SNAPSHOT
17/04/26 12:09:30 [INFO] o.a.t.Main$$anon$1 - Scala version: 2.11.8
17/04/26 12:09:30 [INFO] o.a.t.Main$$anon$1 - ZeroMQ (JeroMQ) version: 3.2.5
17/04/26 12:09:30 [INFO] o.a.t.Main$$anon$1 - Initializing internal actor system
17/04/26 12:09:32 [INFO] o.a.t.Main$$anon$1 - Connection Profile: {
"stdin_port" : 55862,
"control_port" : 55863,
"hb_port" : 55864,
"shell_port" : 55860,
"iopub_port" : 55861,
"ip" : "127.0.0.1",
"transport" : "tcp",
"signature_scheme" : "hmac-sha256",
"key" : "------"
}
(Scala,org.apache.toree.kernel.interpreter.scala.ScalaInterpreter@2c104774)
(PySpark,org.apache.toree.kernel.interpreter.pyspark.PySparkInterpreter@2cb3d0f7)
(SparkR,org.apache.toree.kernel.interpreter.sparkr.SparkRInterpreter@4e517165)
(SQL,org.apache.toree.kernel.interpreter.sql.SqlInterpreter@44e3760b)
17/04/26 12:09:35 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Not ready for
messages! Stashing until ready!
[W 12:09:35.652 NotebookApp] Timeout waiting for kernel_info reply from
08b776d4-38fa-47c5-a0ea-789e7999ef3a
17/04/26 12:09:35 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Not ready for
messages! Stashing until ready!
17/04/26 12:09:35 [WARN] o.a.t.k.p.v.k.s.Shell - Parent header is null for
message 9D7541B7013C44FCB036F4682FB3D1FD of type comm_info_request
17/04/26 12:09:35 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Not ready for
messages! Stashing until ready!
17/04/26 12:09:35 [WARN] o.a.t.k.p.v.k.s.Shell - Parent header is null for
message CC1040D4E13F4F6086A454060E1E0B3F of type comm_open
17/04/26 12:09:35 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Not ready for
messages! Stashing until ready!
17/04/26 12:09:35 [WARN] o.a.t.k.p.v.k.s.Shell - Parent header is null for
message 998BACAA64D940618A6694DE5E8CC9C5 of type comm_open
17/04/26 12:09:35 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Not ready for
messages! Stashing until ready!
17/04/26 12:09:38 [WARN] o.a.t.Main$$anon$1 - No external magics provided to
PluginManager!
17/04/26 12:09:43 [INFO] o.a.t.Main$$anon$1 - 13 internal plugins loaded
17/04/26 12:09:43 [INFO] o.a.t.Main$$anon$1 - 0 external plugins loaded
17/04/26 12:09:43 [INFO] o.a.t.b.l.StandardComponentInitialization$$anon$1 -
Utilizing deploy mode: client
17/04/26 12:09:43 [WARN] o.a.t.b.l.StandardComponentInitialization$$anon$1 -
Locked to Scala interpreter with SparkIMain until decoupled!
17/04/26 12:09:43 [WARN] o.a.t.b.l.StandardComponentInitialization$$anon$1 -
Unable to control initialization of REPL class server!
17/04/26 12:09:47 [INFO] o.a.t.b.l.StandardComponentInitialization$$anon$1 -
Connecting to spark.master local[*]
17/04/26 12:09:56 [WARN] o.a.t.k.i.s.ScalaInterpreter - kernel variable:
org.apache.toree.boot.layer.StandardComponentInitialization$$anon$1@363f5a2
17/04/26 12:09:56 [WARN] o.a.t.k.i.s.ScalaInterpreter - Binding List(@transient
implicit) kernel org.apache.toree.kernel.api.Kernel
org.apache.toree.boot.layer.StandardComponentInitialization$$anon$1@363f5a2
17/04/26 12:09:56 [ERROR] o.a.t.k.i.s.ScalaInterpreter - Set failed in
bind(kernel, org.apache.toree.kernel.api.Kernel,
org.apache.toree.boot.layer.StandardComponentInitialization$$anon$1@363f5a2)
17/04/26 12:09:56 [ERROR] o.a.t.k.i.s.ScalaInterpreter -
scala.tools.nsc.interpreter.IMain$ReadEvalPrint$EvalException: Failed to load
'$line3.$eval': $line3.$eval
at
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.evalError(IMain.scala:796)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.load(IMain.scala:800)
at
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.evalClass$lzycompute(IMain.scala:803)
at
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.evalClass(IMain.scala:803)
at
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.evalMethod(IMain.scala:848)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:781)
at
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.callEither(IMain.scala:790)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreterSpecific$class.bind(ScalaInterpreterSpecific.scala:156)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.bind(ScalaInterpreter.scala:44)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$bindKernelVariable$1.apply$mcV$sp(ScalaInterpreter.scala:142)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$bindKernelVariable$1.apply(ScalaInterpreter.scala:142)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$bindKernelVariable$1.apply(ScalaInterpreter.scala:142)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreterSpecific$class.doQuietly(ScalaInterpreterSpecific.scala:179)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.doQuietly(ScalaInterpreter.scala:44)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.bindKernelVariable(ScalaInterpreter.scala:140)
at
org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.init(ScalaInterpreter.scala:87)
at
org.apache.toree.boot.layer.InterpreterManager$$anonfun$initializeInterpreters$1.apply(InterpreterManager.scala:35)
at
org.apache.toree.boot.layer.InterpreterManager$$anonfun$initializeInterpreters$1.apply(InterpreterManager.scala:34)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at
scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
at
org.apache.toree.boot.layer.InterpreterManager.initializeInterpreters(InterpreterManager.scala:34)
at
org.apache.toree.boot.layer.StandardComponentInitialization$class.initializeComponents(ComponentInitialization.scala:88)
at org.apache.toree.Main$$anon$1.initializeComponents(Main.scala:35)
at
org.apache.toree.boot.KernelBootstrap.initialize(KernelBootstrap.scala:100)
at
org.apache.toree.Main$.delayedEndpoint$org$apache$toree$Main$1(Main.scala:40)
at org.apache.toree.Main$delayedInit$body.apply(Main.scala:24)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at org.apache.toree.Main$.main(Main.scala:24)
at org.apache.toree.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: $line3.$eval
at
scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.load(IMain.scala:799)
... 44 more
17/04/26 12:09:56 [INFO] o.a.t.k.i.s.ScalaInterpreter - Binding SparkContext
into interpreter as sc
17/04/26 12:10:03 [INFO] o.a.t.Main$$anon$1 - Marking relay as ready for
receiving messages
17/04/26 12:10:03 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Unstashing all
messages received!
17/04/26 12:10:03 [INFO] o.a.t.k.p.v.r.KernelMessageRelay - Relay is now fully
ready to receive messages!
17/04/26 12:10:04 [WARN] o.a.t.k.p.v.h.CommOpenHandler - Received invalid
target for Comm Open: jupyter.widget.version
17/04/26 12:10:04 [WARN] o.a.t.k.p.v.h.CommOpenHandler - Received invalid
target for Comm Open: jupyter.widget.version
17/04/26 12:10:04 [WARN] o.a.t.k.p.v.k.s.Shell - Parent header is null for
message 8329B7BBB4A44B69A1B7AD4917EFBB6A of type comm_info_request
[I 12:11:25.101 NotebookApp] Saving file at /Untitled3.ipynb
17/04/26 12:14:23 [WARN] o.a.t.k.p.v.s.KernelOutputStream - Suppressing empty
output: ''
17/04/26 12:14:42 [WARN] o.a.t.k.p.v.s.KernelOutputStream - Suppressing empty
output: ''
17/04/26 12:16:23 [WARN] o.a.t.k.p.v.s.KernelOutputStream - Suppressing empty
output: ''
17/04/26 12:16:57 [WARN] o.a.t.k.p.v.s.KernelOutputStream - Suppressing empty
output: ''
17/04/26 12:18:51 [WARN] o.a.t.k.p.v.s.KernelOutputStream - Suppressing empty
output: ''
[I 12:20:45.754 NotebookApp] Saving file at /Untitled3.ipynb
17/04/26 12:37:00 [INFO] o.a.t.Main$$anon$1 - Resetting code execution!
17/04/26 12:37:00 [INFO] o.a.t.Main$$anon$1 - Enter Ctrl-C twice to shutdown!
[I 12:37:01.575 NotebookApp] Interrupted...
[I 12:37:01.610 NotebookApp] Shutting down kernels
17/04/26 12:37:01 [INFO] o.a.t.Main$$anon$1 - Shutting down kernel
17/04/26 12:37:01 [INFO] o.a.t.Main$$anon$1 - Shutting down interpreters
17/04/26 12:37:01 [INFO] o.a.t.k.i.s.ScalaInterpreter - Shutting down
interpreter
17/04/26 12:37:01 [INFO] o.a.t.Main$$anon$1 - Shutting down actor system
17/04/26 12:37:02 [INFO] o.a.t.Main$$anon$1 - Shutting down interpreters
17/04/26 12:37:02 [INFO] o.a.t.k.i.s.ScalaInterpreter - Shutting down
interpreter
17/04/26 12:37:02 [INFO] o.a.t.Main$$anon$1 - Shutting down actor system
[I 12:37:02.682 NotebookApp] Kernel shutdown:
{code}
Any hints on solving this?
> Toree on Windows
> ----------------
>
> Key: TOREE-406
> URL: https://issues.apache.org/jira/browse/TOREE-406
> Project: TOREE
> Issue Type: Improvement
> Affects Versions: 0.2.0
> Environment: Windows 10, Scala 2.11, Python 3.6, Spark 2.1
> Reporter: Raffaele Saggino
> Labels: windows
> Original Estimate: 1,176h
> Remaining Estimate: 1,176h
>
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)