Tijo Thomas created SPARK-6928:
----------------------------------
Summary: spark-shell stops working after the replay command
Key: SPARK-6928
URL: https://issues.apache.org/jira/browse/SPARK-6928
Project: Spark
Issue Type: Bug
Components: Spark Shell
Affects Versions: 1.3.0
Environment: Scala Version: Scala 2.10
Reporter: Tijo Thomas
Steps to reproduce this issue:
Step 1:
scala> sc.parallelize(1 to 10).map(_+"2").count();
res0: Long = 10
Step 2:
scala> :replay
Replaying: sc.parallelize(1 to 10).map(_+"2").count();
<console>:8: error: not found: value sc
              sc.parallelize(1 to 10).map(_+"2").count();
              ^
// Note: After the :replay command, none of the Spark APIs work, because the
SparkContext (sc) has gone out of scope.
For example, even exiting the shell raises the exception given below:
scala> exit
error:
while compiling: <console>
during phase: jvm
library version: version 2.10.4
compiler version: version 2.10.4
reconstructed args:
last tree to typer: Apply(constructor $read)
symbol: constructor $read in class $read (flags: <method> <triedcooking>)
symbol definition: def <init>(): $line20.$read
tpe: $line20.$read
symbol owners: constructor $read -> class $read -> package $line20
context owners: class iwC -> package $line20
............
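
A possible interim workaround, until :replay restores the shell bindings, is to
create a fresh SparkContext by hand inside the broken session. This is a minimal
sketch, assuming a local-mode shell; the app name, master URL, and the
allowMultipleContexts setting are illustrative, not part of the reported
behavior (the old context may still be alive in the JVM even though the sc
binding is gone, which would otherwise make the second constructor call fail):

scala> import org.apache.spark.{SparkConf, SparkContext}
scala> val conf = new SparkConf()
     |   .setAppName("repl-recovery")
     |   .setMaster("local[*]")
     |   .set("spark.driver.allowMultipleContexts", "true") // old context may still be running
scala> val sc = new SparkContext(conf)
scala> sc.parallelize(1 to 10).map(_ + "2").count()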