[ https://issues.apache.org/jira/browse/SPARK-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14203183#comment-14203183 ]
Corey J. Nolet commented on SPARK-4289:
---------------------------------------
I suppose we could look at it as a Hadoop issue, since constructing a Job works
fine as long as the Scala shell doesn't call toString() on it. I'd have to dig
in deeper to find out why the state checks behave differently between the
constructor and toString(), and even more importantly, why toString() cares
about the job's state at all...
I think :silent will work for the short term.
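For illustration, here is a minimal self-contained sketch of the pattern behind the failure (the class and method names here are stand-ins that mimic Hadoop's ensureState/toString behavior, not the actual Hadoop source): construction leaves the job in DEFINE state, and toString() demands RUNNING state, so the REPL's implicit call to toString() throws even though the constructor itself succeeded.

```java
// Hypothetical sketch mimicking Hadoop's Job state check; not Hadoop code.
public class JobStateSketch {
    enum JobState { DEFINE, RUNNING }

    static class Job {
        // A freshly constructed job starts in DEFINE state.
        private final JobState state = JobState.DEFINE;

        private void ensureState(JobState required) {
            if (state != required) {
                throw new IllegalStateException(
                    "Job in state " + state + " instead of " + required);
            }
        }

        @Override
        public String toString() {
            // toString() insists the job is RUNNING, so it throws for a new job.
            ensureState(JobState.RUNNING);
            return "Job (running)";
        }
    }

    public static void main(String[] args) {
        Job job = new Job(); // construction succeeds: state is DEFINE
        try {
            System.out.println(job.toString()); // the REPL does this implicitly
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

In the shell, :silent suppresses the REPL's result printing, so toString() is never invoked and the assignment to the val goes through.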
> Creating an instance of Hadoop Job fails in the Spark shell when toString()
> is called on the instance.
> ------------------------------------------------------------------------------------------------------
>
> Key: SPARK-4289
> URL: https://issues.apache.org/jira/browse/SPARK-4289
> Project: Spark
> Issue Type: Bug
> Reporter: Corey J. Nolet
>
> This one is easy to reproduce.
> <pre>val job = new Job(sc.hadoopConfiguration)</pre>
> I'm not sure what the solution would be offhand, as the failure happens when
> the shell calls toString() on the Job instance. The problem is that, because
> of the failure, the instance is never actually assigned to the job val.
> java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
> at org.apache.hadoop.mapreduce.Job.ensureState(Job.java:283)
> at org.apache.hadoop.mapreduce.Job.toString(Job.java:452)
> at scala.runtime.ScalaRunTime$.scala$runtime$ScalaRunTime$$inner$1(ScalaRunTime.scala:324)
> at scala.runtime.ScalaRunTime$.stringOf(ScalaRunTime.scala:329)
> at scala.runtime.ScalaRunTime$.replStringOf(ScalaRunTime.scala:337)
> at .<init>(<console>:10)
> at .<clinit>(<console>)
> at $print(<console>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
> at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
> at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
> at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:616)
> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:624)
> at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:629)
> at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:954)
> at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
> at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
> at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
> at org.apache.spark.repl.Main$.main(Main.scala:31)
> at org.apache.spark.repl.Main.main(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)