Hi Team,

I'm trying to run Spark from `sbt console` in the terminal. The project builds successfully with build.sbt, and the following snippet runs fine in IntelliJ. The only issue is that when I run the same code from the terminal, the Executor keeps running and never completes the task. I can't tell whether it is failing to get the resources it needs or whether something else is blocking it.
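For context, my build.sbt is minimal, roughly along these lines (the exact Scala and Spark version numbers here are approximate, so treat them as assumptions):

```
// build.sbt (sketch; version numbers are approximate, not copied from my file)
scalaVersion := "2.12.15"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.1"
```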
Here's the code:
```

import org.apache.spark._

val sc = new SparkContext("local[1]", "SimpleProg")
val nums = sc.parallelize(List(1, 2, 3, 4))
println(nums.reduce((a, b) => a - b))

```
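For comparison, a standalone-app version of the same snippet, which is the kind of setup that runs fine for me in IntelliJ (the object name is just illustrative):

```
import org.apache.spark._

// Sketch of the same logic packaged as an application instead of
// being pasted into the `sbt console` REPL.
object SimpleProg {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[1]", "SimpleProg")
    val nums = sc.parallelize(List(1, 2, 3, 4))
    // reduce with subtraction is order-sensitive in general, but with a
    // single partition on local[1] it folds left and prints -8
    println(nums.reduce((a, b) => a - b))
    sc.stop()
  }
}
```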

Below are the errors that show up when I manually stop the program with `Ctrl+C`:
```
23/04/16 19:04:39 INFO SparkContext: Starting job: reduce at <console>:17
23/04/16 19:04:39 INFO DAGScheduler: Got job 2 (reduce at <console>:17) with 1 output partitions
23/04/16 19:04:39 INFO DAGScheduler: Final stage: ResultStage 2 (reduce at <console>:17)
23/04/16 19:04:39 INFO DAGScheduler: Parents of final stage: List()
23/04/16 19:04:39 INFO DAGScheduler: Missing parents: List()
23/04/16 19:04:39 INFO BlockManagerInfo: Removed broadcast_2_piece0 on 192.168.0.120:50005 in memory (size: 2.3 KiB, free: 434.4 MiB)
23/04/16 19:04:39 INFO DAGScheduler: Submitting ResultStage 2 (ParallelCollectionRDD[0] at parallelize at <console>:16), which has no missing parents
23/04/16 19:04:39 INFO BlockManagerInfo: Removed broadcast_0_piece0 on 192.168.0.120:50005 in memory (size: 1279.0 B, free: 434.4 MiB)
23/04/16 19:04:39 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 2.2 KiB, free 434.3 MiB)
23/04/16 19:04:39 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 1389.0 B, free 434.3 MiB)
23/04/16 19:04:39 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on 192.168.0.120:50005 (size: 1389.0 B, free: 434.4 MiB)
23/04/16 19:04:39 INFO SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1200
23/04/16 19:04:39 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (ParallelCollectionRDD[0] at parallelize at <console>:16) (first 15 tasks are for partitions Vector(0))
23/04/16 19:04:39 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
23/04/16 19:04:39 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, 192.168.0.120, executor driver, partition 0, PROCESS_LOCAL, 7290 bytes)
23/04/16 19:04:39 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)

[warn] Canceling execution...
[warn] Run canceled.
23/04/16 19:05:30 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2)
java.lang.NoClassDefFoundError: Could not initialize class $line9.$read$$iw$$iw$$iw$$iw$
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at java.base/java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:237)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at java.base/java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1250)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2096)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
    at java.base/java.io.ObjectInputStream.readArray(ObjectInputStream.java:1993)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1588)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
    at java.base/java.io.ObjectInputStream.readArray(ObjectInputStream.java:1993)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1588)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
    at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
23/04/16 19:05:30 WARN FileSystem: exception in the cleaner thread but it will continue to run
java.lang.InterruptedException
    at java.base/java.lang.Object.wait(Native Method)
    at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
    at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:176)
    at org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3063)
    at java.base/java.lang.Thread.run(Thread.java:834)
23/04/16 19:05:30 ERROR Utils: uncaught error in thread spark-listener-group-executorManagement, stopping SparkContext
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
    at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
    at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:110)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
23/04/16 19:05:30 ERROR Utils: uncaught error in thread spark-listener-group-appStatus, stopping SparkContext
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
    at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
    at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:110)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
23/04/16 19:05:30 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-appStatus
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
    at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
    at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:110)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
23/04/16 19:05:30 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at java.base/java.lang.Object.wait(Native Method)
    at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
    at org.apache.spark.ContextCleaner.$anonfun$keepCleaning$1(ContextCleaner.scala:182)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
    at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:180)
    at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:77)
23/04/16 19:05:30 INFO SparkContext: SparkContext already stopped.
23/04/16 19:05:30 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-executorManagement
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
    at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
    at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:110)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
    at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
23/04/16 19:05:30 INFO SparkUI: Stopped Spark web UI at http://192.168.0.120:4040
java.lang.InterruptedException
  at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1040)
  at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1345)
  at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
  at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:335)
  at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:743)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2093)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2188)
  at org.apache.spark.rdd.RDD.$anonfun$reduce$1(RDD.scala:1094)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:388)
  at org.apache.spark.rdd.RDD.reduce(RDD.scala:1076)
  ... 36 elided
23/04/16 19:05:30 INFO DAGScheduler: ResultStage 2 (reduce at <console>:17) failed in 51.653 s due to Stage cancelled because SparkContext was shut down
```