Assuming the batches are just slow and queued up (if the job makes no progress at all, something else is wrong with it), you can usually improve the speed by increasing the number of executors, cores, or memory. It's a bit of trial and error plus observing how the job behaves.

To avoid the queue building up, each batch needs to be processed faster than the batch interval, which might not always be possible.
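For illustration only, a rough sketch of what I mean, not your actual job: the config keys assume YARN (they map to --num-executors / --executor-cores / --executor-memory on spark-submit), the values and app name are made up, and Seconds(120) is just a guess from the 2-minute spacing of the batch times below. The listener prints the same scheduling delay / processing time numbers the Streaming tab shows, which is the "observing how the job behaves" part:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

object StreamingTuningSketch {
  def main(args: Array[String]): Unit = {
    // Resource knobs to try one at a time (YARN config keys; values are examples only).
    val conf = new SparkConf()
      .setAppName("streaming-tuning-sketch")   // made-up name
      .set("spark.executor.instances", "4")    // number of executors
      .set("spark.executor.cores", "2")        // cores per executor
      .set("spark.executor.memory", "4g")      // memory per executor

    // Each batch has to finish within this interval, otherwise batches queue up.
    val ssc = new StreamingContext(conf, Seconds(120))

    // Log scheduling delay and processing time per batch (same numbers as the
    // Streaming tab) to see whether processing keeps up with the interval.
    ssc.addStreamingListener(new StreamingListener {
      override def onBatchCompleted(completed: StreamingListenerBatchCompleted): Unit = {
        val info = completed.batchInfo
        println(s"batch=${info.batchTime} " +
          s"schedulingDelay=${info.schedulingDelay.getOrElse(-1L)} ms " +
          s"processingTime=${info.processingDelay.getOrElse(-1L)} ms")
      }
    })

    // ... define the DStream pipeline here, then:
    // ssc.start()
    // ssc.awaitTermination()
  }
}

If processing time stays above the batch interval even after adding resources, the interval itself (or the amount of work done per batch) has to change.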
Gonzalo

On 24 February 2016 at 03:47, Sutanu Das <[email protected]> wrote:

> We are running Spark 1.4.1 and our Streaming job is getting slow/queued
> after 12 hours every day. We thought it was because of YARN, but this is
> happening even in local mode.
>
> 1. How do we DEBUG this?
>
> 2. Here is what is getting queued:
>
> Active Batches (38)
>
> Batch Time            Input Size    Scheduling Delay (?)   Processing Time (?)   Status
> 2016/02/24 03:44:00   8512 events   -                      -                     queued
> 2016/02/24 03:42:00   8512 events   -                      -                     queued
> 2016/02/24 03:40:00   8512 events   -                      -                     queued
> 2016/02/24 03:38:00   8512 events   -                      -                     queued
>
> 3. Here is the entire thread dump of the executor -
> http://server:4040/executors/threadDump/?executorId=driver
>
> Thread dump for executor driver
> Updated at 2016/02/24 03:34:13
>
> Thread 1: main (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > org.apache.spark.streaming.ContextWaiter.waitForStopOrError(ContextWaiter.scala:63) > > org.apache.spark.streaming.StreamingContext.awaitTermination(StreamingContext.scala:623) > > airwaveApList$.main(airwaveApList.scala:137) > > airwaveApList.main(airwaveApList.scala) > > sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > > java.lang.reflect.Method.invoke(Method.java:497) > > org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665) > > org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170) > > org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193) > > org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112) > > org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) > > Thread 2: Reference Handler (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.lang.ref.Reference$ReferenceHandler.run(Reference.java:157) > > Thread 3: Finalizer (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143) > > java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164) > > java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209) > > Thread 4: Signal Dispatcher (RUNNABLE) > > Thread 12: SparkListenerBus (WAITING) > > sun.misc.Unsafe.park(Native Method) > > 
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) > > java.util.concurrent.Semaphore.acquire(Semaphore.java:312) > > org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:65) > > org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215) > > org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63) > > Thread 16: sparkDriver-scheduler-1 (TIMED_WAITING) > > java.lang.Thread.sleep(Native Method) > > akka.actor.LightArrayRevolverScheduler.waitNanos(Scheduler.scala:226) > > akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:405) > > akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375) > > java.lang.Thread.run(Thread.java:745) > > Thread 17: sparkDriver-akka.actor.default-dispatcher-2 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 18: sparkDriver-akka.actor.default-dispatcher-3 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 19: sparkDriver-akka.actor.default-dispatcher-4 (RUNNABLE) > > java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:946) > > java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:323) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1134) > > java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548) > > java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509) > > java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178) > > java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174) > > java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548) > > java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509) > > java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178) > > java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548) > > java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441) > > org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$writeObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:59) > > org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239) > > org.apache.spark.rdd.ParallelCollectionPartition.writeObject(ParallelCollectionRDD.scala:51) > > sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source) > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > > java.lang.reflect.Method.invoke(Method.java:497) > > 
java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988) > > java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496) > > java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178) > > java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548) > > java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509) > > java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432) > > java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178) > > java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348) > > org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44) > > org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:81) > > org.apache.spark.scheduler.Task$.serializeWithDependencies(Task.scala:168) > > org.apache.spark.scheduler.TaskSetManager.resourceOffer(TaskSetManager.scala:467) > > org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$org$apache$spark$scheduler$TaskSchedulerImpl$$resourceOfferSingleTaskSet$1.apply$mcVI$sp(TaskSchedulerImpl.scala:231) > > scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) > > org.apache.spark.scheduler.TaskSchedulerImpl.org$apache$spark$scheduler$TaskSchedulerImpl$$resourceOfferSingleTaskSet(TaskSchedulerImpl.scala:226) > > org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$3$$anonfun$apply$6.apply(TaskSchedulerImpl.scala:295) > > org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$3$$anonfun$apply$6.apply(TaskSchedulerImpl.scala:293) > > scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) > > scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108) > > org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$3.apply(TaskSchedulerImpl.scala:293) > > org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$3.apply(TaskSchedulerImpl.scala:293) > > scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) > > scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) > > org.apache.spark.scheduler.TaskSchedulerImpl.resourceOffers(TaskSchedulerImpl.scala:293) > > org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalBackend.scala:79) > > org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalBackend.scala:64) > > org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$processMessage(AkkaRpcEnv.scala:178) > > org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1$$anonfun$receiveWithLogging$1$$anonfun$applyOrElse$4.apply$mcV$sp(AkkaRpcEnv.scala:127) > > org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$safelyCall(AkkaRpcEnv.scala:198) > > org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1$$anonfun$receiveWithLogging$1.applyOrElse(AkkaRpcEnv.scala:126) > > scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33) > > scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33) > > scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25) > > org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:59) > > org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42) > > 
scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118) > > org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42) > > akka.actor.Actor$class.aroundReceive(Actor.scala:465) > > org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1.aroundReceive(AkkaRpcEnv.scala:93) > > akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) > > akka.actor.ActorCell.invoke(ActorCell.scala:487) > > akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) > > akka.dispatch.Mailbox.run(Mailbox.scala:220) > > akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) > > scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) > > scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 20: sparkDriver-akka.actor.default-dispatcher-5 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 21: sparkDriver-akka.actor.default-dispatcher-6 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 24: New I/O worker #1 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) > > org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) > > org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 25: New I/O worker #2 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) 
> > org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) > > org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 27: New I/O boss #3 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:42) > > org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 28: New I/O worker #4 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) > > org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) > > org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 29: New I/O worker #5 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.jboss.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) > > org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) > > 
org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 30: New I/O server boss #6 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101) > > org.jboss.netty.channel.socket.nio.NioServerBoss.select(NioServerBoss.java:163) > > org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212) > > org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42) > > org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) > > org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 31: MAP_OUTPUT_TRACKER cleanup timer (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.util.TimerThread.mainLoop(Timer.java:526) > > java.util.TimerThread.run(Timer.java:505) > > Thread 32: BLOCK_MANAGER cleanup timer (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.util.TimerThread.mainLoop(Timer.java:526) > > java.util.TimerThread.run(Timer.java:505) > > Thread 33: BROADCAST_VARS cleanup timer (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.util.TimerThread.mainLoop(Timer.java:526) > > java.util.TimerThread.run(Timer.java:505) > > Thread 35: qtp15892131-35 Acceptor0 [email protected]:32927 > (RUNNABLE) > > java.net.PlainSocketImpl.socketAccept(Native Method) > > java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409) > > java.net.ServerSocket.implAccept(ServerSocket.java:545) > > java.net.ServerSocket.accept(ServerSocket.java:513) > > org.spark-project.jetty.server.bio.SocketConnector.accept(SocketConnector.java:117) > > org.spark-project.jetty.server.AbstractConnector$Acceptor.run(AbstractConnector.java:938) > > org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 36: qtp15892131-36 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > 
java.lang.Thread.run(Thread.java:745) > > Thread 37: qtp15892131-37 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 38: qtp15892131-38 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 39: qtp15892131-39 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 40: qtp15892131-40 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 41: qtp15892131-41 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 42: qtp15892131-42 (TIMED_WAITING) > > 
sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 43: SPARK_CONTEXT cleanup timer (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.util.TimerThread.mainLoop(Timer.java:526) > > java.util.TimerThread.run(Timer.java:505) > > Thread 45: qtp2061226112-45 Selector0 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.spark-project.jetty.io.nio.SelectorManager$SelectSet.doSelect(SelectorManager.java:569) > > org.spark-project.jetty.io.nio.SelectorManager$1.run(SelectorManager.java:290) > > org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 46: qtp2061226112-46 Selector1 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > org.spark-project.jetty.io.nio.SelectorManager$SelectSet.doSelect(SelectorManager.java:569) > > org.spark-project.jetty.io.nio.SelectorManager$1.run(SelectorManager.java:290) > > org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 47: qtp2061226112-47 Acceptor0 [email protected]:4040 > (BLOCKED) > > sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:234) > > org.spark-project.jetty.server.nio.SelectChannelConnector.accept(SelectChannelConnector.java:109) > > org.spark-project.jetty.server.AbstractConnector$Acceptor.run(AbstractConnector.java:938) > > org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 48: qtp2061226112-48 Acceptor1 [email protected]:4040 > (RUNNABLE) > > sun.nio.ch.ServerSocketChannelImpl.accept0(Native Method) > > sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:422) > > sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:250) > > org.spark-project.jetty.server.nio.SelectChannelConnector.accept(SelectChannelConnector.java:109) > > org.spark-project.jetty.server.AbstractConnector$Acceptor.run(AbstractConnector.java:938) > > 
org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 53: heartbeat-receiver-event-loop-thread (BLOCKED) > > org.apache.spark.scheduler.TaskSchedulerImpl.executorHeartbeatReceived(TaskSchedulerImpl.scala:360) > > org.apache.spark.HeartbeatReceiver$$anonfun$receiveAndReply$1$$anon$2$$anonfun$run$2.apply$mcV$sp(HeartbeatReceiver.scala:107) > > org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1264) > > org.apache.spark.HeartbeatReceiver$$anonfun$receiveAndReply$1$$anon$2.run(HeartbeatReceiver.scala:106) > > java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > > java.util.concurrent.FutureTask.run(FutureTask.java:266) > > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) > > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 54: Timer-0 (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > java.util.TimerThread.mainLoop(Timer.java:526) > > java.util.TimerThread.run(Timer.java:505) > > Thread 55: dag-scheduler-event-loop (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.LinkedBlockingDeque.takeFirst(LinkedBlockingDeque.java:492) > > java.util.concurrent.LinkedBlockingDeque.take(LinkedBlockingDeque.java:680) > > org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:46) > > Thread 56: driver-heartbeater (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedNanos(AbstractQueuedSynchronizer.java:1037) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1328) > > scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208) > > scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218) > > scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223) > > scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107) > > scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53) > > scala.concurrent.Await$.result(package.scala:107) > > org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102) > > org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78) > > org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:444) > > org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:464) > > org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:464) > > org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:464) > > org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772) > > org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:464) > > 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > > java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) > > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) > > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 65: shuffle-server-0 (RUNNABLE) > > sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) > > sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) > > sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79) > > sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) > > sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) > > io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622) > > io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310) > > io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) > > java.lang.Thread.run(Thread.java:745) > > Thread 74: Spark Context Cleaner (TIMED_WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143) > > org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:157) > > org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215) > > org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:154) > > org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:67) > > Thread 75: RecurringTimer - JobGenerator (TIMED_WAITING) > > java.lang.Thread.sleep(Native Method) > > org.apache.spark.util.SystemClock.waitTillTime(Clock.scala:63) > > org.apache.spark.streaming.util.RecurringTimer.org$apache$spark$streaming$util$RecurringTimer$$loop(RecurringTimer.scala:96) > > org.apache.spark.streaming.util.RecurringTimer$$anon$1.run(RecurringTimer.scala:29) > > Thread 76: StreamingListenerBus (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997) > > java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) > > java.util.concurrent.Semaphore.acquire(Semaphore.java:312) > > org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:65) > > org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215) > > org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63) > > Thread 78: metrics-meter-tick-thread-1 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1088) > > java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) > > 
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 79: JobScheduler (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.LinkedBlockingDeque.takeFirst(LinkedBlockingDeque.java:492) > > java.util.concurrent.LinkedBlockingDeque.take(LinkedBlockingDeque.java:680) > > org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:46) > > Thread 81: JobGenerator (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.LinkedBlockingDeque.takeFirst(LinkedBlockingDeque.java:492) > > java.util.concurrent.LinkedBlockingDeque.take(LinkedBlockingDeque.java:680) > > org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:46) > > Thread 82: ForkJoinPool-3-worker-15 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 83: metrics-meter-tick-thread-2 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) > > java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 84: pool-12-thread-1 (WAITING) > > java.lang.Object.wait(Native Method) > > java.lang.Object.wait(Object.java:502) > > org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73) > > org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:530) > > org.apache.spark.SparkContext.runJob(SparkContext.scala:1734) > > org.apache.spark.SparkContext.runJob(SparkContext.scala:1752) > > org.apache.spark.SparkContext.runJob(SparkContext.scala:1774) > > com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:37) > > org.apache.spark.sql.cassandra.CassandraSourceRelation.insert(CassandraSourceRelation.scala:77) > > org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:87) > > org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:309) > > org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:144) > > airwaveApList$$anonfun$main$1.apply(airwaveApList.scala:131) > > airwaveApList$$anonfun$main$1.apply(airwaveApList.scala:79) > > 
org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:631) > > org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:631) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:42) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) > > org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:40) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) > > org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) > > scala.util.Try$.apply(Try.scala:161) > > org.apache.spark.streaming.scheduler.Job.run(Job.scala:34) > > org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:193) > > org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) > > org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) > > scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) > > org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:192) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 90: task-result-getter-0 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 91: task-result-getter-1 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039) > > java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 92: pool-13-thread-1 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) > > 
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 159: task-result-getter-2 (BLOCKED) > > org.apache.spark.scheduler.TaskSetManager.canFetchMoreResults(TaskSetManager.scala:600) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply$mcV$sp(TaskResultGetter.scala:54) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:51) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:51) > > org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2.run(TaskResultGetter.scala:50) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 161: task-result-getter-3 (BLOCKED) > > org.apache.spark.scheduler.TaskSchedulerImpl.handleSuccessfulTask(TaskSchedulerImpl.scala:378) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply$mcV$sp(TaskResultGetter.scala:86) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:51) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:51) > > org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772) > > org.apache.spark.scheduler.TaskResultGetter$$anon$2.run(TaskResultGetter.scala:50) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 21083: qtp2061226112-21083 (RUNNABLE) > > sun.management.ThreadImpl.dumpThreads0(Native Method) > > sun.management.ThreadImpl.dumpAllThreads(ThreadImpl.java:446) > > org.apache.spark.util.Utils$.getThreadDump(Utils.scala:1931) > > org.apache.spark.SparkContext.getExecutorThreadDump(SparkContext.scala:583) > > org.apache.spark.ui.exec.ExecutorThreadDumpPage.render(ExecutorThreadDumpPage.scala:49) > > org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79) > > org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79) > > org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:69) > > javax.servlet.http.HttpServlet.service(HttpServlet.java:735) > > javax.servlet.http.HttpServlet.service(HttpServlet.java:848) > > org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684) > > org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501) > > org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086) > > org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428) > > org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020) > > org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135) > > org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255) > > org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116) > > 
org.spark-project.jetty.server.Server.handle(Server.java:370) > > org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494) > > org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971) > > org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033) > > org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644) > > org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235) > > org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82) > > org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667) > > org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52) > > org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543) > > java.lang.Thread.run(Thread.java:745) > > Thread 50977: sparkDriver-akka.actor.default-dispatcher-18 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 64900: qtp2061226112-64900 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 87430: qtp2061226112-87430 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 91565: qtp2061226112-91565 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) > > org.spark-project.jetty.util.BlockingArrayQueue.poll(BlockingArrayQueue.java:342) > > org.spark-project.jetty.util.thread.QueuedThreadPool.idleJobPoll(QueuedThreadPool.java:526) > > org.spark-project.jetty.util.thread.QueuedThreadPool.access$600(QueuedThreadPool.java:44) > > org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572) > > java.lang.Thread.run(Thread.java:745) > > Thread 100158: Executor task launch worker-2058 
(TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100293: block-manager-ask-thread-pool-1093 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100412: Executor task launch worker-2059 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100553: ForkJoinPool-4-worker-17 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 100554: ForkJoinPool-4-worker-31 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.idleAwaitWork(ForkJoinPool.java:2135) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2067) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 100558: ForkJoinPool-4-worker-23 (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 100578: block-manager-slave-async-thread-pool-4126 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > 
java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100580: block-manager-slave-async-thread-pool-4128 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100584: block-manager-slave-async-thread-pool-4132 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100588: block-manager-ask-thread-pool-1097 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100597: Executor task launch worker-2062 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100599: Executor task launch worker-2063 (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > 
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) > > java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460) > > java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362) > > java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941) > > java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066) > > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) > > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) > > java.lang.Thread.run(Thread.java:745) > > Thread 100707: sparkDriver-akka.remote.default-remote-dispatcher-1132 > (WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2075) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > Thread 100708: sparkDriver-akka.remote.default-remote-dispatcher-1133 > (TIMED_WAITING) > > sun.misc.Unsafe.park(Native Method) > > scala.concurrent.forkjoin.ForkJoinPool.idleAwaitWork(ForkJoinPool.java:2135) > > scala.concurrent.forkjoin.ForkJoinPool.scan(ForkJoinPool.java:2067) > > scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) > > scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) > > >
