I've attached the full log. The error looks like this:

15/09/23 17:47:39 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909
java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909
        at scala.Predef$.require(Predef.scala:233)
        at org.apache.spark.rdd.ReliableCheckpointRDD.<init>(ReliableCheckpointRDD.scala:45)
        at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1227)
        at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1227)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:709)
        at org.apache.spark.SparkContext.checkpointFile(SparkContext.scala:1226)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:112)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:109)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData.restore(DStreamCheckpointData.scala:109)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:487)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:153)
        at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:153)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.streaming.DStreamGraph.restoreCheckpointData(DStreamGraph.scala:153)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:158)
        at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:837)
        at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:837)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:837)
        at com.appadhoc.data.main.StatCounter$.main(StatCounter.scala:51)
        at com.appadhoc.data.main.StatCounter.main(StatCounter.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
15/09/23 17:47:39 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909)
15/09/23 17:47:39 INFO spark.SparkContext: Invoking stop() from shutdown hook
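
The trace bottoms out in StreamingContext.getOrCreate at StatCounter.scala:51. For reference, we use the standard checkpoint-recovery pattern, roughly like this (a simplified sketch, not the actual StatCounter code; the 5-minute batch interval is inferred from the checkpoint file timestamps):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StatCounterSketch {
      val checkpointDir = "hdfs://szq2.appadhoc.com:8020/user/root/checkpoint"

      // Builds a fresh context; only invoked when no usable checkpoint exists.
      def createContext(): StreamingContext = {
        val conf = new SparkConf().setAppName("StatCounter")
        val ssc = new StreamingContext(conf, Seconds(300))
        ssc.checkpoint(checkpointDir) // enable checkpointing to HDFS
        // ... build the Kafka input stream, updateStateByKey, output actions ...
        ssc
      }

      def main(args: Array[String]): Unit = {
        // On restart this loads the newest checkpoint-<time> file and rebuilds
        // the DStream graph; restoring that graph is where the missing
        // rdd-26909 directory blows up.
        val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
        ssc.start()
        ssc.awaitTermination()
      }
    }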


Tathagata Das <tathagata.das1...@gmail.com> wrote on Thu, Sep 24, 2015, at 9:45 AM:

> Could you provide the logs showing when and how you are seeing this error?
>
> On Wed, Sep 23, 2015 at 6:32 PM, Bin Wang <wbi...@gmail.com> wrote:
>
>> BTW, I just killed the application and restarted it. The application then
>> could not recover from the checkpoint because some RDDs were lost. So I'm
>> wondering: if there is a failure in the application, is it possible that
>> it won't be able to recover from the checkpoint?
>>
>> Bin Wang <wbi...@gmail.com> wrote on Wed, Sep 23, 2015, at 6:58 PM:
>>
>>> I found that the checkpoint directory structure looks like this:
>>>
>>> -rw-r--r--   1 root root     134820 2015-09-23 16:55 /user/root/checkpoint/checkpoint-1442998500000
>>> -rw-r--r--   1 root root     134768 2015-09-23 17:00 /user/root/checkpoint/checkpoint-1442998800000
>>> -rw-r--r--   1 root root     134895 2015-09-23 17:05 /user/root/checkpoint/checkpoint-1442999100000
>>> -rw-r--r--   1 root root     134899 2015-09-23 17:10 /user/root/checkpoint/checkpoint-1442999400000
>>> -rw-r--r--   1 root root     134913 2015-09-23 17:15 /user/root/checkpoint/checkpoint-1442999700000
>>> -rw-r--r--   1 root root     134928 2015-09-23 17:20 /user/root/checkpoint/checkpoint-1443000000000
>>> -rw-r--r--   1 root root     134987 2015-09-23 17:25 /user/root/checkpoint/checkpoint-1443000300000
>>> -rw-r--r--   1 root root     134944 2015-09-23 17:30 /user/root/checkpoint/checkpoint-1443000600000
>>> -rw-r--r--   1 root root     134956 2015-09-23 17:35 /user/root/checkpoint/checkpoint-1443000900000
>>> -rw-r--r--   1 root root     135244 2015-09-23 17:40 /user/root/checkpoint/checkpoint-1443001200000
>>> drwxr-xr-x   - root root          0 2015-09-23 18:48 /user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2
>>> drwxr-xr-x   - root root          0 2015-09-23 17:44 /user/root/checkpoint/receivedBlockMetadata
>>>
>>>
>>> After I restarted Spark, it read from
>>> /user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2, but it seems
>>> that some RDDs in that directory were lost, so it was not able to recover.
>>> Meanwhile, I see other entries under checkpoint/, such as
>>> /user/root/checkpoint/checkpoint-1443001200000. What are those used for?
>>> Can I recover my data from them?
>>>
>>
>
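
Regarding my own question above, if I understand the layout correctly: the checkpoint-<timestamp> files are the streaming metadata checkpoints written once per checkpoint interval, while the d3714249-... directory holds the checkpointed RDD data that those metadata files point to. Recovery needs both, which is why the missing rdd-26909 directory kills getOrCreate. A minimal diagnostic sketch to confirm the missing piece (paths copied from the log below; this is only a check, not a fix):

    import java.net.URI
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object CheckRddCheckpoint {
      def main(args: Array[String]): Unit = {
        // Connect to the same namenode the checkpoint metadata references.
        val fs = FileSystem.get(new URI("hdfs://szq2.appadhoc.com:8020"), new Configuration())
        // The RDD checkpoint directory named in the IllegalArgumentException.
        val rddDir = new Path("/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909")
        if (fs.exists(rddDir))
          println(s"$rddDir exists; recovery should get past this point")
        else
          println(s"$rddDir is gone, so the metadata checkpoint cannot be fully restored")
      }
    }
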
Log Type: stderr
Log Upload Time: Wed Sep 23 17:47:51 +0800 2015
Log Length: 55303
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/yarn/nm/usercache/root/filecache/6753/spark-assembly-1.5.1-SNAPSHOT-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/09/23 17:47:28 INFO yarn.ApplicationMaster: Registered signal handlers for 
[TERM, HUP, INT]
15/09/23 17:47:31 INFO yarn.ApplicationMaster: ApplicationAttemptId: 
appattempt_1440495451668_0297_000001
15/09/23 17:47:31 INFO spark.SecurityManager: Changing view acls to: yarn,root
15/09/23 17:47:31 INFO spark.SecurityManager: Changing modify acls to: yarn,root
15/09/23 17:47:31 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(yarn, root); users 
with modify permissions: Set(yarn, root)
15/09/23 17:47:32 INFO yarn.ApplicationMaster: Starting the user application in 
a separate Thread
15/09/23 17:47:32 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization
15/09/23 17:47:32 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization ... 
15/09/23 17:47:32 INFO streaming.CheckpointReader: Checkpoint files found: 
hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443001200000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443000900000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443000600000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443000300000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443000000000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1442999700000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1442999400000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1442999100000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1442998800000,hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1442998500000
15/09/23 17:47:32 INFO streaming.CheckpointReader: Attempting to load 
checkpoint from file 
hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443001200000
15/09/23 17:47:33 INFO streaming.Checkpoint: Checkpoint for time 1443001200000 
ms validated
15/09/23 17:47:33 INFO streaming.CheckpointReader: Checkpoint successfully 
loaded from file 
hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/checkpoint-1443001200000
15/09/23 17:47:33 INFO streaming.CheckpointReader: Checkpoint was generated at 
time 1443001200000 ms
15/09/23 17:47:33 INFO spark.SparkContext: Running Spark version 1.5.0
15/09/23 17:47:33 INFO spark.SecurityManager: Changing view acls to: yarn,root
15/09/23 17:47:33 INFO spark.SecurityManager: Changing modify acls to: yarn,root
15/09/23 17:47:33 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(yarn, root); users 
with modify permissions: Set(yarn, root)
15/09/23 17:47:33 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/23 17:47:33 INFO Remoting: Starting remoting
15/09/23 17:47:33 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkDriver@192.168.1.1:60297]
15/09/23 17:47:33 INFO util.Utils: Successfully started service 'sparkDriver' 
on port 60297.
15/09/23 17:47:33 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/23 17:47:33 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/23 17:47:33 INFO storage.DiskBlockManager: Created local directory at 
/yarn/nm/usercache/root/appcache/application_1440495451668_0297/blockmgr-879fbea0-e88f-4da9-87d5-5901fb589848
15/09/23 17:47:33 INFO storage.MemoryStore: MemoryStore started with capacity 
520.8 MB
15/09/23 17:47:33 INFO spark.HttpFileServer: HTTP File server directory is 
/yarn/nm/usercache/root/appcache/application_1440495451668_0297/spark-26c88c3e-55d5-42db-8672-5cb418213119/httpd-2acfcba6-6d7b-4c03-ba56-972b48371de3
15/09/23 17:47:33 INFO spark.HttpServer: Starting HTTP Server
15/09/23 17:47:33 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/23 17:47:33 INFO server.AbstractConnector: Started 
SocketConnector@0.0.0.0:11387
15/09/23 17:47:33 INFO util.Utils: Successfully started service 'HTTP file 
server' on port 11387.
15/09/23 17:47:34 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/23 17:47:34 INFO ui.JettyUtils: Adding filter: 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/09/23 17:47:34 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/23 17:47:34 INFO server.AbstractConnector: Started 
SelectChannelConnector@0.0.0.0:35721
15/09/23 17:47:34 INFO util.Utils: Successfully started service 'SparkUI' on 
port 35721.
15/09/23 17:47:34 INFO ui.SparkUI: Started SparkUI at http://192.168.1.1:35721
15/09/23 17:47:34 INFO cluster.YarnClusterScheduler: Created 
YarnClusterScheduler
15/09/23 17:47:34 INFO util.Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 33428.
15/09/23 17:47:34 INFO netty.NettyBlockTransferService: Server created on 33428
15/09/23 17:47:34 INFO storage.BlockManagerMaster: Trying to register 
BlockManager
15/09/23 17:47:34 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager 192.168.1.1:33428 with 520.8 MB RAM, BlockManagerId(driver, 
192.168.1.1, 33428)
15/09/23 17:47:34 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/23 17:47:34 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: 
ApplicationMaster registered as 
AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/YarnAM#-1356902315])
15/09/23 17:47:34 INFO client.RMProxy: Connecting to ResourceManager at 
szq2.appadhoc.com/192.168.1.2:8030
15/09/23 17:47:34 INFO yarn.YarnRMClient: Registering the ApplicationMaster
15/09/23 17:47:34 INFO yarn.YarnAllocator: Will request 1 executor containers, 
each with 4 cores and 33792 MB memory including 3072 MB overhead
15/09/23 17:47:34 INFO yarn.YarnAllocator: Container request (host: Any, 
capability: <memory:33792, vCores:4>)
15/09/23 17:47:34 INFO yarn.ApplicationMaster: Started progress reporter thread 
with (heartbeat : 3000, initial allocation : 200) intervals
15/09/23 17:47:35 INFO impl.AMRMClientImpl: Received new token for : 
szq1.appadhoc.com:8041
15/09/23 17:47:36 INFO yarn.YarnAllocator: Launching container 
container_1440495451668_0297_01_000002 for on host szq1.appadhoc.com
15/09/23 17:47:36 INFO yarn.YarnAllocator: Launching ExecutorRunnable. 
driverUrl: 
akka.tcp://sparkDriver@192.168.1.1:60297/user/CoarseGrainedScheduler,  
executorHostname: szq1.appadhoc.com
15/09/23 17:47:36 INFO yarn.YarnAllocator: Received 1 containers from YARN, 
launching executors on 1 of them.
15/09/23 17:47:36 INFO yarn.ExecutorRunnable: Starting Executor Container
15/09/23 17:47:36 INFO impl.ContainerManagementProtocolProxy: 
yarn.client.max-cached-nodemanagers-proxies : 0
15/09/23 17:47:36 INFO yarn.ExecutorRunnable: Setting up ContainerLaunchContext
15/09/23 17:47:36 INFO yarn.ExecutorRunnable: Preparing Local resources
15/09/23 17:47:36 INFO yarn.ExecutorRunnable: Prepared Local resources 
Map(__app__.jar -> resource { scheme: "hdfs" host: "szq2.appadhoc.com" port: 
8020 file: 
"/user/root/.sparkStaging/application_1440495451668_0297/adhoc-data-assembly-1.0.jar"
 } size: 87320747 timestamp: 1443001639377 type: FILE visibility: PRIVATE, 
__spark__.jar -> resource { scheme: "hdfs" host: "szq2.appadhoc.com" port: 8020 
file: 
"/user/root/.sparkStaging/application_1440495451668_0297/spark-assembly-1.5.1-SNAPSHOT-hadoop2.6.0.jar"
 } size: 142738268 timestamp: 1443001631742 type: FILE visibility: PRIVATE)
15/09/23 17:47:36 INFO yarn.ExecutorRunnable: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CLIENT_CONF_DIR<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/*<CPS>$HADOOP_COMMON_HOME/lib/*<CPS>$HADOOP_HDFS_HOME/*<CPS>$HADOOP_HDFS_HOME/lib/*<CPS>$HADOOP_YARN_HOME/*<CPS>$HADOOP_YARN_HOME/lib/*<CPS>$HADOOP_MAPRED_HOME/*<CPS>$HADOOP_MAPRED_HOME/lib/*<CPS>$MR2_CLASSPATH
    SPARK_LOG_URL_STDERR -> 
http://szq1.appadhoc.com:8042/node/containerlogs/container_1440495451668_0297_01_000002/root/stderr?start=-4096
    SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1440495451668_0297
    SPARK_YARN_CACHE_FILES_FILE_SIZES -> 142738268,87320747
    SPARK_USER -> root
    SPARK_YARN_CACHE_FILES_VISIBILITIES -> PRIVATE,PRIVATE
    SPARK_YARN_MODE -> true
    SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1443001631742,1443001639377
    SPARK_LOG_URL_STDOUT -> 
http://szq1.appadhoc.com:8042/node/containerlogs/container_1440495451668_0297_01_000002/root/stdout?start=-4096
    SPARK_YARN_CACHE_FILES -> 
hdfs://szq2.appadhoc.com:8020/user/root/.sparkStaging/application_1440495451668_0297/spark-assembly-1.5.1-SNAPSHOT-hadoop2.6.0.jar#__spark__.jar,hdfs://szq2.appadhoc.com:8020/user/root/.sparkStaging/application_1440495451668_0297/adhoc-data-assembly-1.0.jar#__app__.jar

  command:
    {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms30720m 
-Xmx30720m '-Dconfig.resource=prod.conf' -Djava.io.tmpdir={{PWD}}/tmp 
'-Dspark.driver.port=60297' '-Dspark.ui.port=0' 
-Dspark.yarn.app.container.log.dir=<LOG_DIR> 
org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url 
akka.tcp://sparkDriver@192.168.1.1:60297/user/CoarseGrainedScheduler 
--executor-id 1 --hostname szq1.appadhoc.com --cores 4 --app-id 
application_1440495451668_0297 --user-class-path file:$PWD/__app__.jar 1> 
<LOG_DIR>/stdout 2> <LOG_DIR>/stderr
===============================================================================
      
15/09/23 17:47:36 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 
szq1.appadhoc.com:8041
15/09/23 17:47:38 INFO yarn.ApplicationMaster$AMEndpoint: Driver terminated or 
disconnected! Shutting down. szq1.appadhoc.com:61805
15/09/23 17:47:39 INFO cluster.YarnClusterSchedulerBackend: Registered 
executor: 
AkkaRpcEndpointRef(Actor[akka.tcp://sparkexecu...@szq1.appadhoc.com:49069/user/Executor#1058701258])
 with ID 1
15/09/23 17:47:39 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is 
ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
15/09/23 17:47:39 INFO cluster.YarnClusterScheduler: 
YarnClusterScheduler.postStartHook done
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@3fa6b67b
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@421046a8
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@3a8ee17a
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@7a752e9e
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2347c5e1
15/09/23 17:47:39 INFO dstream.FilteredDStream: Set context for 
org.apache.spark.streaming.dstream.FilteredDStream@3d1048df
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@6fc3286f
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@7593c8bb
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@395fa94b
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@3763985a
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6ba7591e
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@37cbf7c9
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2ff8c9bd
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@7684afe5
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@138202d
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@bc86afb
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@1f485a2a
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@47178b71
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@5ac3cc2d
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6c91cce8
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@bc86afb
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@5823cb07
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@15e665c6
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4923b565
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@475a1f66
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6dc38622
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@3f9b5a7a
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@70512035
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@5e19b12e
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@68bf7345
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@475a1f66
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6dc38622
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@7128ecbc
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@35ce64ae
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@133f71b
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@7aa40e3a
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@28a240bd
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@34484d22
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@76c00dfe
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@3ad6db9a
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6feb79a3
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@4c7ef5bc
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@233730e0
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@7c69d42f
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@76c00dfe
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@7e99979
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6feb79a3
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@4c7ef5bc
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@233730e0
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@7c69d42f
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@76c00dfe
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@795adafd
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@3bdf5649
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@4399f154
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2fb2e478
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@9b59abf
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@71c27693
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@15f6e73e
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@4c7a2b6
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4e8b5d2d
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1f925da4
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4241ca2d
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@6d407256
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@15f6e73e
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@2212a423
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4e8b5d2d
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1f925da4
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4241ca2d
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@6d407256
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@15f6e73e
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@75d6074e
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@3679ce52
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@76d8856e
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@12035b7
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@69a681f4
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@329df05c
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@1f0f41fa
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@22b49404
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@7d61a468
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1570e827
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@27996370
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@474f625f
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@1f0f41fa
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@2cc36f8c
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@52d06749
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@702d1418
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@1f826d31
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@2bcebc20
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@11a0ef58
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@5cc18372
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@249ce0f1
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@377400fe
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@20ed4023
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@380dcab8
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@117812c8
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@5cc18372
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@6219116d
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@5bc73d6b
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@75916adf
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@1dff92c6
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@75fd148c
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@3a502fd8
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@66752bd0
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@168f993d
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2d957f5d
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@31dcca0c
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@1654f367
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@2337faa5
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@66752bd0
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@302878b3
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@310c4878
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1d69d770
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@fbf0926
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@46f1a5dd
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@22c0748e
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@9242875
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@24dc1e03
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6f3669dc
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@5d60238e
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@52be6f86
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@548bf0da
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@9242875
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@2f71a393
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4daa203a
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@5c29e7ab
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4328bad2
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@27015370
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@480cf211
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@49e85f0b
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@36b7f115
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6634b5c3
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@e8949a1
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@5740f07b
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@686c47fd
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@49e85f0b
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@1cea689a
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@6179654a
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@68e90f44
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@1ce96f2b
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@21817ee1
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@3cb0f6f6
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@625af9b
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@142bec44
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4fa46eb4
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@34601445
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@fd2d8e0
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@5b9c623d
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@625af9b
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@77f5c1c7
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@27852d58
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@186400ad
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4d686a5d
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@21f275f
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@4b51cf0f
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@67f1086a
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@15f9c76
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2c1907ae
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1f474720
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@8e95eb
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@11147796
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@67f1086a
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO dstream.ForEachDStream: Set context for 
org.apache.spark.streaming.dstream.ForEachDStream@56b7219c
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@2c1907ae
15/09/23 17:47:39 INFO dstream.ShuffledDStream: Set context for 
org.apache.spark.streaming.dstream.ShuffledDStream@1f474720
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@8e95eb
15/09/23 17:47:39 INFO dstream.StateDStream: Set context for 
org.apache.spark.streaming.dstream.StateDStream@11147796
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@67f1086a
15/09/23 17:47:39 INFO dstream.FlatMappedDStream: Set context for 
org.apache.spark.streaming.dstream.FlatMappedDStream@12e78ac8
15/09/23 17:47:39 INFO dstream.MapPartitionedDStream: Set context for 
org.apache.spark.streaming.dstream.MapPartitionedDStream@782047ca
15/09/23 17:47:39 INFO dstream.MappedDStream: Set context for 
org.apache.spark.streaming.dstream.MappedDStream@f495c63
15/09/23 17:47:39 INFO kafka.KafkaInputDStream: Set context for 
org.apache.spark.streaming.kafka.KafkaInputDStream@19506f6c
15/09/23 17:47:39 INFO streaming.DStreamGraph: Restoring checkpoint data
15/09/23 17:47:39 INFO dstream.ForEachDStream: Restoring checkpoint data
15/09/23 17:47:39 INFO dstream.MappedDStream: Restoring checkpoint data
15/09/23 17:47:39 INFO dstream.MappedDStream: Restoring checkpoint data
15/09/23 17:47:39 INFO dstream.StateDStream: Restoring checkpoint data
15/09/23 17:47:39 INFO dstream.DStreamCheckpointData: Restoring checkpointed 
RDD for time 1442961600000 ms from file 
'hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-27094'
15/09/23 17:47:39 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager szq1.appadhoc.com:18673 with 15.5 GB RAM, BlockManagerId(1, 
szq1.appadhoc.com, 18673)
15/09/23 17:47:39 INFO storage.MemoryStore: ensureFreeSpace(245896) called with 
curMem=0, maxMem=546129838
15/09/23 17:47:39 INFO storage.MemoryStore: Block broadcast_0 stored as values 
in memory (estimated size 240.1 KB, free 520.6 MB)
15/09/23 17:47:39 INFO storage.MemoryStore: ensureFreeSpace(21564) called with 
curMem=245896, maxMem=546129838
15/09/23 17:47:39 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as 
bytes in memory (estimated size 21.1 KB, free 520.6 MB)
15/09/23 17:47:39 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in 
memory on 192.168.1.1:33428 (size: 21.1 KB, free: 520.8 MB)
15/09/23 17:47:39 INFO spark.SparkContext: Created broadcast 0 from 
checkpointFile at DStreamCheckpointData.scala:112
15/09/23 17:47:39 INFO dstream.DStreamCheckpointData: Restoring checkpointed 
RDD for time 1442961300000 ms from file 
'hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909'
15/09/23 17:47:39 INFO storage.MemoryStore: ensureFreeSpace(245936) called with 
curMem=267460, maxMem=546129838
15/09/23 17:47:39 INFO storage.MemoryStore: Block broadcast_1 stored as values 
in memory (estimated size 240.2 KB, free 520.3 MB)
15/09/23 17:47:39 INFO storage.MemoryStore: ensureFreeSpace(21564) called with 
curMem=513396, maxMem=546129838
15/09/23 17:47:39 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as 
bytes in memory (estimated size 21.1 KB, free 520.3 MB)
15/09/23 17:47:39 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in 
memory on 192.168.1.1:33428 (size: 21.1 KB, free: 520.8 MB)
15/09/23 17:47:39 INFO spark.SparkContext: Created broadcast 1 from 
checkpointFile at DStreamCheckpointData.scala:112
15/09/23 17:47:39 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909
java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909
        at scala.Predef$.require(Predef.scala:233)
        at org.apache.spark.rdd.ReliableCheckpointRDD.<init>(ReliableCheckpointRDD.scala:45)
        at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1227)
        at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1227)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:709)
        at org.apache.spark.SparkContext.checkpointFile(SparkContext.scala:1226)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:112)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:109)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at org.apache.spark.streaming.dstream.DStreamCheckpointData.restore(DStreamCheckpointData.scala:109)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:487)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:488)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:488)
        at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:153)
        at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:153)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.streaming.DStreamGraph.restoreCheckpointData(DStreamGraph.scala:153)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:158)
        at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:837)
        at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:837)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:837)
        at com.appadhoc.data.main.StatCounter$.main(StatCounter.scala:51)
        at com.appadhoc.data.main.StatCounter.main(StatCounter.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
15/09/23 17:47:39 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: hdfs://szq2.appadhoc.com:8020/user/root/checkpoint/d3714249-e03a-45c7-a0d5-1dc870b7d9f2/rdd-26909)
15/09/23 17:47:39 INFO spark.SparkContext: Invoking stop() from shutdown hook
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/metrics/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/api,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/static,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/environment/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/environment,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/pool,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/job,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/json,null}
15/09/23 17:47:39 INFO handler.ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs,null}
15/09/23 17:47:39 INFO ui.SparkUI: Stopped Spark web UI at 
http://192.168.1.1:35721
15/09/23 17:47:39 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/23 17:47:39 INFO cluster.YarnClusterSchedulerBackend: Shutting down all 
executors
15/09/23 17:47:39 INFO cluster.YarnClusterSchedulerBackend: Asking each 
executor to shut down
15/09/23 17:47:39 INFO yarn.ApplicationMaster$AMEndpoint: Driver terminated or 
disconnected! Shutting down. szq1.appadhoc.com:49069
15/09/23 17:47:39 INFO spark.MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
15/09/23 17:47:39 INFO storage.MemoryStore: MemoryStore cleared
15/09/23 17:47:39 INFO storage.BlockManager: BlockManager stopped
15/09/23 17:47:39 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/23 17:47:39 INFO 
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
15/09/23 17:47:39 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/23 17:47:39 INFO util.ShutdownHookManager: Shutdown hook called
15/09/23 17:47:39 INFO util.ShutdownHookManager: Deleting directory 
/yarn/nm/usercache/root/appcache/application_1440495451668_0297/spark-26c88c3e-55d5-42db-8672-5cb418213119
15/09/23 17:47:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: 
Shutting down remote daemon.