[ https://issues.apache.org/jira/browse/SPARK-4133?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14196455#comment-14196455 ]

Josh Rosen commented on SPARK-4133:
-----------------------------------

Spark doesn't support multiple active SparkContexts in the same JVM, although 
this isn't well documented and there's no error checking for it (PySpark does 
have checks for this, though).  That isn't to say we can't / won't eventually 
support multiple contexts per JVM (see SPARK-2243), but it could be somewhat 
difficult in the very short term because there are several baked-in 
assumptions that we'd have to address (the (effectively) global SparkEnv, for 
example).  In both this issue and the one reported in a comment on 
SPARK-4080, I think the symptoms are caused by the two SparkContexts' 
block / broadcast managers becoming mixed up, so that executors belonging to one 
SparkContext end up fetching blocks added by the other SparkContext (or 
deleting each other's blocks; see SPARK-3148).
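To make that failure mode concrete, here is a minimal, purely illustrative sketch (plain Java, no Spark classes; all names are hypothetical) of what a JVM-wide block registry keyed only by block id does when two contexts register the same broadcast block:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: a single JVM-wide block registry keyed by block id
// alone (no context id), mimicking the effectively-global state described
// above. Two contexts both register "broadcast_0" and the second silently
// clobbers the first, so executors of one context can end up fetching
// (or deleting) the other context's bytes.
public class GlobalBlockRegistry {
    static final Map<String, String> blocks = new HashMap<>();

    static void register(String blockId, String payload) {
        blocks.put(blockId, payload);   // no per-context namespace
    }

    public static void main(String[] args) {
        register("broadcast_0", "bytes from context A");
        register("broadcast_0", "bytes from context B"); // collision
        System.out.println(blocks.get("broadcast_0"));   // context A's data is gone
    }
}
```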

SPARK-4180 will add proper error detection for when multiple contexts have 
been created, and will help debug where this is happening (e.g. by printing 
the callsite that created the currently active context).
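The eventual SPARK-4180 implementation may look quite different; as a rough sketch of the idea (plain Java, hypothetical class and method names), a guard can record the callsite of the first creation and surface it when a second creation is attempted:

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of callsite-tracking: remember where the active
// context was created and report that location if a second creation is
// attempted. Names are illustrative, not Spark's actual API.
public class ContextGuard {
    private static final AtomicReference<Exception> activeCallsite =
            new AtomicReference<>();

    // Returns true if this caller became the active context owner.
    public static boolean tryActivate() {
        Exception here = new Exception("SparkContext created here");
        return activeCallsite.compareAndSet(null, here);
    }

    // Message that would be logged to help debug the duplicate creation.
    public static String duplicateMessage() {
        Exception prev = activeCallsite.get();
        return prev == null
                ? "no active context"
                : "Another SparkContext is already active; created at: "
                  + prev.getStackTrace()[0];
    }

    public static void main(String[] args) {
        System.out.println(tryActivate());
        System.out.println(tryActivate());
        System.out.println(duplicateMessage());
    }
}
```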

> PARSING_ERROR(2) when upgrading issues from 1.0.2 to 1.1.0
> ----------------------------------------------------------
>
>                 Key: SPARK-4133
>                 URL: https://issues.apache.org/jira/browse/SPARK-4133
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.1.0
>            Reporter: Antonio Jesus Navarro
>            Priority: Blocker
>         Attachments: spark_ex.logs
>
>
> Snappy-related problems appeared when trying to upgrade an existing Spark 
> Streaming app from 1.0.2 to 1.1.0.
> An existing 1.0.2 Spark app no longer runs once upgraded to 1.1.0:
> > An IOException is thrown by snappy (PARSING_ERROR(2))
> {code}
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
> broadcast_0 is StorageLevel(true, true, false, true, 1)
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
> broadcast_0 from memory
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> sparkDriver-akka.actor.default-dispatcher-4 INFO  
> receiver.ReceiverSupervisorImpl - Registered receiver 0
> Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
> BlockGenerator at time 1414656492400
> Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
> BlockGenerator
> Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
> Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - 
> Starting receiver
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
> Registered receiver for stream 0 from akka://sparkDriver
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
> Consumer Stream with group: stratioStreaming
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
> Zookeeper: node.stratio.com:2181
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (8.442354 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (8.412421 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
> handled message (8.385471 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Verifying 
> properties
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> group.id is overridden to stratioStreaming
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connect is overridden to node.stratio.com:2181
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connection.timeout.ms is overridden to 10000
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 0.033998997 s
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], Connecting to 
> zookeeper instance at node.stratio.com:2181
> Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
> ZookKeeper instance to connect to node.stratio.com:2181.
> ZkClient-EventThread-169-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
> Starting ZkClient event thread.
> Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
> connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
> watcher=org.I0Itec.zkclient.ZkClient@5b4bdc81
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection 
> to Zookeeper server
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
> state SyncConnected
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Opening socket connection to server 
> node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using 
> SASL (unknown error)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Socket connection established to 
> node.stratio.com/172.19.0.96:2181, initiating session
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Session establishment request sent on 
> node.stratio.com/172.19.0.96:2181
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Session establishment complete on server 
> node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710002, negotiated 
> timeout = 6000
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
> event: WatchedEvent state:SyncConnected type:None path:null
> Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
> state changed (SyncConnected)
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
> process event
> Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
> RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
> BlockGenerator called at time 1414656492400
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
> scheduler.
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], starting auto 
> committer every 60000 ms
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
> kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
> node.stratio.com:2181
> Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], entering consume 
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], begin registering 
> consumer stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a in ZK
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 5.5676E-5 s
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.IOException: PARSING_ERROR(2)
>       at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>       at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>       at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>       at 
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>       at 
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>       at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>       at 
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.IOException: PARSING_ERROR(2)
>       at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>       at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>       at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>       at 
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>       at 
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>       at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>       at 
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
> cap=2144]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
> cap=2144]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (1.213476 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (1.543991 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
> Actor[akka://sparkDriver/deadLetters]
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
>         org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>         org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>         org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>         
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>         
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>         org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>         
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
>         org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>         org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>         org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>         
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>         
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>         org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>         
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> Thread-84 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> Thread-85 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710002, packet:: 
> clientPath:null serverPath:null finished:false header:: 1,1  replyHeader:: 
> 1,25,-101  request:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a,#7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f616374696f6e223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363536343932343737227d,v{s{31,s{'world,'anyone}}},1
>   response::  
> {code}
> > Only the Spark version changed
> As far as we have checked, snappy throws this error when handed 
> zero-length byte arrays.
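That failure mode is easy to reproduce with any decompressor that must parse a stream header up front. Since snappy-java isn't assumed to be on the classpath here, this JDK-only sketch uses GZIP as an analogue: like SnappyInputStream.readHeader in the trace above, it fails immediately on zero-length input:

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

// Analogous stdlib sketch (GZIP stands in for snappy): a decompressor
// that reads a stream header fails immediately when handed a
// zero-length byte array, because there is no header to read.
public class ZeroLengthDecompress {
    public static String failureFor(byte[] data) {
        try (GZIPInputStream in =
                 new GZIPInputStream(new ByteArrayInputStream(data))) {
            return "ok";
        } catch (EOFException e) {
            return "EOFException";          // zero-length input: no header
        } catch (IOException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(failureFor(new byte[0]));
    }
}
```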
> We have tried:
> > Changing from snappy to LZF
> {code}
> Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
> ZookKeeper instance to connect to node.stratio.com:2181.
> ZkClient-EventThread-166-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
> Starting ZkClient event thread.
> Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
> connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
> watcher=org.I0Itec.zkclient.ZkClient@5a4f889
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection 
> to Zookeeper server
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
> state SyncConnected
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Opening socket connection to server 
> node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using 
> SASL (unknown error)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Socket connection established to 
> node.stratio.com/172.19.0.96:2181, initiating session
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Session establishment request sent on 
> node.stratio.com/172.19.0.96:2181
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Session establishment complete on server 
> node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710009, negotiated 
> timeout = 6000
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
> event: WatchedEvent state:SyncConnected type:None path:null
> Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
> state changed (SyncConnected)
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
> process event
> Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
> ProducerSendThread- DEBUG async.ProducerSendThread - 5000 ms elapsed. Queue 
> time reached. Sending..
> ProducerSendThread- DEBUG async.ProducerSendThread - Handling 0 events
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 1.002E-4 s
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
> scheduler.
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414658065894-94786a0e], starting auto 
> committer every 60000 ms
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
> kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
> node.stratio.com:2181
> Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414658065894-94786a0e], entering consume 
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> handled message (1.674221 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=179 lim=2066 cap=2066]) 
> from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> handled message (0.994221 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
> Actor[akka://sparkDriver/deadLetters]
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.JobGenerator - 
> Got event GenerateJobs(1414658066000 ms)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG streaming.DStreamGraph - 
> Generating jobs for time 1414658066000 ms
> RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
> BlockGenerator called at time 1414658066000
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> RecurringTimer - JobGenerator DEBUG util.RecurringTimer - Callback for 
> JobGenerator called at time 1414658066000
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> RecurringTimer - JobGenerator DEBUG util.RecurringTimer - Callback for 
> JobGenerator called at time 1414658066000
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.JobGenerator - 
> Got event GenerateJobs(1414658066000 ms)
> RecurringTimer - JobGenerator DEBUG util.RecurringTimer - Callback for 
> JobGenerator called at time 1414658066000
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.JobGenerator - 
> Got event GenerateJobs(1414658066000 ms)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG streaming.DStreamGraph - 
> Generating jobs for time 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG streaming.DStreamGraph - 
> Generating jobs for time 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG kafka.KafkaInputDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
> Stream 0 received 0 blocks
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MapValuedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.ShuffledDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> dstream.MapPartitionedDStream - Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG kafka.KafkaInputDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.ReceiverTracker - 
> Stream 0 received 0 blocks
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG kafka.KafkaInputDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
> Stream 0 received 0 blocks
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> Thread-85 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> sparkDriver-akka.actor.default-dispatcher-3 INFO  kafka.KafkaInputDStream - 
> Persisting RDD 1 for time 1414658066000 ms to StorageLevel(false, true, 
> false, false, 1) at time 1414658066000 ms
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414658065894-94786a0e], begin registering 
> consumer stratioStreaming_ajn-stratio-1414658065894-94786a0e in ZK
> Thread-84 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> sparkDriver-akka.actor.default-dispatcher-5 INFO  kafka.KafkaInputDStream - 
> Persisting RDD 1 for time 1414658066000 ms to StorageLevel(false, true, 
> false, false, 1) at time 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG streaming.DStreamGraph - 
> Generated 1 jobs for time 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.JobScheduler - 
> Added jobs for time 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.JobGenerator - 
> Got event DoCheckpoint(1414658066000 ms)
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.JobScheduler - 
> Starting job streaming job 1414658066000 ms.0 from job set of time 
> 1414658066000 ms
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.FilteredDStream - 
> Time 1414658066000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG dstream.MappedDStream - 
> Time 1414658066000 ms is valid
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710009, packet:: 
> clientPath:null serverPath:null finished:false header:: 1,1  replyHeader:: 
> 1,50,0  request:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414658065894-94786a0e,#7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f616374696f6e223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363538303636303136227d,v{s{31,s{'world,'anyone}}},1
>   response:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414658065894-94786a0e
>  
> {code}
> > Changing spark.broadcast.compress to false
> {code}
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 0.240869283 s
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
> Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
> Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG streaming.DStreamGraph - 
> Cleared old metadata for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-13 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 3
> Executor task launch worker-1 DEBUG storage.BlockManager - Getting local 
> block broadcast_1
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-13 INFO  storage.BlockManager - 
> Removing RDD 3
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (134.08408 ms) 
> RemoveRdd(3) from Actor[akka://sparkDriver/temp/$f]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(2) from 
> Actor[akka://sparkDriver/temp/$i]
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 2
> sparkDriver-akka.actor.default-dispatcher-4 INFO  storage.BlockManager - 
> Removing RDD 2
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (0.050955 ms) 
> RemoveRdd(2) from Actor[akka://sparkDriver/temp/$i]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(1) from 
> Actor[akka://sparkDriver/temp/$j]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 1
> sparkDriver-akka.actor.default-dispatcher-5 INFO  storage.BlockManager - 
> Removing RDD 1
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (0.037738 ms) 
> RemoveRdd(1) from Actor[akka://sparkDriver/temp/$j]
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 1, response is 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 2, response is 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 3, response is 0
> Executor task launch worker-1 DEBUG storage.BlockManager - Level for block 
> broadcast_1 is StorageLevel(true, true, false, true, 1)
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG streaming.DStreamGraph - 
> Generated 4 jobs for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.JobScheduler - 
> Added jobs for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.JobGenerator - 
> Got event DoCheckpoint(1414657344000 ms)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$j]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$i]
> Executor task launch worker-1 DEBUG storage.BlockManager - Getting block 
> broadcast_1 from memory
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$f]
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
> broadcast_0 is StorageLevel(true, true, false, true, 1)
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
> broadcast_0 from memory
> Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
> Executor task launch worker-1 DEBUG executor.Executor - Task 1's epoch is 0
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 7.0321E-5 s
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> handled message (1.681797 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> handled message (0.688875 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
> [actor] received message 
> Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
> ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
> sparkDriver-akka.actor.default-dispatcher-3 INFO  
> receiver.ReceiverSupervisorImpl - Registered receiver 0
> Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
> BlockGenerator at time 1414657344800
> Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
> BlockGenerator
> Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - 
> Starting receiver
> Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
> Consumer Stream with group: stratioStreaming
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
> Zookeeper: node.stratio.com:2181
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.ReceiverTracker - 
> Registered receiver for stream 0 from akka://sparkDriver
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerMasterActor - [actor] received message 
> BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
> from Actor[akka://sparkDriver/temp/$d]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerMasterActor - [actor] handled message (0.197908 ms) 
> BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
> from Actor[akka://sparkDriver/temp/$d]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
> [actor] handled message (169.804965 ms) 
> Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
> ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
> {code}
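> The workaround above can be reproduced by setting the property on the SparkConf before the StreamingContext is created. The following is a minimal sketch, not the reporter's actual driver code; the app name and batch interval are placeholders:
> {code}
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
>
> val conf = new SparkConf()
>   .setAppName("streaming-app")              // placeholder app name
>   .set("spark.broadcast.compress", "false") // disable Snappy compression of broadcast blocks
> // placeholder batch interval; the report does not state the real one
> val ssc = new StreamingContext(conf, Seconds(2))
> {code}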
> > Changing from TorrentBroadcast to HttpBroadcast ("spark.broadcast.factory", 
> > "org.apache.spark.broadcast.HttpBroadcastFactory").
> {code}
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.DAGScheduler - 
> Missing parents: List()
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> submitStage(Stage 1)
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> missing: List()
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.DAGScheduler - 
> Submitting Stage 1 (FilteredRDD[6] at filter at FilteredDStream.scala:35), 
> which has no missing parents
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> submitMissingTasks(Stage 1)
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710006, packet:: 
> clientPath:null serverPath:null finished:false header:: 7,4  replyHeader:: 
> 7,41,0  request:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414657757842-d7a2ca15,F
>   response:: 
> #7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f7265717565737473223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363537373538303535227d,s{41,41,1414657758409,1414657758409,0,0,0,92710854385008646,108,0,41}
>  
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> qtp1571833412-35 DEBUG http.HttpParser - filled 167/167
> RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
> BlockGenerator called at time 1414657758400
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.MappedDStream - 
> Time 1414657758000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG dstream.FilteredDStream - 
> Time 1414657758000 ms is valid
> qtp1571833412-35 - /broadcast_0 DEBUG server.Server - REQUEST /broadcast_0 on 
> BlockingHttpConnection@7cbd5b8f,g=HttpGenerator{s=0,h=-1,b=-1,c=-1},p=HttpParser{s=-5,l=10,c=0},r=1
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG streaming.DStreamGraph - 
> Generated 14 jobs for time 1414657758000 ms
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Added jobs for time 1414657758000 ms
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.JobScheduler - 
> Starting job streaming job 1414657758000 ms.0 from job set of time 
> 1414657758000 ms
> sparkDriver-akka.actor.default-dispatcher-12 DEBUG scheduler.JobGenerator - 
> Got event DoCheckpoint(1414657758000 ms)
> qtp1571833412-35 - /broadcast_0 DEBUG server.Server - RESPONSE /broadcast_0  
> 404 handled=true
> pool-7-thread-1 INFO  spark.SparkContext - Starting job: collect at 
> ActionBaseFunction.java:65
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710006, packet:: 
> clientPath:null serverPath:null finished:false header:: 8,8  replyHeader:: 
> 8,41,0  request:: '/consumers/stratioStreaming/ids,T  response:: 
> v{'stratioStreaming_ajn-stratio-1414657757842-d7a2ca15} 
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710006, packet:: 
> clientPath:null serverPath:null finished:false header:: 9,4  replyHeader:: 
> 9,41,0  request:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414657757842-d7a2ca15,F
>   response:: 
> #7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f7265717565737473223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363537373538303535227d,s{41,41,1414657758409,1414657758409,0,0,0,92710854385008646,108,0,41}
>  
> pool-7-thread-1 INFO  spark.SparkContext - Job finished: collect at 
> ActionBaseFunction.java:65, took 3.9409E-5 s
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
> broadcast_0 is StorageLevel(true, true, false, true, 1)
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
> broadcast_0 from memory
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Finished job streaming job 1414657758000 ms.0 from job set of time 
> 1414657758000 ms
> Executor task launch worker-0 INFO  storage.BlockManager - Found block 
> broadcast_0 locally
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Starting job streaming job 1414657758000 ms.1 from job set of time 
> 1414657758000 ms
> Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.FileNotFoundException: http://172.17.42.1:34477/broadcast_0
>       at 
> sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1624)
>       at 
> org.apache.spark.broadcast.HttpBroadcast$.org$apache$spark$broadcast$HttpBroadcast$$read(HttpBroadcast.scala:197)
>       at 
> org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:89)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> pool-7-thread-1 INFO  spark.SparkContext - Starting job: collect at 
> ActionBaseFunction.java:65
> pool-7-thread-1 INFO  spark.SparkContext - Job finished: collect at 
> ActionBaseFunction.java:65, took 3.1765E-5 s
> Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
> BlockGenerator at time 1414657758600
> Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
> BlockGenerator
> Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - 
> Starting receiver
> sparkDriver-akka.actor.default-dispatcher-2 INFO  storage.MemoryStore - 
> ensureFreeSpace(3136) called with curMem=1216, maxMem=991470551
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Finished job streaming job 1414657758000 ms.1 from job set of time 
> 1414657758000 ms
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Starting job streaming job 1414657758000 ms.2 from job set of time 
> 1414657758000 ms
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=1868 
> cap=1868]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 INFO  storage.MemoryStore - Block 
> broadcast_1 stored as values in memory (estimated size 3.1 KB, free 945.5 MB)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710006, packet:: 
> clientPath:null serverPath:null finished:false header:: 10,8  replyHeader:: 
> 10,41,0  request:: '/brokers/ids,F  response:: v{'7} 
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG storage.BlockManager - Put 
> block broadcast_1 locally took  7 ms
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG storage.BlockManager - 
> Putting block broadcast_1 without replication took  7 ms
> pool-7-thread-1 INFO  spark.SparkContext - Starting job: collect at 
> ActionBaseFunction.java:65
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
> Consumer Stream with group: stratioStreaming
> sparkDriver-akka.actor.default-dispatcher-3 INFO  
> receiver.ReceiverSupervisorImpl - Registered receiver 0
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
> Zookeeper: node.stratio.com:2181
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Verifying 
> properties
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> group.id is overridden to stratioStreaming
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connect is overridden to node.stratio.com:2181
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connection.timeout.ms is overridden to 10000
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414657758445-7b49bb3b], Connecting to 
> zookeeper instance at node.stratio.com:2181
> sparkDriver-akka.actor.default-dispatcher-4 INFO  scheduler.ReceiverTracker - 
> Registered receiver for stream 0 from akka://sparkDriver
> Thread-99 INFO  receiver.BlockGenerator - Started block pushing thread
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.DAGScheduler - 
> Submitting 2 missing tasks from Stage 1 (FilteredRDD[6] at filter at 
> FilteredDStream.scala:35)
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> New pending tasks: Set(ResultTask(1, 1), ResultTask(1, 0))
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl 
> - Adding task set 1.0 with 2 tasks
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSetManager - 
> Epoch for TaskSet 1.0: 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSetManager - 
> Valid locality levels for TaskSet 1.0: NO_PREF, ANY
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710006, packet:: 
> clientPath:null serverPath:null finished:false header:: 11,4  replyHeader:: 
> 11,41,0  request:: '/brokers/ids/7,F  response:: 
> #7b226a6d785f706f7274223a393939392c2274696d657374616d70223a2231343134363535333735373234222c22686f7374223a226e6f64652e7374726174696f2e636f6d222c2276657273696f6e223a312c22706f7274223a393039327d,s{18,18,1414655375792,1414655375792,0,0,0,92710854385008640,95,0,18}
>  
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> received message ReviveOffers from Actor[akka://sparkDriver/deadLetters]
> pool-7-thread-1 INFO  spark.SparkContext - Job finished: collect at 
> ActionBaseFunction.java:65, took 3.0385E-5 s
> Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
> ZookKeeper instance to connect to node.stratio.com:2181.
> Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
> connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
> watcher=org.I0Itec.zkclient.ZkClient@2fdc4517
> ZkClient-EventThread-189-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
> Starting ZkClient event thread.
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 1
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> handled message (4.883443 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=1868 cap=1868]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_1, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.TaskSetManager - 
> Starting task 0.0 in stage 1.0 (TID 1, localhost, PROCESS_LOCAL, 880 bytes)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Opening socket connection to server 
> node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using 
> SASL (unknown error)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Socket connection established to 
> node.stratio.com/172.19.0.96:2181, initiating session
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Session establishment request sent on 
> node.stratio.com/172.19.0.96:2181
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Session establishment complete on server 
> node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710007, negotiated 
> timeout = 6000
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection 
> to Zookeeper server
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
> state SyncConnected
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Finished job streaming job 1414657758000 ms.2 from job set of time 
> 1414657758000 ms
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
> event: WatchedEvent state:SyncConnected type:None path:null
> Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
> state changed (SyncConnected)
> sparkDriver-akka.actor.default-dispatcher-12 INFO  scheduler.JobScheduler - 
> Starting job streaming job 1414657758000 ms.3 from job set of time 
> 1414657758000 ms
> Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.FileNotFoundException: 
> http://172.17.42.1:34477/broadcast_0
>         
> sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1624)
>         
> org.apache.spark.broadcast.HttpBroadcast$.org$apache$spark$broadcast$HttpBroadcast$$read(HttpBroadcast.scala:197)
>         
> org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:89)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
> process event
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
> scheduler.
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> handled message (7.610459 ms) ReviveOffers from 
> Actor[akka://sparkDriver/deadLetters]
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414657758445-7b49bb3b], starting auto 
> committer every 60000 ms
> Executor task launch worker-1 INFO  executor.Executor - Running task 0.0 in 
> stage 1.0 (TID 1)
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
> kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(1,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
> node.stratio.com:2181
> Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414657758445-7b49bb3b], entering consume 
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> handled message (0.07141 ms) 
> StatusUpdate(1,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> {code}
> but with no luck so far.
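> For reference, the factory switch described above amounts to a one-line 
> configuration change; a minimal sketch (the property name and class are the 
> ones quoted above, usable either via SparkConf.set or spark-defaults.conf):
> {code}
> # spark-defaults.conf — force HttpBroadcast instead of the default TorrentBroadcast
> spark.broadcast.factory  org.apache.spark.broadcast.HttpBroadcastFactory
> {code}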



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
