[ 
https://issues.apache.org/jira/browse/SPARK-4133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Antonio Jesus Navarro updated SPARK-4133:
-----------------------------------------
    Description: 
Snappy-related problems found when trying to upgrade an existing Spark Streaming
app from 1.0.2 to 1.1.0.

We cannot run an existing 1.0.2 Spark app once it is upgraded to 1.1.0.

> An IOException is thrown by Snappy (PARSING_ERROR(2))

{code}
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
broadcast_0 is StorageLevel(true, true, false, true, 1)
Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
broadcast_0 from memory
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
not registered locally
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
reading broadcast variable 0
sparkDriver-akka.actor.default-dispatcher-4 INFO  
receiver.ReceiverSupervisorImpl - Registered receiver 0
Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
BlockGenerator at time 1414656492400
Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
BlockGenerator
Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - Starting 
receiver
sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
Registered receiver for stream 0 from akka://sparkDriver
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
Consumer Stream with group: stratioStreaming
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
Zookeeper: node.stratio.com:2181
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (8.442354 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (8.412421 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
handled message (8.385471 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
Executor task launch worker-0 INFO  utils.VerifiableProperties - Verifying 
properties
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
group.id is overridden to stratioStreaming
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
zookeeper.connect is overridden to node.stratio.com:2181
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
zookeeper.connection.timeout.ms is overridden to 10000
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 0.033998997 s
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], Connecting to zookeeper 
instance at node.stratio.com:2181
Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
ZookKeeper instance to connect to node.stratio.com:2181.
ZkClient-EventThread-169-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
Starting ZkClient event thread.
Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
watcher=org.I0Itec.zkclient.ZkClient@5b4bdc81
Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection to 
Zookeeper server
Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
state SyncConnected
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Opening socket connection to server 
node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using SASL 
(unknown error)
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Socket connection established to 
node.stratio.com/172.19.0.96:2181, initiating session
Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
zookeeper.ClientCnxn - Session establishment request sent on 
node.stratio.com/172.19.0.96:2181
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Session establishment complete on server 
node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710002, negotiated 
timeout = 6000
Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
event: WatchedEvent state:SyncConnected type:None path:null
Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
state changed (SyncConnected)
Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
process event
Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
BlockGenerator called at time 1414656492400
Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
scheduler.
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], starting auto committer 
every 60000 ms
Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
node.stratio.com:2181
Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], entering consume 
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], begin registering 
consumer stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a in ZK
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
not registered locally
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
reading broadcast variable 0
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 5.5676E-5 s
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.IOException: PARSING_ERROR(2)
        at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        at 
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        at 
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        at 
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.IOException: PARSING_ERROR(2)
        at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        at 
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        at 
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        at 
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
cap=2144]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
cap=2144]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (1.213476 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (1.543991 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
Actor[akka://sparkDriver/deadLetters]
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
        org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
        org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl - 
Cancelling stage 0
sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl - 
Cancelling stage 0
Thread-84 INFO  scheduler.DAGScheduler - Failed to run runJob at 
ReceiverTracker.scala:275
Thread-85 INFO  scheduler.DAGScheduler - Failed to run runJob at 
ReceiverTracker.scala:275
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
Removing running stage 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
Removing running stage 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
After removal of stage 0, remaining stages = 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
After removal of stage 0, remaining stages = 0
Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710002, packet:: 
clientPath:null serverPath:null finished:false header:: 1,1  replyHeader:: 
1,25,-101  request:: 
'/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a,#7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f616374696f6e223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363536343932343737227d,v{s{31,s{'world,'anyone}}},1
  response::  
{code}
> Only the Spark version changed

As far as we have checked, Snappy throws this error when it is handed a
zero-length byte array.
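
A minimal reproduction sketch of that hypothesis (hypothetical, not taken from the app; it assumes snappy-java's SnappyInputStream behaves the same standalone as it does inside TorrentBroadcast.unBlockifyObject):

{code}
import java.io.ByteArrayInputStream
import org.xerial.snappy.SnappyInputStream

object ZeroLengthRepro {
  def main(args: Array[String]): Unit = {
    // Hand an empty (zero-length) buffer to the Snappy stream, which is what we
    // suspect TorrentBroadcast does for an empty broadcast block. If the
    // hypothesis holds, the constructor fails while reading the stream header
    // with java.io.IOException: PARSING_ERROR(2), matching the stack traces above.
    new SnappyInputStream(new ByteArrayInputStream(new Array[Byte](0)))
  }
}
{code}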

We have tried the following (see the configuration sketch after this list):

> Changing the compression codec from Snappy to LZF
> Setting spark.broadcast.compress to false
{code}
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 0.240869283 s
sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
sparkDriver-akka.actor.default-dispatcher-4 DEBUG streaming.DStreamGraph - 
Cleared old metadata for time 1414657344000 ms
sparkDriver-akka.actor.default-dispatcher-13 DEBUG 
storage.BlockManagerSlaveActor - removing RDD 3
Executor task launch worker-1 DEBUG storage.BlockManager - Getting local block 
broadcast_1
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - Time 
1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-13 INFO  storage.BlockManager - 
Removing RDD 3
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - [actor] handled message (134.08408 ms) 
RemoveRdd(3) from Actor[akka://sparkDriver/temp/$f]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(2) from 
Actor[akka://sparkDriver/temp/$i]
sparkDriver-akka.actor.default-dispatcher-4 DEBUG 
storage.BlockManagerSlaveActor - removing RDD 2
sparkDriver-akka.actor.default-dispatcher-4 INFO  storage.BlockManager - 
Removing RDD 2
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - [actor] handled message (0.050955 ms) 
RemoveRdd(2) from Actor[akka://sparkDriver/temp/$i]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(1) from 
Actor[akka://sparkDriver/temp/$j]
sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
storage.BlockManagerSlaveActor - removing RDD 1
sparkDriver-akka.actor.default-dispatcher-5 INFO  storage.BlockManager - 
Removing RDD 1
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - [actor] handled message (0.037738 ms) 
RemoveRdd(1) from Actor[akka://sparkDriver/temp/$j]
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.EOFException
        at 
java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
        at 
java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
        at 
java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
        at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        at 
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
        at 
org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
storage.BlockManagerSlaveActor - Done removing RDD 1, response is 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - Done removing RDD 2, response is 0
sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
storage.BlockManagerSlaveActor - Done removing RDD 3, response is 0
Executor task launch worker-1 DEBUG storage.BlockManager - Level for block 
broadcast_1 is StorageLevel(true, true, false, true, 1)
sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - Time 
1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-3 DEBUG streaming.DStreamGraph - 
Generated 4 jobs for time 1414657344000 ms
sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.JobScheduler - 
Added jobs for time 1414657344000 ms
sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.JobGenerator - Got 
event DoCheckpoint(1414657344000 ms)
sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
storage.BlockManagerSlaveActor - Sent response: 0 to 
Actor[akka://sparkDriver/temp/$j]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
storage.BlockManagerSlaveActor - Sent response: 0 to 
Actor[akka://sparkDriver/temp/$i]
Executor task launch worker-1 DEBUG storage.BlockManager - Getting block 
broadcast_1 from memory
sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
storage.BlockManagerSlaveActor - Sent response: 0 to 
Actor[akka://sparkDriver/temp/$f]
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
broadcast_0 is StorageLevel(true, true, false, true, 1)
Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
broadcast_0 from memory
Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
Executor task launch worker-1 DEBUG executor.Executor - Task 1's epoch is 0
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
not registered locally
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
reading broadcast variable 0
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - Time 
1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 7.0321E-5 s
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.EOFException
        at 
java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
        at 
java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
        at 
java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
        at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        at 
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
        at 
org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
cap=2066]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
cap=2066]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
handled message (1.681797 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
handled message (0.688875 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - Time 
1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
[actor] received message 
Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
sparkDriver-akka.actor.default-dispatcher-3 INFO  
receiver.ReceiverSupervisorImpl - Registered receiver 0
Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
BlockGenerator at time 1414657344800
Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
BlockGenerator
Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - Starting 
receiver
Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.EOFException: 
        
java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
        
java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
        java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
        java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
        
org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
        
org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.EOFException: 
        
java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
        
java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
        java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
        java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
        
org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
        
org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
Consumer Stream with group: stratioStreaming
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
Zookeeper: node.stratio.com:2181
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - Time 
1414657344000 ms is valid
sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
Time 1414657344000 ms is valid
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.ReceiverTracker - 
Registered receiver for stream 0 from akka://sparkDriver
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
storage.BlockManagerMasterActor - [actor] received message 
BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
from Actor[akka://sparkDriver/temp/$d]
sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
storage.BlockManagerMasterActor - [actor] handled message (0.197908 ms) 
BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
from Actor[akka://sparkDriver/temp/$d]
sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
[actor] handled message (169.804965 ms) 
Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
{code}
> Switching from TorrentBroadcast to HttpBroadcast

but with no luck so far.
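
For reference, a sketch of how we toggled those options (one at a time) in the driver configuration; this assumes a plain SparkConf-based setup, the object and app names are placeholders, and the property names are the standard Spark 1.1.0 keys:

{code}
import org.apache.spark.SparkConf

object ConfSketch {
  // Settings we experimented with, toggled one at a time (not all together).
  val conf = new SparkConf()
    .setAppName("streaming-app") // hypothetical app name
    // 1) switch the compression codec from Snappy to LZF
    .set("spark.io.compression.codec", "org.apache.spark.io.LZFCompressionCodec")
    // 2) disable compression of broadcast variables entirely
    .set("spark.broadcast.compress", "false")
    // 3) fall back from TorrentBroadcast to the HTTP-based broadcast
    .set("spark.broadcast.factory", "org.apache.spark.broadcast.HttpBroadcastFactory")
}
{code}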



  was:
Snappy-related problems found when trying to upgrade an existing Spark Streaming
app from 1.0.2 to 1.1.0.

We cannot run an existing 1.0.2 Spark app once it is upgraded to 1.1.0.

> An IOException is thrown by Snappy (PARSING_ERROR(2))
> Only the Spark version changed

As far as we have checked, Snappy throws this error when it is handed a
zero-length byte array.

We have tried:

> Changing the compression codec from Snappy to LZF
> Changing broadcast.compression to false
> Switching from TorrentBroadcast to HttpBroadcast

but with no luck for the moment.

{code}
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
broadcast_0 is StorageLevel(true, true, false, true, 1)
Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
broadcast_0 from memory
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
not registered locally
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
reading broadcast variable 0
sparkDriver-akka.actor.default-dispatcher-4 INFO  
receiver.ReceiverSupervisorImpl - Registered receiver 0
Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
BlockGenerator at time 1414656492400
Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
BlockGenerator
Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - Starting 
receiver
sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
Registered receiver for stream 0 from akka://sparkDriver
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
Consumer Stream with group: stratioStreaming
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
Zookeeper: node.stratio.com:2181
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
cap=0]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (8.442354 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (8.412421 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
handled message (8.385471 ms) 
StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
Actor[akka://sparkDriver/deadLetters]
Executor task launch worker-0 INFO  utils.VerifiableProperties - Verifying 
properties
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
group.id is overridden to stratioStreaming
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
zookeeper.connect is overridden to node.stratio.com:2181
Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
zookeeper.connection.timeout.ms is overridden to 10000
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 0.033998997 s
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], Connecting to zookeeper 
instance at node.stratio.com:2181
Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
ZookKeeper instance to connect to node.stratio.com:2181.
ZkClient-EventThread-169-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
Starting ZkClient event thread.
Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
watcher=org.I0Itec.zkclient.ZkClient@5b4bdc81
Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection to 
Zookeeper server
Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
state SyncConnected
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Opening socket connection to server 
node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using SASL 
(unknown error)
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Socket connection established to 
node.stratio.com/172.19.0.96:2181, initiating session
Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
zookeeper.ClientCnxn - Session establishment request sent on 
node.stratio.com/172.19.0.96:2181
Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
zookeeper.ClientCnxn - Session establishment complete on server 
node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710002, negotiated 
timeout = 6000
Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
event: WatchedEvent state:SyncConnected type:None path:null
Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
state changed (SyncConnected)
Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
process event
Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
BlockGenerator called at time 1414656492400
Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
scheduler.
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], starting auto committer 
every 60000 ms
Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
node.stratio.com:2181
Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], entering consume 
Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
[stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], begin registering 
consumer stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a in ZK
Executor task launch worker-0 DEBUG storage.BlockManager - Getting local block 
broadcast_0
Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
not registered locally
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
reading broadcast variable 0
Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
broadcast variable 0 took 5.5676E-5 s
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.IOException: PARSING_ERROR(2)
        at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        at 
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        at 
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        at 
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
in stage 0.0 (TID 0)
java.io.IOException: PARSING_ERROR(2)
        at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        at 
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        at 
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        at 
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        at 
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        at 
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
cap=2144]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
cap=2144]) from Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl - 
parentName: , name: TaskSet_0, runningTasks: 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (1.213476 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
Actor[akka://sparkDriver/deadLetters]
sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
handled message (1.543991 ms) 
StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
Actor[akka://sparkDriver/deadLetters]
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
        org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
        org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
        org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        
org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        
org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        
org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
        
org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
        
org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
        sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        java.lang.reflect.Method.invoke(Method.java:606)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
        
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
failed 1 times; aborting job
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
0.0, whose tasks have all completed, from pool 
sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl - 
Cancelling stage 0
sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl - 
Cancelling stage 0
Thread-84 INFO  scheduler.DAGScheduler - Failed to run runJob at 
ReceiverTracker.scala:275
Thread-85 INFO  scheduler.DAGScheduler - Failed to run runJob at 
ReceiverTracker.scala:275
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
Removing running stage 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
Removing running stage 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
After removal of stage 0, remaining stages = 0
sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
After removal of stage 0, remaining stages = 0
Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710002, packet:: 
clientPath:null serverPath:null finished:false header:: 1,1  replyHeader:: 
1,25,-101  request:: 
'/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a,#7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f616374696f6e223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363536343932343737227d,v{s{31,s{'world,'anyone}}},1
  response::  
{code}


> PARSING_ERROR(2) when upgrading issues from 1.0.2 to 1.1.0
> ----------------------------------------------------------
>
>                 Key: SPARK-4133
>                 URL: https://issues.apache.org/jira/browse/SPARK-4133
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.1.0
>            Reporter: Antonio Jesus Navarro
>            Priority: Blocker
>
> Snappy-related problems found when trying to upgrade an existing Spark 
> Streaming app from 1.0.2 to 1.1.0.
> We cannot run an existing 1.0.2 Spark app once it is upgraded to 1.1.0.
> > An IOException is thrown by Snappy (PARSING_ERROR(2))
> {code}
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
> broadcast_0 is StorageLevel(true, true, false, true, 1)
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
> broadcast_0 from memory
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> sparkDriver-akka.actor.default-dispatcher-4 INFO  
> receiver.ReceiverSupervisorImpl - Registered receiver 0
> Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
> BlockGenerator at time 1414656492400
> Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
> BlockGenerator
> Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
> Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - 
> Starting receiver
> sparkDriver-akka.actor.default-dispatcher-5 INFO  scheduler.ReceiverTracker - 
> Registered receiver for stream 0 from akka://sparkDriver
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
> Consumer Stream with group: stratioStreaming
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
> Zookeeper: node.stratio.com:2181
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 
> cap=0]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (8.442354 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (8.412421 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-6 DEBUG local.LocalActor - [actor] 
> handled message (8.385471 ms) 
> StatusUpdate(0,RUNNING,java.nio.HeapByteBuffer[pos=0 lim=0 cap=0]) from 
> Actor[akka://sparkDriver/deadLetters]
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Verifying 
> properties
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> group.id is overridden to stratioStreaming
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connect is overridden to node.stratio.com:2181
> Executor task launch worker-0 INFO  utils.VerifiableProperties - Property 
> zookeeper.connection.timeout.ms is overridden to 10000
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 0.033998997 s
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], Connecting to 
> zookeeper instance at node.stratio.com:2181
> Executor task launch worker-0 DEBUG zkclient.ZkConnection - Creating new 
> ZookKeeper instance to connect to node.stratio.com:2181.
> ZkClient-EventThread-169-node.stratio.com:2181 INFO  zkclient.ZkEventThread - 
> Starting ZkClient event thread.
> Executor task launch worker-0 INFO  zookeeper.ZooKeeper - Initiating client 
> connection, connectString=node.stratio.com:2181 sessionTimeout=6000 
> watcher=org.I0Itec.zkclient.ZkClient@5b4bdc81
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Awaiting connection 
> to Zookeeper server
> Executor task launch worker-0 DEBUG zkclient.ZkClient - Waiting for keeper 
> state SyncConnected
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Opening socket connection to server 
> node.stratio.com/172.19.0.96:2181. Will not attempt to authenticate using 
> SASL (unknown error)
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Socket connection established to 
> node.stratio.com/172.19.0.96:2181, initiating session
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Session establishment request sent on 
> node.stratio.com/172.19.0.96:2181
> Executor task launch worker-0-SendThread(node.stratio.com:2181) INFO  
> zookeeper.ClientCnxn - Session establishment complete on server 
> node.stratio.com/172.19.0.96:2181, sessionid = 0x1496007e6710002, negotiated 
> timeout = 6000
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Received 
> event: WatchedEvent state:SyncConnected type:None path:null
> Executor task launch worker-0-EventThread INFO  zkclient.ZkClient - zookeeper 
> state changed (SyncConnected)
> Executor task launch worker-0-EventThread DEBUG zkclient.ZkClient - Leaving 
> process event
> Executor task launch worker-0 DEBUG zkclient.ZkClient - State is SyncConnected
> RecurringTimer - BlockGenerator DEBUG util.RecurringTimer - Callback for 
> BlockGenerator called at time 1414656492400
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Initializing task 
> scheduler.
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], starting auto 
> committer every 60000 ms
> Executor task launch worker-0 DEBUG utils.KafkaScheduler - Scheduling task 
> kafka-consumer-autocommit with initial delay 60000 ms and period 60000 ms.
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connected to 
> node.stratio.com:2181
> Executor task launch worker-0 DEBUG consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], entering consume 
> Executor task launch worker-0 INFO  consumer.ZookeeperConsumerConnector - 
> [stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a], begin registering 
> consumer stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a in ZK
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 5.5676E-5 s
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.IOException: PARSING_ERROR(2)
>       at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>       at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>       at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>       at 
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>       at 
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>       at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>       at 
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.IOException: PARSING_ERROR(2)
>       at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>       at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>       at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>       at 
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>       at 
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>       at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>       at 
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
> cap=2144]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 
> cap=2144]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (1.213476 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG local.LocalActor - [actor] 
> handled message (1.543991 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2144 cap=2144]) from 
> Actor[akka://sparkDriver/deadLetters]
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
>         org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>         org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>         org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>         
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>         
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>         org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>         
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.IOException: PARSING_ERROR(2)
>         org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
>         org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
>         org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
>         
> org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>         
> org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>         org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>         
> org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.TaskSchedulerImpl 
> - Cancelling stage 0
> Thread-84 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> Thread-85 INFO  scheduler.DAGScheduler - Failed to run runJob at 
> ReceiverTracker.scala:275
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> Removing running stage 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG scheduler.DAGScheduler - 
> After removal of stage 0, remaining stages = 0
> Executor task launch worker-0-SendThread(node.stratio.com:2181) DEBUG 
> zookeeper.ClientCnxn - Reading reply sessionid:0x1496007e6710002, packet:: 
> clientPath:null serverPath:null finished:false header:: 1,1  replyHeader:: 
> 1,25,-101  request:: 
> '/consumers/stratioStreaming/ids/stratioStreaming_ajn-stratio-1414656492293-8ecb3e3a,#7b2276657273696f6e223a312c22737562736372697074696f6e223a7b227374726174696f5f73747265616d696e675f616374696f6e223a317d2c227061747465726e223a22737461746963222c2274696d657374616d70223a2231343134363536343932343737227d,v{s{31,s{'world,'anyone}}},1
>   response::  
> {code}
> > Only the Spark version changed.
> As far as we have checked, Snappy throws this error when handed a 
> zero-length byte array.
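> A minimal repro sketch, assuming the zero-length diagnosis above holds: it 
> wraps an empty block in a SnappyInputStream, the same entry point 
> SnappyCompressionCodec.compressedInputStream uses in the trace; whether it 
> actually raises PARSING_ERROR(2) depends on the snappy-java version on the 
> classpath.
> {code}
> import java.io.ByteArrayInputStream
> import org.xerial.snappy.SnappyInputStream
> 
> // A zero-length "compressed" block, standing in for the suspected broadcast payload.
> val empty = new ByteArrayInputStream(new Array[Byte](0))
> 
> // The constructor parses the snappy header, which is where the trace above fails
> // (SnappyInputStream.<init> -> readHeader -> Snappy.uncompressedLength).
> new SnappyInputStream(empty)
> {code}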
> We have tried (a SparkConf sketch of these settings follows the last item below):
> > Changing the compression codec from Snappy to LZF,
> > Setting spark.broadcast.compress to false
> {code}
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 0.240869283 s
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
> Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG kafka.KafkaInputDStream - 
> Cleared 1 RDDs that were older than 1414657342000 ms: 1414657342000 ms
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG streaming.DStreamGraph - 
> Cleared old metadata for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-13 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 3
> Executor task launch worker-1 DEBUG storage.BlockManager - Getting local 
> block broadcast_1
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-13 INFO  storage.BlockManager - 
> Removing RDD 3
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (134.08408 ms) 
> RemoveRdd(3) from Actor[akka://sparkDriver/temp/$f]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(2) from 
> Actor[akka://sparkDriver/temp/$i]
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 2
> sparkDriver-akka.actor.default-dispatcher-4 INFO  storage.BlockManager - 
> Removing RDD 2
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (0.050955 ms) 
> RemoveRdd(2) from Actor[akka://sparkDriver/temp/$i]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] received message RemoveRdd(1) from 
> Actor[akka://sparkDriver/temp/$j]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - removing RDD 1
> sparkDriver-akka.actor.default-dispatcher-5 INFO  storage.BlockManager - 
> Removing RDD 1
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - [actor] handled message (0.037738 ms) 
> RemoveRdd(1) from Actor[akka://sparkDriver/temp/$j]
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 1, response is 0
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 2, response is 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerSlaveActor - Done removing RDD 3, response is 0
> Executor task launch worker-1 DEBUG storage.BlockManager - Level for block 
> broadcast_1 is StorageLevel(true, true, false, true, 1)
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG streaming.DStreamGraph - 
> Generated 4 jobs for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-3 INFO  scheduler.JobScheduler - 
> Added jobs for time 1414657344000 ms
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.JobGenerator - 
> Got event DoCheckpoint(1414657344000 ms)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$j]
> sparkDriver-akka.actor.default-dispatcher-2 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$i]
> Executor task launch worker-1 DEBUG storage.BlockManager - Getting block 
> broadcast_1 from memory
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerSlaveActor - Sent response: 0 to 
> Actor[akka://sparkDriver/temp/$f]
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Level for block 
> broadcast_0 is StorageLevel(true, true, false, true, 1)
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting block 
> broadcast_0 from memory
> Executor task launch worker-0 DEBUG executor.Executor - Task 0's epoch is 0
> Executor task launch worker-1 DEBUG executor.Executor - Task 1's epoch is 0
> Executor task launch worker-0 DEBUG storage.BlockManager - Getting local 
> block broadcast_0
> Executor task launch worker-0 DEBUG storage.BlockManager - Block broadcast_0 
> not registered locally
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Started 
> reading broadcast variable 0
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> Executor task launch worker-0 INFO  broadcast.TorrentBroadcast - Reading 
> broadcast variable 0 took 7.0321E-5 s
> Executor task launch worker-0 ERROR executor.Executor - Exception in task 0.0 
> in stage 0.0 (TID 0)
> java.io.EOFException
>       at 
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>       at 
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>       at 
> java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>       at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>       at 
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>       at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> received message StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 
> cap=2066]) from Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG scheduler.TaskSchedulerImpl 
> - parentName: , name: TaskSet_0, runningTasks: 0
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG local.LocalActor - [actor] 
> handled message (1.681797 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG local.LocalActor - [actor] 
> handled message (0.688875 ms) 
> StatusUpdate(0,FAILED,java.nio.HeapByteBuffer[pos=0 lim=2066 cap=2066]) from 
> Actor[akka://sparkDriver/deadLetters]
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
> [actor] received message 
> Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
> ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
> sparkDriver-akka.actor.default-dispatcher-3 INFO  
> receiver.ReceiverSupervisorImpl - Registered receiver 0
> Executor task launch worker-0 INFO  util.RecurringTimer - Started timer for 
> BlockGenerator at time 1414657344800
> Executor task launch worker-0 INFO  receiver.BlockGenerator - Started 
> BlockGenerator
> Executor task launch worker-0 INFO  receiver.ReceiverSupervisorImpl - 
> Starting receiver
> Thread-87 INFO  receiver.BlockGenerator - Started block pushing thread
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Result resolver thread-0 WARN  scheduler.TaskSetManager - Lost task 0.0 in 
> stage 0.0 (TID 0, localhost): java.io.EOFException: 
>         
> java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2325)
>         
> java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
>         java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
>         java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
>         
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:95)
>         
> org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:235)
>         
> org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
>         
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:745)
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Starting Kafka 
> Consumer Stream with group: stratioStreaming
> Executor task launch worker-0 INFO  kafka.KafkaReceiver - Connecting to 
> Zookeeper: node.stratio.com:2181
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.MappedDStream - 
> Time 1414657344000 ms is valid
> sparkDriver-akka.actor.default-dispatcher-4 DEBUG dstream.FilteredDStream - 
> Time 1414657344000 ms is valid
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> Result resolver thread-0 ERROR scheduler.TaskSetManager - Task 0 in stage 0.0 
> failed 1 times; aborting job
> sparkDriver-akka.actor.default-dispatcher-2 INFO  scheduler.ReceiverTracker - 
> Registered receiver for stream 0 from akka://sparkDriver
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> Result resolver thread-0 INFO  scheduler.TaskSchedulerImpl - Removed TaskSet 
> 0.0, whose tasks have all completed, from pool 
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerMasterActor - [actor] received message 
> BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
> from Actor[akka://sparkDriver/temp/$d]
> sparkDriver-akka.actor.default-dispatcher-3 DEBUG 
> storage.BlockManagerMasterActor - [actor] handled message (0.197908 ms) 
> BlockManagerHeartbeat(BlockManagerId(<driver>, ajn-stratio.local, 53377, 0)) 
> from Actor[akka://sparkDriver/temp/$d]
> sparkDriver-akka.actor.default-dispatcher-5 DEBUG spark.HeartbeatReceiver - 
> [actor] handled message (169.804965 ms) 
> Heartbeat(localhost,[Lscala.Tuple2;@1a94802,BlockManagerId(<driver>, 
> ajn-stratio.local, 53377, 0)) from Actor[akka://sparkDriver/temp/$c]
> {code}
> > Changing the broadcast factory from TorrentBroadcast to HttpBroadcast,
> but with no luck so far.
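> A minimal sketch of how the three workarounds listed above could be applied 
> through SparkConf; the property names are standard Spark 1.x configuration 
> keys, while the app name and local master are illustrative assumptions, not 
> taken from the report.
> {code}
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> 
> // The workarounds were tried one at a time, not all together.
> val conf = new SparkConf()
>   .setAppName("streaming-upgrade-test")  // hypothetical app name
>   .setMaster("local[2]")                 // local mode, as in the logs above
>   // 1) compression codec: Snappy -> LZF
>   .set("spark.io.compression.codec", "org.apache.spark.io.LZFCompressionCodec")
>   // 2) disable broadcast compression entirely
>   .set("spark.broadcast.compress", "false")
>   // 3) fall back from TorrentBroadcast to HttpBroadcast
>   .set("spark.broadcast.factory", "org.apache.spark.broadcast.HttpBroadcastFactory")
> 
> val ssc = new StreamingContext(conf, Seconds(2))
> {code}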



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
