[ https://issues.apache.org/jira/browse/NIFI-1953?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

pradeep arumalla updated NIFI-1953:
-----------------------------------
    Description: 
 
When I use StorageLevel.MEMORY_AND_DISK(), Apache Spark throws the exception below. I cannot always use MEMORY_ONLY(): when too much data comes in, I get "Could not compute split, block input-0-1464774108087 not found" exceptions with StorageLevel.MEMORY_ONLY().

JavaReceiverInputDStream packetStream =
        ssc.receiverStream(new NiFiReceiver(config, StorageLevel.MEMORY_AND_DISK()));

I get the exception below:


16/06/01 12:50:28 ERROR scheduler.ReceiverTracker: Deregistered receiver for stream 0: Restarting receiver with delay 2000ms: Failed to receive data from NiFi - java.io.NotSerializableException: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1
Serialization stack:
        - object not serializable (class: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1, value: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1@70fb979)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
        at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:153)
        at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1189)
        at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1198)
        at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:131)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:168)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:142)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:790)
        at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:637)
        at org.apache.spark.streaming.receiver.BlockManagerBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:81)
        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:141)
        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushIterator(ReceiverSupervisorImpl.scala:121)
        at org.apache.spark.streaming.receiver.Receiver.store(Receiver.scala:152)
        at org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable.run(NiFiReceiver.java:182)
        at java.lang.Thread.run(Thread.java:745)
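The failure pattern in the trace can be reproduced without Spark or NiFi. An anonymous inner class (the `$1` in `NiFiReceiver$ReceiveRunnable$1`) that uses members of its enclosing instance keeps a hidden reference to that instance, so Java serialization tries to write the outer object too; if the outer object is not Serializable, writeObject fails. A minimal sketch (all class names below are hypothetical stand-ins, not the actual NiFi code):

```java
import java.io.*;

// Minimal reproduction of the NotSerializableException pattern from the
// stack trace, with hypothetical stand-in class names.
public class SerializationDemo {

    interface DataPacket extends Serializable {
        byte[] getContent();
    }

    // Stands in for the receiver thread; deliberately NOT Serializable.
    static class Receiver {
        private final String name = "receiver-1";

        DataPacket anonymousPacket(final byte[] content) {
            // Anonymous class: compiles to Receiver$1. Because it reads the
            // outer field `name`, it holds a hidden reference to the
            // enclosing Receiver, which is not Serializable.
            return new DataPacket() {
                public byte[] getContent() { return content; }
                public String toString() { return name + ":packet"; }
            };
        }
    }

    // Fix pattern: a static nested class has no hidden outer reference and
    // carries only serializable fields.
    static class StandardPacket implements DataPacket {
        private final byte[] content;
        StandardPacket(byte[] content) { this.content = content; }
        public byte[] getContent() { return content; }
    }

    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;   // same failure mode as the reported exception
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] data = {1, 2, 3};
        System.out.println(serializes(new Receiver().anonymousPacket(data))); // false
        System.out.println(serializes(new StandardPacket(data)));             // true
    }
}
```

A likely fix for this pattern is to replace the anonymous class with a static nested (or top-level) class implementing Serializable, as StandardPacket does above, so that a MEMORY_AND_DISK store can serialize the block.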

  was:
 
When I use StorageLevel.MEMORY_AND_DISK(), it throws the exception below.

JavaReceiverInputDStream packetStream =
        ssc.receiverStream(new NiFiReceiver(config, StorageLevel.MEMORY_AND_DISK()));

I get the exception below:


16/06/01 12:50:28 ERROR scheduler.ReceiverTracker: Deregistered receiver for stream 0: Restarting receiver with delay 2000ms: Failed to receive data from NiFi - java.io.NotSerializableException: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1
Serialization stack:
        - object not serializable (class: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1, value: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1@70fb979)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
        at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:153)
        at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1189)
        at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1198)
        at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:131)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:168)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:142)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:790)
        at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:637)
        at org.apache.spark.streaming.receiver.BlockManagerBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:81)
        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:141)
        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushIterator(ReceiverSupervisorImpl.scala:121)
        at org.apache.spark.streaming.receiver.Receiver.store(Receiver.scala:152)
        at org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable.run(NiFiReceiver.java:182)
        at java.lang.Thread.run(Thread.java:745)


> object not serializable (class: 
> org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1, value: 
> ---------------------------------------------------------------------------------------------
>
>                 Key: NIFI-1953
>                 URL: https://issues.apache.org/jira/browse/NIFI-1953
>             Project: Apache NiFi
>          Issue Type: Bug
>            Reporter: pradeep arumalla
>
>  
> When I use StorageLevel.MEMORY_AND_DISK(), Apache Spark throws the exception below. I cannot always use MEMORY_ONLY(): when too much data comes in, I get "Could not compute split, block input-0-1464774108087 not found" exceptions with StorageLevel.MEMORY_ONLY().
> JavaReceiverInputDStream packetStream =
>         ssc.receiverStream(new NiFiReceiver(config, StorageLevel.MEMORY_AND_DISK()));
> I get the exception below:
> 16/06/01 12:50:28 ERROR scheduler.ReceiverTracker: Deregistered receiver for stream 0: Restarting receiver with delay 2000ms: Failed to receive data from NiFi - java.io.NotSerializableException: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1
> Serialization stack:
>         - object not serializable (class: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1, value: org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable$1@70fb979)
>         at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
>         at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
>         at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:153)
>         at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1189)
>         at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1198)
>         at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:131)
>         at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:168)
>         at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:142)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:790)
>         at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:637)
>         at org.apache.spark.streaming.receiver.BlockManagerBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:81)
>         at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:141)
>         at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushIterator(ReceiverSupervisorImpl.scala:121)
>         at org.apache.spark.streaming.receiver.Receiver.store(Receiver.scala:152)
>         at org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable.run(NiFiReceiver.java:182)
>         at java.lang.Thread.run(Thread.java:745)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
