#1 See https://spark.apache.org/docs/latest/streaming-programming-guide.html#level-of-parallelism-in-data-receiving

#2 By default, all input data and persisted RDDs generated by DStream transformations are automatically cleared. Spark Streaming decides when to clear the data based on the transformations that are used. See https://spark.apache.org/docs/latest/streaming-programming-guide.html#memory-tuning

Hope this helps.

On 25 July 2015 at 13:43, anshu shukla <anshushuk...@gmail.com> wrote:
> 1 - How to increase the level of *parallelism in a Spark Streaming custom
> RECEIVER*?
>
> 2 - Will ssc.receiverStream(/* anything */) *delete the data
> stored in Spark memory by the store() * logic?
>
> --
> Thanks & Regards,
> Anshu Shukla
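For question #1, the linked guide's documented approach is to create several input DStreams (each backed by its own receiver instance, which Spark schedules on different executors) and union them. A minimal Scala sketch, assuming a user-defined Receiver subclass called MyCustomReceiver (hypothetical name) that takes a host and port:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.DStream

val conf = new SparkConf().setAppName("multi-receiver-example")
val ssc = new StreamingContext(conf, Seconds(1))

// Raise receiving parallelism: start several receivers, one per input
// DStream, then union them into a single DStream for downstream processing.
val numReceivers = 5
val streams: Seq[DStream[String]] =
  (1 to numReceivers).map(_ => ssc.receiverStream(new MyCustomReceiver(host, port)))
val unified: DStream[String] = ssc.union(streams)

// Optionally repartition before expensive transformations so that
// processing (not just receiving) uses more cores.
val repartitioned = unified.repartition(2 * numReceivers)
```

Note that each receiver occupies one core on an executor, so the cluster needs more cores than receivers or no processing will happen.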