Hi Swetha,

Would you mind elaborating on your usage scenario for unpersisting a DStream?

From my understanding:

1. Spark Streaming will automatically unpersist outdated data (you already
mentioned the relevant configurations).
2. Once the streaming job has started, you effectively lose control of it:
when would you call unpersist, and how would you call it (from another
thread)? The closest manual route I can think of is sketched below.
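
Not from the original thread, just a minimal sketch of that manual route,
assuming a socket source on localhost:9999: a DStream exposes persist() and
cache() but no unpersist(), so the underlying RDD of each batch is reached
via foreachRDD and unpersisted there once the batch's work is done.

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

object UnpersistSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("unpersist-sketch")
      .setMaster("local[2]")  // local testing only; receiver + processing need 2 threads
      // Let Spark Streaming clean up old blocks/RDDs automatically as well.
      .set("spark.streaming.unpersist", "true")

    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical source; any DStream behaves the same way.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Persist the DStream as usual.
    lines.persist(StorageLevel.MEMORY_ONLY)

    lines.foreachRDD { rdd =>
      // Use the cached RDD for this batch...
      println(s"batch size: ${rdd.count()}")
      // ...then release it explicitly instead of waiting for automatic cleanup.
      rdd.unpersist(blocking = false)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

As far as I know, spark.streaming.unpersist=true already drops a batch's
cached RDDs once they fall out of the remember window, so the explicit call
only matters if you want the memory back earlier.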

Thanks
Saisai


On Thu, Nov 5, 2015 at 3:13 PM, swetha kasireddy <swethakasire...@gmail.com>
wrote:

> Other than setting the following:
>
> sparkConf.set("spark.streaming.unpersist", "true")
> sparkConf.set("spark.cleaner.ttl", "7200s")
>
>
> On Wed, Nov 4, 2015 at 5:03 PM, swetha <swethakasire...@gmail.com> wrote:
>
>> Hi,
>>
>> How do I unpersist a DStream in Spark Streaming? I know that we can
>> persist using dStream.persist() or dStream.cache(), but I don't see any
>> method to unpersist.
>>
>> Thanks,
>> Swetha
>>
>>
>>
>
