Also check the Spark UI Streaming tab for various helpful stats. By default
the UI runs on port 4040, but you can change it by setting    --conf "spark.ui.port=nnnn"
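As a minimal, illustrative Scala sketch (the app name and port value here are made up,
not taken from the original job), this is how the UI port and a 60-second batch interval
would typically be set:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("streaming-ui-example")   // hypothetical app name
  .set("spark.ui.port", "4041")         // same effect as --conf "spark.ui.port=4041"

// 60-second batch interval, as in Q1 below. If a batch regularly takes longer
// than 60 sec to process, batches queue up and the scheduling delay (visible in
// the Streaming tab) keeps growing, which is the backlog Mohammed describes.
val ssc = new StreamingContext(conf, Seconds(60))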

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 4 August 2016 at 23:48, Mohammed Guller <moham...@glassbeam.com> wrote:

> The backlog will increase as time passes and eventually you will run out
> of memory.
>
>
>
> Mohammed
>
> Author: Big Data Analytics with Spark
> <http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
>
>
>
> From: Saurav Sinha [mailto:sauravsinh...@gmail.com]
> Sent: Wednesday, August 3, 2016 11:57 PM
> To: user
> Subject: Explanation regarding Spark Streaming
>
>
>
> Hi,
>
>
>
> I have a query.
>
>
>
> Q1. What will happen if a Spark Streaming job has a batch duration of 60
> sec and the processing time of the complete pipeline is greater than 60 sec?
>
>
>
> --
>
> Thanks and Regards,
>
>
>
> Saurav Sinha
>
>
>
> Contact: 9742879062
>
