Re: Spark Streaming Job completed without executing next batches

2017-11-16 Thread KhajaAsmath Mohammed
Here is the screenshot. The status shows Finished, but it should still be
Running so that the next batch can pick up the data.


[image: Inline image 1]

On Thu, Nov 16, 2017 at 10:01 PM, KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:

> Hi,
>
> I have scheduled a Spark Streaming job to run every 30 minutes. It was
> running fine for 32 hours, and then I suddenly saw a status of Finished
> instead of Running (it always runs in the background and shows up in the
> Resource Manager).
>
> Am I doing anything wrong here? How come the job finished without picking
> up the next batch from Kafka?
>
> I run it using the below command on the Cloudera cluster.
>
> spark2-submit --class com.telematics.datascience.drivers.OCCDataPointDriver
> --master yarn --queue hadvaoccx_dse_pool --principal va_d...@ad.nav.com
> --keytab ./va_dflt.keytab  Telematics.jar -c /home/yyy1k78/occtelematics/
> application-datapoint-hdfs-dyn.properties &
>
> Thanks,
> Asmath
>


Spark Streaming Job completed without executing next batches

2017-11-16 Thread KhajaAsmath Mohammed
Hi,

I have scheduled a Spark Streaming job to run every 30 minutes. It was
running fine for 32 hours, and then I suddenly saw a status of Finished
instead of Running (it always runs in the background and shows up in the
Resource Manager).

Am I doing anything wrong here? How come the job finished without picking
up the next batch from Kafka?
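
For reference, a long-running streaming driver normally blocks on
awaitTermination() so that it keeps scheduling batches; if the main method
returns (or the StreamingContext is stopped by an error), YARN reports the
application as Finished even though more Kafka data is waiting. Below is only
a minimal sketch of that pattern, not the actual OCCDataPointDriver code, and
the broker, topic and group.id values are made-up placeholders:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object StreamingDriverSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("occ-datapoint-sketch")
    // 30-minute batch interval, matching the schedule described above
    val ssc = new StreamingContext(conf, Minutes(30))

    // Placeholder Kafka settings -- replace with the real brokers and topic
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "occ-telematics",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("datapoints"), kafkaParams)
    )

    stream.foreachRDD { rdd =>
      // per-batch processing goes here
    }

    ssc.start()
    // Block here; if main() falls through instead, the application finishes
    ssc.awaitTermination()
  }
}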

I run it using the below command on the Cloudera cluster.

spark2-submit --class com.telematics.datascience.drivers.OCCDataPointDriver
--master yarn --queue hadvaoccx_dse_pool --principal va_d...@ad.nav.com
--keytab ./va_dflt.keytab  Telematics.jar -c
/home/yyy1k78/occtelematics/application-datapoint-hdfs-dyn.properties &
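
One thing that might matter here (an assumption on my part, since the command
above does not set a deploy mode): on YARN, spark2-submit defaults to client
deploy mode, so the driver runs inside the shell session that was backgrounded
with &. If that session goes away at some point, the application can end up
Finished even though the streaming batches should keep running. A sketch of
the same submit kept alive with nohup and cluster mode (the log file name is
just an example, and in cluster mode the -c properties file must be readable
from the driver node, e.g. shipped with --files as shown):

nohup spark2-submit --class com.telematics.datascience.drivers.OCCDataPointDriver \
  --master yarn --deploy-mode cluster \
  --queue hadvaoccx_dse_pool \
  --principal va_d...@ad.nav.com --keytab ./va_dflt.keytab \
  --files /home/yyy1k78/occtelematics/application-datapoint-hdfs-dyn.properties \
  Telematics.jar -c application-datapoint-hdfs-dyn.properties \
  > occ-submit.log 2>&1 &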

Thanks,
Asmath