Getting spark application driver ID programmatically

2015-10-01 Thread Snehal Nagmote
Hi, I have a use case where we need to automate start/stop of a Spark Streaming application. To stop the Spark job, we need the driver/application ID of the job. For example: /app/spark-master/bin/spark-class org.apache.spark.deploy.Client kill spark://10.65.169.242:7077 $driver_id . I am thinking to get
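One option worth noting: since Spark 1.2 the application ID is exposed on the SparkContext itself, so the job can record its own ID for an external stop script. (A sketch, not this thread's resolution; note that in standalone cluster mode the `driver-...` ID expected by `spark-class org.apache.spark.deploy.Client kill` is a separate identifier, printed by `spark-submit --deploy-mode cluster` at submission time, so the submitting script would capture it from that output instead.)

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AppIdExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("app-id-example"))
    // On standalone this looks like app-20151001120000-0001. Logging it (or
    // writing it to a well-known file) lets an external script find the app.
    println(s"applicationId = ${sc.applicationId}")
    sc.stop()
  }
}
```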

Re: Understanding Batch Processing Time

2015-09-02 Thread Snehal Nagmote
you jstack into the driver and see what the process is doing after job 352? > > Also to confirm, the system is stuck after job 352 finishes, and before > job 353 starts (shows up in the UI), right? > > TD > > On Wed, Sep 2, 2015 at 12:55 PM, Snehal Nagmote > wrote: > >

Understanding Batch Processing Time

2015-09-02 Thread Snehal Nagmote
Hi All, I have a Spark job where I read data from Kafka at a 5-second interval and query Cassandra based on the Kafka data using the Spark Cassandra Connector. I am using Spark 1.4. Often the batch gets stuck in processing after job ID 352. Spark takes a long time to spawn job 353, where it reads from Ca
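The setup described above can be sketched roughly as follows (broker, topic, keyspace, and table names are placeholders, and this assumes Spark 1.4 with the matching spark-streaming-kafka and spark-cassandra-connector artifacts on the classpath):

```scala
import com.datastax.spark.connector._
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaCassandraLookup {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("kafka-cassandra-lookup")
      .set("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
    val ssc = new StreamingContext(conf, Seconds(5)) // 5-second batch interval

    // Direct (receiver-less) Kafka stream; topic and broker are placeholders.
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, Map("metadata.broker.list" -> "localhost:9092"), Set("events"))

    stream.foreachRDD { rdd =>
      // joinWithCassandraTable pushes per-key lookups down to Cassandra
      // instead of scanning the whole table every batch.
      val joined = rdd.map { case (_, key) => Tuple1(key) }
        .joinWithCassandraTable("my_keyspace", "my_table")
      joined.foreach(println)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

When batches stall like this, comparing the per-batch scheduling delay against the Cassandra query time (visible per stage in the UI) usually narrows down whether the connector or the scheduler is at fault.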

Re: [Kafka-Spark-Consumer] Spark-Streaming Job Fails due to Futures timed out

2015-06-09 Thread Snehal Nagmote
locks to >> BlockManager and Spark is not able to recover from this failure, but the Receiver >> keeps trying .. >> >> Which version of Spark are you using ? >> >> Dibyendu >> On Jun 9, 2015 5:14 AM, "Snehal Nagmote" >> wrote: >> >>> All

[Kafka-Spark-Consumer] Spark-Streaming Job Fails due to Futures timed out

2015-06-08 Thread Snehal Nagmote
All, I am using the Kafka Spark Consumer https://github.com/dibbhatt/kafka-spark-consumer in a Spark Streaming job. After the job runs for a few hours, all executors exit, yet I still see the status of the application on the Spark UI as running. Does anyone know the cause of this exception and how to fix

Registering Custom metrics [Spark-Streaming-monitoring]

2015-05-28 Thread Snehal Nagmote
Hello All, I am using Spark Streaming 1.3. I want to capture a few custom metrics based on accumulators; I followed an approach somewhat similar to this: val instrumentation = new SparkInstrumentation("example.metrics") * val numReqs = sc.accumulator(0L) * instrumentation.source.registerDailyAcc
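SparkInstrumentation is not a Spark class; the general pattern behind it is a custom Dropwizard metrics Source that exposes the accumulator's value as a gauge. A minimal sketch (assuming Spark 1.x, where the Source trait is package-private, so the class typically has to live under an org.apache.spark package; registration goes through SparkEnv's metrics system, which is also internal API):

```scala
package org.apache.spark.metrics.source // Source is package-private in 1.x

import com.codahale.metrics.{Gauge, MetricRegistry}
import org.apache.spark.Accumulator

// Exposes an accumulator's current value as a Dropwizard gauge so that any
// configured metrics sink (Graphite, JMX, ...) can report it.
class AccumulatorSource(name: String, acc: Accumulator[Long]) extends Source {
  override val sourceName: String = name
  override val metricRegistry: MetricRegistry = new MetricRegistry
  metricRegistry.register(MetricRegistry.name("value"), new Gauge[Long] {
    override def getValue: Long = acc.value
  })
}
```

Registration would then look like `SparkEnv.get.metricsSystem.registerSource(new AccumulatorSource("example.metrics", numReqs))` on the driver; since these are internal APIs, they can change between minor versions.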

Accumulators in Spark Streaming on UI

2015-05-26 Thread Snehal Nagmote
Hello all, I have an accumulator in a Spark Streaming application which counts the number of events received from Kafka. From the documentation, it seems the Spark UI has support to display it, but I am unable to see it on the UI. I am using Spark 1.3.1. Do I need to call any method (print) or am I missing
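One detail that commonly explains this: in Spark 1.x, only accumulators created with a name are shown in the web UI (on the stage detail page); unnamed accumulators are tracked but never displayed. A minimal sketch (`stream` is a placeholder for the Kafka DStream):

```scala
// Pass a name as the second argument so the accumulator appears in the UI.
val kafkaEvents = sc.accumulator(0L, "Kafka events received")

stream.foreachRDD { rdd =>
  kafkaEvents += rdd.count()
}
```

No extra method call is needed beyond updating the accumulator inside an action; the named value then shows up per stage in the UI.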

Spark Application Dependency Issue

2015-05-20 Thread Snehal Nagmote
Hi All, I am on Spark 1.1 with DataStax DSE. The application is Spark Streaming and has Couchbase dependencies which use http-core 4.3.2. While running the application I get this error: NoSuchMethodError: org.apache.http.protocol.RequestUserAgent.<init>(Ljava/lang/String;)V at com
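A NoSuchMethodError on an httpcore constructor usually means an older httpcore on the Spark/DSE classpath is shadowing the 4.3.2 version the Couchbase client was compiled against. One workaround (a sketch, assuming the job is built as a fat jar with sbt-assembly 0.14+, which supports class shading) is to relocate the conflicting packages inside the assembly so they cannot collide with the cluster's copy:

```scala
// build.sbt fragment (sbt-assembly plugin assumed); the shaded prefix is
// arbitrary, and the library coordinates shown are illustrative.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.apache.http.**" -> "shaded.org.apache.http.@1").inAll
)
```

An alternative on some 1.x versions is the experimental `spark.executor.userClassPathFirst` setting, which prefers the application jar's classes over Spark's, though it can introduce other conflicts.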