Re: [E] How to stop streaming before the application gets killed

2017-12-22 Thread Rastogi, Pankaj
You can add a shutdown hook to your JVM and request the Spark StreamingContext to stop gracefully. /** * Shutdown hook to shut down the JVM gracefully * @param ssCtx */ def addShutdownHook(ssCtx: StreamingContext) = { Runtime.getRuntime.addShutdownHook( new Thread() { override
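
A complete version of the hook, as a minimal sketch (the archive truncates the snippet at "override"; the stop() arguments are an assumption inferred from the graceful-stop behavior described above):

    import org.apache.spark.streaming.StreamingContext

    /**
     * Shutdown hook to stop the StreamingContext (and the JVM) gracefully.
     * @param ssCtx the running StreamingContext
     */
    def addShutdownHook(ssCtx: StreamingContext): Unit = {
      Runtime.getRuntime.addShutdownHook(
        new Thread() {
          override def run(): Unit = {
            // Let already-received batches finish before the process exits.
            ssCtx.stop(stopSparkContext = true, stopGracefully = true)
          }
        }
      )
    }

With stopGracefully = true, the context waits for the processing of all received data to complete before shutting down, rather than dropping in-flight batches.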

Re: [E] Re: Spark Job is stuck at SUBMITTED when setting Driver Memory > Executor Memory

2017-06-12 Thread Rastogi, Pankaj
Please make sure that there is enough free memory available on the driver node; if there is not, your application won't start. Pankaj
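
For reference, a minimal sketch of making the driver's memory requirement explicit at launch time, using the SparkLauncher API (the master URL, main class, jar path, and memory sizes are placeholder assumptions):

    import org.apache.spark.launcher.SparkLauncher

    // spark.driver.memory must fit in the free memory of the node that
    // will host the driver; if no node can satisfy it, the application
    // can sit in SUBMITTED and never start, as described above.
    val app = new SparkLauncher()
      .setMaster("spark://master:7077")   // placeholder standalone master
      .setDeployMode("cluster")
      .setAppResource("/path/to/myapp.jar")
      .setMainClass("com.example.MyApp")
      .setConf(SparkLauncher.DRIVER_MEMORY, "4g")
      .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
      .launch()
    app.waitFor()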

SPARK-19547

2017-06-07 Thread Rastogi, Pankaj
Hi, I have been trying to distribute Kafka topics among different instances of the same consumer group. I am using the KafkaDirectStream API to create DStreams. After the second consumer instance comes up, Kafka rebalances the partitions and then the Spark driver of the first consumer dies with the
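
One commonly suggested way around this is to give each streaming application its own group.id rather than sharing one across drivers. A minimal sketch of creating the direct stream that way, following the spark-streaming-kafka-0-10 pattern (the broker address, group name, and topics are placeholder assumptions):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.StreamingContext
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    def createKafkaStream(ssc: StreamingContext, topics: Seq[String]) = {
      val kafkaParams = Map[String, Object](
        "bootstrap.servers"  -> "broker1:9092",          // placeholder
        "key.deserializer"   -> classOf[StringDeserializer],
        "value.deserializer" -> classOf[StringDeserializer],
        // A group.id unique to this application avoids the external
        // rebalance triggered when a second application joins the group.
        "group.id"           -> "my-app-group-1",
        "auto.offset.reset"  -> "latest",
        "enable.auto.commit" -> (false: java.lang.Boolean)
      )
      KafkaUtils.createDirectStream[String, String](
        ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
    }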