Re: Spark Streaming app starts processing only when the app is killed
Hey Hareesh,

Thanks for the help; the executors were indeed starving. I increased the cores and memory on that machine, and now it is working fine. Thanks again.

On Tue, May 3, 2016 at 12:57 PM, Shams ul Haque wrote:
> No, I made a cluster of two machines, and after submission to the master the app moves to the slave machine for execution. I am going to try your suggestion of running both on the same machine.
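For reference, the executor resources can be raised at submission time on a standalone cluster. A minimal sketch, in which the master URL, class name, and jar name are placeholders, not the actual values from this thread:

```shell
# Hypothetical submission; adjust the master URL, class, and jar to your setup.
# Give each executor at least 2 cores so the streaming receiver does not
# starve the batch-processing tasks, plus enough memory for the batches.
spark-submit \
  --master spark://master-host:7077 \
  --executor-cores 2 \
  --executor-memory 2g \
  --class com.example.StreamingApp \
  streaming-app.jar
```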
Re: Spark Streaming app starts processing only when the app is killed
No, I made a cluster of two machines, and after submission to the master the app moves to the slave machine for execution. I am going to try your suggestion of running both on the same machine.

Thanks,
Shams

On Tue, May 3, 2016 at 12:53 PM, hareesh makam wrote:
> If you are running your master on a single core, it might be an issue of starvation. Assuming you are running it locally, try setting the master to local[2] or higher. Check the first example at https://spark.apache.org/docs/latest/streaming-programming-guide.html
Re: Spark Streaming app starts processing only when the app is killed
If you are running your master on a single core, it might be an issue of starvation. Assuming you are running it locally, try setting the master to local[2] or higher.

Check the first example at https://spark.apache.org/docs/latest/streaming-programming-guide.html

- Hareesh

On 3 May 2016 at 12:35, Shams ul Haque wrote:
> Hi all, I am facing a strange issue when running a Spark Streaming app. When I submit the app with spark-submit it works fine and is visible in the Spark UI, but it doesn't process any data coming from Kafka. When I kill the app by pressing Ctrl+C in the terminal, it starts processing all the data received from Kafka and then shuts down.
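Concretely, local[n] is part of the master URL, so the fix needs no code change; a sketch with a placeholder class and jar name. With local[1], the single available thread is occupied entirely by the streaming receiver, leaving none to process the received batches, which matches the symptom of data only flowing once the receiver stops.

```shell
# Hypothetical submission; the class and jar names are placeholders.
# local[2] (or higher) leaves at least one thread for batch processing
# in addition to the one the receiver occupies.
spark-submit \
  --master "local[2]" \
  --class com.example.StreamingApp \
  streaming-app.jar
```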
Spark Streaming app starts processing only when the app is killed
Hi all,

I am facing a strange issue when running a Spark Streaming app. When I submit the app with *spark-submit*, it works fine and is visible in the Spark UI, but it doesn't process any data coming from Kafka. When I kill the app by pressing Ctrl+C in the terminal, it starts processing all the data received from Kafka and then shuts down.

I am trying to figure out why this is happening. Please help me if you know anything.

Thanks and regards,
Shams ul Haque