Re: IDE suitable for Spark : Monitoring & Debugging Spark Jobs

2020-04-07 Thread Som Lima
The Definitive Guide, Chapter 18: Monitoring and Debugging. "This chapter covers the key details you need to monitor and debug your Spark Applications. To do this, we will walk through the Spark UI with an example query designed to help you understand how to trace your own jobs through the
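The Spark UI walkthrough mentioned above is easier to follow if the event log is enabled, so finished applications can also be replayed in the history server. A minimal spark-defaults.conf sketch (the log directory path is an assumption; adjust it for your cluster):

```properties
# spark-defaults.conf -- the /tmp path below is a placeholder, not from the thread
spark.eventLog.enabled           true
spark.eventLog.dir               file:///tmp/spark-events
spark.history.fs.logDirectory    file:///tmp/spark-events
```

With these set, sbin/start-history-server.sh serves the same UI pages for completed applications, which is useful for post-mortem debugging.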

Re: Debugging Spark application

2017-02-16 Thread Md. Rezaul Karim
pts to eclipse *I think* Regards Sam On Thu, 16 Feb 2017 at 22:00, Md. Rezaul Karim <rezaul.ka...@insight-centre.org> wrote: Hi, I was looking for some URLs/documents for getting started on debugging Spark applicatio

Re: Debugging Spark application

2017-02-16 Thread Sam Elamin
nts for getting started on debugging Spark applications. I prefer developing Spark applications with Scala on Eclipse and then package the application jar before submitting. Kind regards, Reza

Debugging Spark application

2017-02-16 Thread Md. Rezaul Karim
Hi, I was looking for some URLs/documents for getting started on debugging Spark applications. I prefer developing Spark applications with Scala on Eclipse and then package the application jar before submitting. Kind regards, Reza
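One common approach for this workflow (a sketch, not confirmed by the thread) is to launch the packaged jar with the JVM debug agent and attach Eclipse via a "Remote Java Application" debug configuration. The main class, jar path, and port 5005 below are placeholders:

```
# suspend=y pauses the driver JVM until the debugger attaches on port 5005.
# com.example.MyApp and target/myapp.jar are hypothetical names.
spark-submit \
  --class com.example.MyApp \
  --master "local[*]" \
  --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
  target/myapp.jar
```

Note that this attaches to the driver only; functions shipped inside transformations run on executors, which is why local[*] is the convenient master for stepping through them.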

Re: suggest configuration for debugging spark streaming, kafka

2015-08-27 Thread Jacek Laskowski
On Wed, Aug 26, 2015 at 11:02 PM, Joanne Contact joannenetw...@gmail.com wrote: Hi, I have an Ubuntu box with 4GB memory and dual cores. Do you think it won't be enough to run Spark Streaming and Kafka? I'm trying to install Spark in standalone mode with Kafka so I can debug them in an IDE. Do I need to install

suggest configuration for debugging spark streaming, kafka

2015-08-26 Thread Joanne Contact
Hi, I have an Ubuntu box with 4GB memory and dual cores. Do you think it won't be enough to run Spark Streaming and Kafka? I'm trying to install Spark in standalone mode with Kafka so I can debug them in an IDE. Do I need to install Hadoop? Thanks! J
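For a 4 GB machine, one workaround (a sketch, not from the thread) is to cap Spark's memory in spark-defaults.conf so Spark, Kafka, and the IDE can coexist; standalone or local mode does not require Hadoop unless you read from HDFS. The figures below are assumptions to tune:

```properties
# spark-defaults.conf -- conservative placeholder figures for a 4 GB box
spark.driver.memory     512m
spark.executor.memory   1g
spark.master            local[2]
```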

Debugging Spark job in Eclipse

2015-08-05 Thread Deepesh Maheshwari
Hi, a Spark job is executed when you run the start() method of JavaStreamingContext. All the jobs like map and flatMap are already defined earlier, but even though you put breakpoints in the functions, the breakpoints don't stop there, so how can I debug the Spark jobs? JavaDStream<String>

Re: Debugging Spark job in Eclipse

2015-08-05 Thread Eugene Morozov
Deepesh, you have to call an action to start actual processing; words.count() would do the trick. On 05 Aug 2015, at 11:42, Deepesh Maheshwari <deepesh.maheshwar...@gmail.com> wrote: Hi, a Spark job is executed when you run the start() method of JavaStreamingContext. All the jobs like map,
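Eugene's point is Spark's lazy evaluation: transformations such as map and flatMap only record work, and the function bodies (where the breakpoints sit) run only when an action forces them. A minimal plain-Python analogy of that execution model (no Spark required; all names are illustrative):

```python
# Why breakpoints inside a Spark map function don't fire at map() time:
# building the pipeline runs nothing; only an "action" invokes the function.

calls = []

def tokenize(line):
    # In a real Spark job, a breakpoint here would only be hit once an
    # action such as count() runs, not when the transformation is defined.
    calls.append(line)
    return line.split()

lines = ["hello world", "debugging spark"]

# "Transformation": Python's map is lazy, like rdd.flatMap(tokenize).
words = map(tokenize, lines)
assert calls == []          # tokenize has not been invoked yet

# "Action": forcing the result finally executes the function bodies.
count = sum(len(ws) for ws in words)   # analogous to words.count()
assert count == 4
assert calls == lines       # now every line has passed through tokenize
```

The same reasoning applies to streaming: nothing inside the DStream operations runs until an output operation is registered and the context is started.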

Debugging spark java application

2014-11-19 Thread Mukesh Jha
Hello experts, Is there an easy way to debug a Spark Java application? I'm putting debug logs in the map function but there aren't any logs on the console. Also, can I include my custom jars while launching spark-shell and do my PoC there? This might be a naive question but any help here is
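A likely explanation for the missing logs (hedged; the thread preview does not confirm it) is that functions passed to map run on executors, so their log output lands in each executor's stderr, visible per-executor in the Spark UI, rather than on the driver console. As for custom jars, spark-shell accepts them via --jars; the paths below are placeholders:

```
# Ships the listed jars to the driver and executors (paths are placeholders)
spark-shell --jars /path/to/my-poc.jar,/path/to/dependency.jar
```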

Re: Debugging spark java application

2014-11-19 Thread Akhil Das
For debugging you can refer these two threads http://apache-spark-user-list.1001560.n3.nabble.com/How-do-you-hit-breakpoints-using-IntelliJ-In-functions-used-by-an-RDD-td12754.html

Debugging spark

2014-07-19 Thread Ruchir Jha
I am a newbie looking for pointers to start debugging my Spark app, and I did not find a straightforward tutorial. Any help is appreciated. Sent from my iPhone

Debugging Spark AWS S3

2014-05-16 Thread Robert James
I have Spark code which runs beautifully when MASTER=local. When I run it with MASTER set to a spark ec2 cluster, the workers seem to run, but the results, which are supposed to be put to AWS S3, don't appear on S3. I'm at a loss for how to debug this. I don't see any S3 exceptions anywhere.

Re: Debugging Spark AWS S3

2014-05-16 Thread Ian Ferreira
Did you check the executor stderr logs? On 5/16/14, 2:37 PM, Robert James <srobertja...@gmail.com> wrote: I have Spark code which runs beautifully when MASTER=local. When I run it with MASTER set to a spark ec2 cluster, the workers seem to run, but the results, which are supposed to be put to AWS
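Beyond the executor stderr logs, one thing worth ruling out (a hedged sketch, not from this thread) is that the executors, and not just the local driver, hold S3 credentials; local mode can pick them up from the driver environment while cluster workers do not. In that era the s3n filesystem read them from the Hadoop configuration. The property names below are the real s3n keys; the values are placeholders:

```xml
<!-- core-site.xml on every node; values are placeholders -->
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```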