The Definitive Guide
Chapter 18:
Monitoring and Debugging

"This chapter covers the key details you need to monitor and debug your
Spark Applications.  To do this , we will walk through the spark UI with an
example query designed to help you understand how to trace your  own jobs
through the executions life cycle. The example we'll look at will also help
you to understand  how to debug your jobs and where errors are likely to
occur."

On Tue, 7 Apr 2020, 18:28 Pat Ferrel, <p...@occamsmachete.com> wrote:

> IntelliJ Scala works well when debugging with master=local. Has anyone used
> it for remote/cluster debugging? I’ve heard it is possible...
>
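> One approach I have heard of (a sketch, not something I have verified on a
> real cluster) is to start the driver JVM with a JDWP agent and attach
> IntelliJ's "Remote JVM Debug" run configuration to that port, e.g.:
>
>   spark-submit \
>     --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
>     --class com.example.MyApp myapp.jar
>
> The class and jar names above are placeholders. The same JDWP options can go
> into spark.executor.extraJavaOptions to step into executor code, though
> attaching to many executors at once is awkward.
>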
>
> From: Luiz Camargo <camar...@gmail.com>
> Reply: Luiz Camargo <camar...@gmail.com>
> Date: April 7, 2020 at 10:26:35 AM
> To: Dennis Suhari <d.suh...@icloud.com.invalid>
> Cc: yeikel valdes <em...@yeikel.com>, zahidr1...@gmail.com,
> user@spark.apache.org
> Subject: Re: IDE suitable for Spark
>
> I have used IntelliJ for Spark/Scala with the sbt tool.
>
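> For anyone trying the same setup, a minimal build.sbt along these lines is
> enough for IntelliJ's sbt import (the version numbers are just examples):
>
>   // build.sbt -- Spark marked "provided" so spark-submit supplies it at runtime
>   scalaVersion := "2.12.15"
>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1" % "provided"
>
> To run the main class directly inside IntelliJ, either drop the "provided"
> qualifier or tick "Include dependencies with Provided scope" in the run
> configuration.
>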
> On Tue, Apr 7, 2020 at 1:18 PM Dennis Suhari <d.suh...@icloud.com.invalid>
> wrote:
>
>> We are using PyCharm and RStudio, respectively, with Spark libraries to
>> submit Spark jobs.
>>
>> Sent from my iPhone
>>
>> Am 07.04.2020 um 18:10 schrieb yeikel valdes <em...@yeikel.com>:
>>
>>
>> Zeppelin is not an IDE but a notebook. It is helpful for experimenting, but
>> it is missing a lot of the features that we expect from an IDE.
>>
>> Thanks for sharing though.
>>
>> ---- On Tue, 07 Apr 2020 04:45:33 -0400 zahidr1...@gmail.com wrote ----
>>
>> When I first logged on I asked if there was a suitable IDE for Spark.
>> I did get a couple of responses.
>> *Thanks.*
>>
>> I did actually find one that is a suitable IDE for Spark:
>> *Apache Zeppelin.*
>>
>> One of many reasons it is suitable for Apache Spark is the *up and running*
>> stage: type *bin/zeppelin-daemon.sh start*, go to a browser, and open
>> *http://localhost:8080*.
>> That's it!
>>
>> Then, to *hit the ground running*, there are also ready-to-go Apache Spark
>> examples showing off the type of functionality one will be using in
>> real-life production.
>>
>> Zeppelin comes with embedded Apache Spark and Scala as the default
>> interpreter, alongside 20+ other interpreters.
>> I have gone on to discover a number of other advantages for a real-time
>> production environment with Zeppelin, offered up by other Apache products.
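>>
>> As a quick illustration, a first note paragraph using the built-in Spark
>> interpreter might look like this (a minimal sketch; the data is made up):
>>
>>   %spark
>>   // 'spark' and 'sc' are pre-defined by Zeppelin's Spark interpreter
>>   val nums = spark.range(0, 1000)
>>   nums.selectExpr("sum(id) as total").show()
>>
>> Running the paragraph executes it on the embedded Spark, and the result
>> prints directly under the paragraph.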
>>
>> Backbutton.co.uk
>> ¯\_(ツ)_/¯
>> ♡۶Java♡۶RMI ♡۶
>> Make Use Method {MUM}
>> makeuse.org
>> <http://www.backbutton.co.uk>
>>
>>
>>
>
> --
>
>
> Prof. Luiz Camargo
> Educator - Computing
>
>
>
