Hi Mich,

I do not have access to the UI, as I am running jobs on a remote system
that I can only reach through PuTTY, so only the console and log files
are available to me.
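
Following Jacek's SparkListener suggestion below, here is a minimal
sketch of a listener that sums shuffle read and write bytes across tasks
and prints the totals to the driver console/log when the application
ends. It assumes Spark 2.x (in 1.x, shuffleReadMetrics is an Option);
the class name and where it gets registered are only illustrative:

import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd, SparkListenerTaskEnd}

// Accumulates shuffle bytes across all completed tasks and prints the
// totals when the application ends, so they show up in the driver log.
class ShuffleTotalsListener extends SparkListener {
  private val readBytes  = new AtomicLong(0L)
  private val writeBytes = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for failed tasks, so guard it
    Option(taskEnd.taskMetrics).foreach { m =>
      readBytes.addAndGet(m.shuffleReadMetrics.totalBytesRead)
      writeBytes.addAndGet(m.shuffleWriteMetrics.bytesWritten)
    }
  }

  override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit = {
    println(s"Total shuffle read:  ${readBytes.get()} bytes")
    println(s"Total shuffle write: ${writeBytes.get()} bytes")
  }
}

// Register on the driver before running any jobs, e.g.:
// sc.addSparkListener(new ShuffleTotalsListener)

The same listener can also be wired in without touching the application
code through the spark.extraListeners configuration property, provided
the class is on the driver classpath.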

Thanks

On Mon, Sep 19, 2016 at 11:36 AM, Mich Talebzadeh <mich.talebza...@gmail.com
> wrote:

> The Spark UI runs on port 4040 by default.
>
> HTH
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On 19 September 2016 at 10:34, Cristina Rozee <rozee.crist...@gmail.com>
> wrote:
>
>> Could you please explain a little bit?
>>
>>
>> On Sun, Sep 18, 2016 at 10:19 PM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>>
>>> SparkListener perhaps?
>>>
>>> Jacek
>>>
>>> On 15 Sep 2016 1:41 p.m., "Cristina Rozee" <rozee.crist...@gmail.com>
>>> wrote:
>>>
>>>> Hello,
>>>>
>>>> I am running a Spark application and I would like to know the total
>>>> amount of shuffle data (read + write). Could anyone let me know how
>>>> to get this information?
>>>>
>>>> Thank you
>>>> Cristina.
>>>>
>>>
>>
>
