If you are able to log onto the node where the UI has been launched, then
try `ps aux | grep HistoryServer`; the first column of the output should be
the user.
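
For example, a slightly narrower variant (the bracketed first letter is just
a common trick to keep grep from matching its own process line; the user you
see will depend on how the History Server was started on your cluster):

  ps -eo user,pid,cmd | grep '[H]istoryServer'
  # the first column is the user the History Server (and therefore its UI) runs as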

On Wed, Aug 15, 2018 at 10:26 PM Fawze Abujaber <fawz...@gmail.com> wrote:

> Thanks Manu. Do you know how I can see which user the UI is running as?
> I'm using Cloudera Manager and I created a user for Cloudera Manager
> called spark, but this didn't solve my issue, so here I'm trying to find
> out the user for the Spark History UI.
>
> On Wed, Aug 15, 2018 at 5:11 PM Manu Zhang <owenzhang1...@gmail.com>
> wrote:
>
>> Hi Fawze,
>>
>> A) The file permission is currently hard coded to 770 (
>> https://github.com/apache/spark/blob/branch-2.3/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L287
>> ).
>> B) I think adding all users (including the UI user) to the group, e.g.
>> spark, will do.
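>>
>> A minimal sketch of option B, assuming the UI / History Server user turns
>> out to be cloudera-scm (it may differ on your cluster) and that HDFS uses
>> the default shell-based group mapping on the NameNode rather than LDAP:
>>
>>   # "cloudera-scm" is an assumed user name; substitute whatever `ps`
>>   # reports for the History Server process on your cluster.
>>   sudo usermod -a -G spark cloudera-scm   # run on the NameNode, where HDFS resolves groups
>>   hdfs groups cloudera-scm                # confirm the group HDFS now sees for that user
>>   # then restart the History Server so the process picks up the new group membership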
>>
>>
>> On Wed, Aug 15, 2018 at 6:38 PM Fawze Abujaber <fawz...@gmail.com> wrote:
>>
>>> Hi Manu,
>>>
>>> Thanks for your response.
>>>
>>> Yes, I see, but I'm still interested to know how I can see these
>>> applications from the Spark History UI.
>>>
>>> How can I know which user I'm logged in as when I'm navigating the
>>> Spark History UI?
>>>
>>> The Spark process is running as cloudera-scm, and the events written to
>>> the spark2history folder on HDFS are owned by the user who ran the
>>> application, with group spark and 770 permissions.
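>>>
>>> A quick way to confirm that (the path below is only a placeholder for
>>> whatever spark.eventLog.dir points to on your cluster):
>>>
>>>   hdfs dfs -ls /user/spark/spark2history   # placeholder path, adjust to your event log dir
>>>   # entries should look roughly like: -rwxrwx---  <submitting user>  spark  ...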
>>>
>>> I'm interested to see whether I can force these logs to be written with
>>> 774 or 775 permissions, or to find another solution that enables R&D or
>>> anyone else to investigate their application logs using the UI.
>>>
>>> For example: can I use a Spark conf such as spark.eventLog.permissions=755?
>>>
>>> The two options I see here:
>>>
>>> A) Find a way to force these logs to be written with other permissions.
>>>
>>> B) Find the user that the UI is running as, and create LDAP groups and a
>>> user that can handle this.
>>>
>>> For example: create a group called spark, create the user that the UI is
>>> running as, and add this user to the spark group. I'm not sure whether
>>> this option will work, as I don't know if these steps authenticate
>>> against LDAP.
>>>
>>
>
> --
> Take Care
> Fawze Abujaber
>
