You can log in to the machine where the worker is running and run the *jps* command
to get the process id (pid), then run *ps aux | grep <pid>* to see
which user it is running as. Usually it is the same user account that
started the workers.
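The check above can be sketched as follows. For portability this demo inspects the current shell's own pid via $$; in practice you would substitute the pid that *jps* reports for the worker JVM (on YARN the executor processes appear under different class names than a standalone Worker):

```shell
# Ask ps which user owns a given pid. Here we use the current shell's
# pid ($$) as a stand-in for the pid reported by jps.
pid=$$
user=$(ps -o user= -p "$pid")
echo "pid $pid is running as user $user"
```

The `-o user= -p` form prints only the owning user for that single pid, which is less noisy than grepping the full `ps aux` output.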

Thanks
Best Regards

On Sat, Oct 25, 2014 at 12:38 AM, <jan.zi...@centrum.cz> wrote:

> Hi,
>
> I would like to ask: which user does the Spark program run as on the slaves?
> My Spark is running on top of YARN.
>
> The reason I am asking is that I need to download data for the NLTK
> library; these data are downloaded for a specific Python user, and I am
> currently struggling with this.
> http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-problem-with-textblob-from-NLTK-used-in-map-td17211.html
>
> Thank you in advance for any ideas.
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
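As a side note on the quoted NLTK issue: one common workaround is to install the NLTK data into a directory every user can read and point NLTK at it via the NLTK_DATA environment variable, so it no longer matters which user the executors run as. The path below is only an example; adjust it for your cluster:

```shell
# Sketch: point NLTK at a shared, world-readable data directory.
# The data itself would be installed once, e.g. with:
#   sudo python -m nltk.downloader -d /usr/share/nltk_data punkt
export NLTK_DATA=/usr/share/nltk_data
echo "NLTK will search: $NLTK_DATA"
```

Setting NLTK_DATA in the environment of the YARN containers (e.g. via spark.executorEnv.NLTK_DATA) is one way to propagate it to the executors.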
