Thanks Haosdent!
Tim
On Mon, Sep 14, 2015 at 1:29 AM, SLiZn Liu wrote:
> I found the --no-switch_user flag in mesos slave configuration. Will give
> it a try. Thanks Tim, and haosdent !
At the same time, make sure SPARK_USER is a real user that exists on the slave
before executing your Spark program.
2015-09-14 16:29 GMT+08:00 SLiZn Liu :
> I found the --no-switch_user flag in mesos slave configuration. Will give
> it a try. Thanks Tim, and haosdent !
No, we set up a specific user to start Mesos; it isn't root.
On Mon, Sep 14, 2015 at 1:05 PM haosdent wrote:
> Do you start your mesos cluster with root?
>
Actually --proxy-user is more about which user you're impersonating to run
the driver, not the user that is going to be passed to Mesos to run as.
The way to use a particular user when running a Spark job is to set the
SPARK_USER environment variable, and that user will be passed to Mesos.
Thanks Tommy, did you mean adding the proxy user like this:
spark-submit --proxy-user ...
where the argument is the user who started Mesos?
And is this parameter documented anywhere?
On Mon, Sep 14, 2015 at 1:34 PM tommy xiao wrote:
> @SLiZn Liu yes, you need to add the proxy_user parameter
> turn off --switch-user flag in the Mesos slave
--no-switch_user :-)
On Mon, Sep 14, 2015 at 4:03 PM, Tim Chen wrote:
> Actually --proxy-user is more about which user you're impersonating to run
> the driver, not the user that is going to be passed to Mesos to run as.
>
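For reference, a sketch of what a slave launched with that flag might look like; with --no-switch_user the slave does not switch to the framework user, so executors run as whatever user started the slave. The ZooKeeper address is a placeholder:

```shell
# Sketch: slave invocation with user switching disabled (command echoed,
# not executed here; the master address is a placeholder).
slave_cmd="mesos-slave --master=zk://zk-host:2181/mesos --no-switch_user"
echo "$slave_cmd"
```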
I found the --no-switch_user flag in the Mesos slave configuration. Will give
it a try. Thanks, Tim and haosdent!
On Mon, Sep 14, 2015 at 4:15 PM haosdent wrote:
> > turn off --switch-user flag in the Mesos slave
> --no-switch_user :-)
Do you start your mesos cluster with root?
On Mon, Sep 14, 2015 at 12:10 PM, SLiZn Liu wrote:
> Hi Mesos Users,
>
> I’m trying to run Spark jobs on my Mesos cluster. However I discovered
> that my Spark job must be submitted by the same user who started Mesos,
>
Hi Mesos Users,
I’m trying to run Spark jobs on my Mesos cluster. However, I discovered that
my Spark job must be submitted by the same user who started Mesos;
otherwise an ExecutorLostFailure is raised and the job won’t be executed.
Is there any way that every user can share the same Mesos cluster in
@SLiZn Liu yes, you need to add the proxy_user parameter, and your cluster
should have the proxy_user in /etc/passwd on every node.
2015-09-14 13:05 GMT+08:00 haosdent :
> Do you start your mesos cluster with root?
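A quick sketch of the kind of check being described here: verify that the user actually exists on a node (e.g. via `id`) before relying on it. 'alice' is a placeholder user name:

```shell
# Sketch: check that a candidate user exists on this node before using it
# as the proxy/run-as user ('alice' is a placeholder).
candidate=alice
if id -u "$candidate" >/dev/null 2>&1; then
  status="exists"
else
  status="missing"
fi
echo "user $candidate: $status"
```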