Re: Running executors missing in Spark UI

2016-02-25 Thread Jan Štěrba
I am running Spark 1.3 on Cloudera Hadoop 5.4.
--
Jan Sterba
https://twitter.com/honzasterba | http://flickr.com/honzasterba |
http://500px.com/honzasterba


On Thu, Feb 25, 2016 at 4:22 PM, Yin Yang  wrote:
> Which Spark / hadoop release are you running ?
>
> Thanks
>
> On Thu, Feb 25, 2016 at 4:28 AM, Jan Štěrba  wrote:
>>
>> Hello,
>>
>> I have quite a weird behaviour that I can't quite wrap my head around.
>> I am running Spark on a Hadoop YARN cluster. I have Spark configured
>> to utilize all free vcores in the cluster (by setting the maximum
>> vcores per executor and the number of executors so that all vcores in
>> the cluster are used).
>>
>> Once the Oozie launcher job and the Spark application master claim
>> their containers, there should be free resources for 8 Spark
>> executors, but in the Spark UI I only see 7 active executors (there
>> should be two executors per Hadoop host). I have checked which
>> containers are running on each Hadoop node and discovered that one
>> node is indeed running more Spark containers than are reported in the
>> Spark UI.
>>
>> This behaviour is very strange to me and I have no idea what to make
>> of it or how to debug it.
>>
>> Any thoughts?
>>
>> Thanks.
>>
>> --
>> Jan Sterba
>> https://twitter.com/honzasterba | http://flickr.com/honzasterba |
>> http://500px.com/honzasterba
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Running executors missing in Spark UI

2016-02-25 Thread Yin Yang
Which Spark / hadoop release are you running ?

Thanks

On Thu, Feb 25, 2016 at 4:28 AM, Jan Štěrba  wrote:

> Hello,
>
> I have quite a weird behaviour that I can't quite wrap my head around.
> I am running Spark on a Hadoop YARN cluster. I have Spark configured
> to utilize all free vcores in the cluster (by setting the maximum
> vcores per executor and the number of executors so that all vcores in
> the cluster are used).
>
> Once the Oozie launcher job and the Spark application master claim
> their containers, there should be free resources for 8 Spark
> executors, but in the Spark UI I only see 7 active executors (there
> should be two executors per Hadoop host). I have checked which
> containers are running on each Hadoop node and discovered that one
> node is indeed running more Spark containers than are reported in the
> Spark UI.
>
> This behaviour is very strange to me and I have no idea what to make
> of it or how to debug it.
>
> Any thoughts?
>
> Thanks.
>
> --
> Jan Sterba
> https://twitter.com/honzasterba | http://flickr.com/honzasterba |
> http://500px.com/honzasterba
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Running executors missing in Spark UI

2016-02-25 Thread Jan Štěrba
Hello,

I have quite a weird behaviour that I can't quite wrap my head around.
I am running Spark on a Hadoop YARN cluster. I have Spark configured
to utilize all free vcores in the cluster (by setting the maximum
vcores per executor and the number of executors so that all vcores in
the cluster are used).
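
For illustration, that sizing corresponds roughly to submit flags like
the following (the host count, numbers, and jar name here are made up
to show the shape of the configuration, not taken from the actual job):

```shell
# Hypothetical sizing: 4 hosts with 8 free vcores in total, aiming at
# two 1-vcore executors per host. Spark 1.3 on YARN, cluster mode.
spark-submit \
  --master yarn-cluster \
  --num-executors 8 \
  --executor-cores 1 \
  --executor-memory 2g \
  my-job.jar
```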

Once the Oozie launcher job and the Spark application master claim
their containers, there should be free resources for 8 Spark
executors, but in the Spark UI I only see 7 active executors (there
should be two executors per Hadoop host). I have checked which
containers are running on each Hadoop node and discovered that one
node is indeed running more Spark containers than are reported in the
Spark UI.
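
One way to make that per-node check with the YARN CLI (this is a
sketch; the application and attempt ids below are placeholders, and the
commands have to be run on a cluster node):

```shell
# List running YARN applications to find the Spark application id.
yarn application -list
# List the attempts for that application, then the attempt's containers.
yarn applicationattempt -list application_1456371234567_0042
yarn container -list appattempt_1456371234567_0042_000001
# The host column of the container listing can then be compared against
# the Executors tab in the Spark UI.
```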

This behaviour is very strange to me and I have no idea what to make
of it or how to debug it.

Any thoughts?

Thanks.

--
Jan Sterba
https://twitter.com/honzasterba | http://flickr.com/honzasterba |
http://500px.com/honzasterba

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org