3 cores*, not 8.

César.



> On 6 Oct 2015, at 19:08, Cesar Berezowski <ce...@adaltas.com> wrote:
> 
> I deployed HDP 2.3.1 and got Spark 1.3.1; Spark 1.4 is supposed to be 
> available as a technical preview, I think.
> 
> Vendor's forum? You mean Hortonworks'? 
> 
> --
> Update on my info: 
> 
> Set YARN to use 16 cores instead of 8 and set the minimum container size to 4096 MB.
> Thus: 
> 12 executors, 12 GB of RAM and 8 cores
> 
> But same issue: it still creates 3 containers (+ driver) with 1 core and 
> 6.3 GB each, taking 16 GB on YARN.
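> 
> One guess from those numbers: the CapacityScheduler's default
> DefaultResourceCalculator schedules containers on memory only, so the RM UI
> shows 1 vcore per container no matter what --executor-cores asks for. And
> 4 containers x 4096 MB = 16 GB, i.e. each container may have been granted
> only the 4 GB minimum, as if the executor-memory setting never reached YARN.
> If I read the docs right, setting yarn.scheduler.capacity.resource-calculator
> to org.apache.hadoop.yarn.util.resource.DominantResourceCalculator in
> capacity-scheduler.xml makes vcores count in scheduling and in the UI.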
> 
> César.
> 
> 
> 
>> On 6 Oct 2015, at 19:00, Ted Yu <yuzhih...@gmail.com> wrote:
>> 
>> Consider posting the question on the vendor's forum.
>> 
>> HDP 2.3 comes with Spark 1.4 if I remember correctly.
>> 
>> On Tue, Oct 6, 2015 at 9:05 AM, czoo <ce...@adaltas.com> wrote:
>> Hi,
>> 
>> This post might be a duplicate of an earlier one (by me) with some updates,
>> sorry in advance.
>> 
>> I have an HDP 2.3 cluster running Spark 1.3.1 on 6 nodes (an edge node, a
>> master, and 4 workers).
>> Each worker has 8 cores and 40 GB of RAM available in YARN.
>> 
>> That makes a total of 160 GB of RAM and 32 cores.
>> 
>> I'm running a job with the following parameters:
>> --master yarn-client
>> --num-executors 12 (-> 3 per node)
>> --executor-cores 2
>> --executor-memory 12G
>> 
>> I don't know if it's optimal, but it should run (right?)
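>> 
>> Sanity check on the memory maths, assuming the Spark 1.3 default for
>> spark.yarn.executor.memoryOverhead (roughly 7% of executor memory, with a
>> 384 MB floor): each executor asks YARN for about 12 GB + 0.9 GB ≈ 13 GB,
>> which YARN rounds up to a multiple of yarn.scheduler.minimum-allocation-mb.
>> Three such containers per worker ≈ 39 GB of the 40 GB available, so 12
>> executors should just about fit, using 12 x 2 = 24 of the 32 cores.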
>> 
>> However, I end up with Spark setting up 2 executors using 1 core and 6.2G each.
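>> 
>> (If that 6.2G figure comes from the Executors tab of the Spark UI, it is
>> storage memory rather than the full heap: 12 GB x
>> spark.storage.memoryFraction (0.6) x spark.storage.safetyFraction (0.9)
>> ≈ 6.5 GB, and the JVM reports a bit less than -Xmx, so ~6.2G would mean the
>> 12G setting actually took effect. The executor count is the real puzzle.)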
>> 
>> Plus, my job does a cartesian product, so I end up with a pretty big
>> DataFrame that inevitably dies with a GC exception...
>> It used to run on HDP 2.2 / Spark 1.2.1, but I can't find any way to run it
>> now.
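>> 
>> For what it's worth, here is a stripped-down sketch of the pattern (paths,
>> class name, and partition counts are made up, not my actual job, and the
>> real job goes through a DataFrame, but the RDD version shows the shape),
>> with the knobs that usually help big cartesians on 1.3: Kryo serialization,
>> many small tasks, and disk-spillable storage:
>> 
>> import org.apache.spark.{SparkConf, SparkContext}
>> import org.apache.spark.storage.StorageLevel
>> 
>> object CartesianSketch {
>>   def main(args: Array[String]): Unit = {
>>     val conf = new SparkConf()
>>       .setAppName("cartesian-sketch")
>>       // Kryo keeps the shuffled/cached product much smaller than Java
>>       // serialization.
>>       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>>     val sc = new SparkContext(conf)
>> 
>>     val left  = sc.textFile("hdfs:///tmp/left")   // placeholder input
>>     val right = sc.textFile("hdfs:///tmp/right")  // placeholder input
>> 
>>     // cartesian() produces left.partitions x right.partitions tasks, so
>>     // repartitioning both sides gives many small tasks instead of a few
>>     // huge ones that blow up the heap.
>>     val product = left.repartition(48).cartesian(right.repartition(48))
>> 
>>     // Store the product serialized and let it spill to disk instead of
>>     // keeping it all on-heap.
>>     product.persist(StorageLevel.MEMORY_AND_DISK_SER)
>>     println(product.count())
>> 
>>     sc.stop()
>>   }
>> }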
>> 
>> Any idea?
>> 
>> Thanks a lot
>> 
>> Cesar
>> 
>> 
>> 
>> --
>> View this message in context: 
>> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-3-1-on-Yarn-not-using-all-given-capacity-tp24955.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>> 
>> 
> 
