Hi,
Good question. The extra memory comes from
spark.yarn.executor.memoryOverhead, the space used for the application
master, and the way YARN rounds requests up. This explains it in a
little more detail:
http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
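For concreteness, here is a rough sketch of that sizing arithmetic in Python. The 7% overhead fraction, the 384 MB floor, and the 1 GB yarn.scheduler.minimum-allocation-mb are assumed Spark 1.x-era defaults for illustration, not values confirmed for your cluster:

```python
# Illustrative sketch of YARN container sizing for a Spark executor.
# The overhead fraction (7%), the 384 MB floor, and the 1 GB YARN minimum
# allocation are assumptions (Spark 1.x-era defaults); check your own
# spark.yarn.executor.memoryOverhead and yarn.scheduler.minimum-allocation-mb.

def yarn_container_mb(heap_mb, overhead_fraction=0.07, overhead_floor_mb=384,
                      yarn_min_alloc_mb=1024):
    """Heap + memoryOverhead, rounded up to a multiple of YARN's min allocation."""
    overhead = max(overhead_floor_mb, int(heap_mb * overhead_fraction))
    requested = heap_mb + overhead
    # YARN rounds each request up to a multiple of the minimum allocation.
    multiples = -(-requested // yarn_min_alloc_mb)  # ceiling division
    return multiples * yarn_min_alloc_mb

# --executor-memory 3g: 3072 + max(384, 215) = 3456 MB, rounded up to 4096 MB
print(yarn_container_mb(3072))  # -> 4096
```

So each 3 GB executor can occupy 4 GB of YARN memory, and the application master is one more container on top of the executors, which is also why "VCores Used" can exceed executors × executor-cores by one.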
Thanks Sandy, it is very useful!
bit1...@163.com
From: Sandy Ryza
Date: 2015-04-29 15:24
To: bit1...@163.com
CC: user
Subject: Re: Question about Memory Used and VCores Used
Hi, guys,
I run the following computation on a cluster with 3 workers:
spark-sql --master yarn --executor-memory 3g --executor-cores 2 --driver-memory
1g -e 'select count(*) from table'
The resources used are shown below on the UI:
I don't understand why the memory used is 15GB and vcores used is 5. I