could this be related to SPARK-18787?
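
If not, it might be worth enabling the JVM's Native Memory Tracking on one
executor to see where the extra memory goes. A rough sketch (it requires
restarting the app with the extra JVM option, and <pid> is the
CoarseGrainedExecutorBackend process on the executor host):

  --conf spark.executor.extraJavaOptions=-XX:NativeMemoryTracking=summary

  jcmd <pid> VM.native_memory summary

Note NMT only covers memory the JVM itself allocates (heap, metaspace, thread
stacks, direct buffers, and so on), not allocations made by native libraries.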

On Sun, Jan 22, 2017 at 1:45 PM, Reynold Xin <r...@databricks.com> wrote:

> Are you using G1 GC? G1 sometimes uses a lot more memory than the size
> allocated.
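>
> One quick way to check which collector the executor is actually running
> with (<pid> being the CoarseGrainedExecutorBackend process):
>
>   jcmd <pid> VM.flags | tr ' ' '\n' | grep GC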
>
>
> On Sun, Jan 22, 2017 at 12:58 AM StanZhai <m...@zhaishidan.cn> wrote:
>
>> Hi all,
>>
>> We just upgraded our Spark from 1.6.2 to 2.1.0.
>>
>> Our Spark application is started by spark-submit with
>> `--executor-memory 35G` in standalone mode, but the actual memory usage
>> grows to 65G even after a full GC (jmap -histo:live $pid), as follows:
>>
>> test@c6 ~ $ ps aux | grep CoarseGrainedExecutorBackend
>> test      181941  181 34.7 94665384 68836752 ?   Sl   09:25 711:21
>> /home/test/service/jdk/bin/java -cp
>> /home/test/service/hadoop/share/hadoop/common/hadoop-lzo-0.4.20-SNAPSHOT.jar:/home/test/service/hadoop/share/hadoop/common/hadoop-lzo-0.4.20-SNAPSHOT.jar:/home/test/service/spark/conf/:/home/test/service/spark/jars/*:/home/test/service/hadoop/etc/hadoop/
>> -Xmx35840M -Dspark.driver.port=47781 -XX:+PrintGCDetails
>> -XX:+PrintGCDateStamps -Xloggc:./gc.log -verbose:gc
>> org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url
>> spark://coarsegrainedschedu...@xxx.xxx.xxx.xxx:47781 --executor-id 1
>> --hostname test-192 --cores 36 --app-id app-20170122092509-0017 --worker-url
>> spark://Worker@test-192:33890
>>
>> Our Spark jobs are all SQL.
>>
>> The excess memory looks like off-heap memory, but the default value of
>> `spark.memory.offHeap.enabled` is `false`.
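>>
>> (For reference, Spark-managed off-heap memory would only be used if both
>> of the following were set, which we have not done; the size here is just
>> an illustrative value:
>>
>>   --conf spark.memory.offHeap.enabled=true
>>   --conf spark.memory.offHeap.size=8g
>> )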
>>
>> We didn't see this problem in Spark 1.6.x. What causes it in Spark 2.1.0?
>>
>> Any help is greatly appreciated!
>>
>> Best,
>>
>> Stan