Luis Angel Hernández Acosta created SPARK-16667:
---------------------------------------------------

             Summary: Spark driver/executor don't release unused memory
                 Key: SPARK-16667
                 URL: https://issues.apache.org/jira/browse/SPARK-16667
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.6.0
         Environment: Ubuntu Wily, 64-bit
Java 1.8
3 slaves (4 GB) and 1 master (2 GB), virtual machines in VMware on a 4th-generation i7 host with 16 GB RAM
            Reporter: Luis Angel Hernández Acosta


I'm running a Spark app in a standalone cluster. My app creates a SparkContext and performs many calculations with GraphX over time. For each calculation, the app creates a new Java thread and waits for its completion signal. Between calculations, memory grows by 50-100 MB. I used a separate thread to make sure that every object created for a calculation is destroyed after the calculation ends, but memory keeps growing. I tried stopping the SparkContext: all executor memory allocated by the app is freed, but my driver's memory still grows by the same 50-100 MB.
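The thread-per-calculation pattern described above can be sketched as follows. This is a minimal illustration only: the Spark/GraphX specifics are omitted, and `runCalculation()` is a hypothetical placeholder for a GraphX job, not the reporter's actual code.

```java
// Minimal sketch of the reported pattern: each calculation runs in its own
// Java thread, and the caller blocks on join() until the thread finishes.
// runCalculation() is a stand-in (assumption) for a GraphX computation.
public class CalcRunner {

    static long runCalculation() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;  // placeholder work instead of a real GraphX job
        }
        return sum;
    }

    public static void main(String[] args) throws InterruptedException {
        final long[] result = new long[1];

        // One new thread per calculation, as described in the report.
        Thread worker = new Thread(() -> result[0] = runCalculation());
        worker.start();
        worker.join();  // wait for the thread's "ending signal"

        System.out.println("calculation finished: " + result[0]);
    }
}
```

Even with this structure, objects referenced by the driver (e.g. cached RDDs or accumulators held by the SparkContext) are not necessarily eligible for GC when the worker thread exits, which may be related to the growth observed here.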
Spark env:
export SPARK_MASTER_IP=master
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=2919m
export SPARK_WORKER_INSTANCES=1
export SPARK_DAEMON_MEMORY=256m
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=10"
These are my only configurations.




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
