I'm using Livy 0.5.0 with Spark 2.3.0. I started a session with 4GB of memory for 
the driver, and I ran the following code several times:
    var tmp1 = spark.sql("use tpcds_bin_partitioned_orc_2")
    var tmp2 = spark.sql("select count(1) from tpcds_bin_partitioned_orc_2.store_sales").show
The table has 5760749 rows.
After running this about 10 times, the driver's physical memory grows beyond 4.5GB 
and the process is killed by YARN.
I saw the old-generation heap keep growing, and it cannot be released by GC.
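
For reference, here is a minimal sketch of the repetition; the loop and the 
Runtime-based heap printout are added here only to show one way of watching the 
driver heap grow between runs (in the actual Livy session each statement was 
submitted separately):

    // Sketch: run the same count query repeatedly in one session and
    // print the used driver heap after each run. Assumes `spark` is the
    // SparkSession already in scope in the Livy/spark-shell session.
    spark.sql("use tpcds_bin_partitioned_orc_2")
    for (i <- 1 to 10) {
      spark.sql("select count(1) from tpcds_bin_partitioned_orc_2.store_sales").show()
      System.gc()  // request a collection so the reading reflects mostly live objects
      val rt = Runtime.getRuntime
      val usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024)
      println(s"run $i: ~$usedMb MB heap in use on the driver")
    }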

2018-11-12 

lk_spark

From: "lk_hadoop"<lk_had...@163.com>
Sent: 2018-11-12 09:37
Subject: about LIVY-424
To: "user"<u...@livy.incubator.apache.org>
Cc:

hi, all:
        I've run into this issue: https://issues.apache.org/jira/browse/LIVY-424 
Does anybody know how to resolve it?
2018-11-12


lk_hadoop 
