Are you having this issue with Spark 1.5 as well? We had a similar OOM issue
and were told by Databricks to upgrade to 1.5 to resolve it. I guess they
are trying to sell Tachyon :)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Off-heap-memory-usage-of-Spark-Executors-keeps-increasing-tp25398.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.