I run Spark in standalone mode and write my programs in Python. An
OutOfMemoryError always occurs after a number of steps in an iterative
process, even though memory usage should be roughly the same at each step.
The error can be avoided by increasing the Java heap size or by running on a
smaller dataset, but I don't think increasing the heap size is a good fix.
I'm very confused about this. Has anyone hit the same error, and how can it
be fixed?
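
For reference, here is a minimal sketch of the kind of loop I mean; the
master URL, the data, and the per-step transformation are just placeholders,
not my real job:

    from pyspark import SparkContext

    sc = SparkContext("spark://master:7077", "iteration-test")  # placeholder master URL

    rdd = sc.parallelize(range(1000000))
    for step in range(50):
        # stand-in for the real per-step update; the data size stays constant
        rdd = rdd.map(lambda x: x + 1).cache()
        rdd.count()  # force evaluation at each step

This is the shape of the iteration where I see the error: nothing grows at
the application level, yet memory use climbs step by step until the
OutOfMemoryError is thrown.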

Thanks 


