Thanks Wayne.

Maybe that's what is happening. My current limits are:

$ ps -u ssimanta -L | wc -l (with Spark and spark-shell *not* running)
790
$ ulimit -u
1024

Once I start Spark, the thread count increases to:

$ ps -u ssimanta -L | wc -l (with Spark and spark-shell running)
982
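
In case it's useful, here's how I'm checking which processes account for the threads (stock procps ps on Linux; -L prints one line per thread, -o pid= prints just the PID with no header, and the head limit is arbitrary):

$ ps -u ssimanta -L -o pid= | sort -n | uniq -c | sort -rn | head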

Any recommendations about how large this limit should be? I'm assuming it
needs to be changed on all my Spark nodes.
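
For what it's worth, the change I had in mind is raising nproc in /etc/security/limits.conf on each node, along these lines (32768 is just a placeholder value on my part, not a recommendation):

# /etc/security/limits.conf (32768 is a placeholder, not a recommendation)
ssimanta  soft  nproc  32768
ssimanta  hard  nproc  32768

My understanding is this only applies to new login sessions, so the Spark daemons would need to be restarted afterwards for it to take effect.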

Thanks,
-Soumya