Re: [Spark Shell] Could the spark shell be reset to the original status?

2015-07-16 Thread Terry Hole
Hi Ted,

Thanks for the information. The post seems a little different from my
requirement: suppose we define different functions to do different
streaming work (e.g. 50 functions), and I want to test these 50 functions in
the spark shell. The shell always throws an OOM in the middle of the test
(yes, it could be worked around by increasing the JVM memory size, but with
even more functions the issue would still happen). The main issue is that the
shell keeps track of all the information (classes, objects, ...) since it
started, so the Java memory usage grows over time as the functions are
defined/invoked.
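A minimal sketch of the stopgap discussed in this thread, assuming a Spark 1.x shell running on Java 7 or earlier, where the classes the REPL generates for each defined function accumulate in PermGen (the flag and size below are illustrative assumptions, not a confirmed fix):

```shell
# Stopgap only: give the driver JVM a larger PermGen so the classes the
# REPL generates for each defined function take longer to exhaust it.
# MaxPermSize applies to Java 7 and earlier; 512m is an illustrative guess.
spark-shell --driver-java-options "-XX:MaxPermSize=512m"
```

This only delays the OOM rather than resetting the shell's accumulated state, which matches the limitation described above.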

Thanks!
- Terry

Ted Yu yuzhih...@gmail.com wrote on Fri, Jul 17, 2015 at 12:02 PM:

 See this recent thread:


 http://search-hadoop.com/m/q3RTtFW7iMDkrj61/Spark+shell+oom+subj=java+lang+OutOfMemoryError+PermGen+space



 On Jul 16, 2015, at 8:51 PM, Terry Hole hujie.ea...@gmail.com wrote:

 Hi,

 Background: The spark shell gets an out-of-memory error after doing a lot
 of Spark work.

 Is there any method which can reset the spark shell to its startup status?
 I tried *:reset*, but it does not seem to work: I can no longer create a
 spark context (I get the compile error below) after the *:reset*. (I have
 to restart the shell after the OOM as a workaround.)

 == Expanded type of tree ==
 TypeRef(TypeSymbol(class $read extends Serializable))
 uncaught exception during compilation: java.lang.AssertionError
 java.lang.AssertionError: assertion failed: Tried to find '$line16' in
 'C:\Users\jhu\AppData\Local\Temp\spark-2ad09490-c0c6-41e2-addb-63087ce0ae63'
 but it is not a directory
 That entry seems to have slain the compiler.  Shall I replay your session?
 I can re-run each line except the last one. [y/n]
 Abandoning crashed session.

 Thanks!
 -Terry



