See this recent thread:

http://search-hadoop.com/m/q3RTtFW7iMDkrj61/Spark+shell+oom+&subj=java+lang+OutOfMemoryError+PermGen+space
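
The error in that thread is PermGen exhaustion in the shell's driver JVM (the REPL keeps compiling new classes as you enter lines). If that is what you are hitting, a common workaround on Java 7-era Spark 1.x is to raise the PermGen limit when launching the shell; the size below is only an example to tune, not a recommendation:

  # raise PermGen for the driver JVM at launch
  ./bin/spark-shell --driver-java-options "-XX:MaxPermSize=512m"

  # or persistently, in conf/spark-defaults.conf:
  spark.driver.extraJavaOptions  -XX:MaxPermSize=512m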



> On Jul 16, 2015, at 8:51 PM, Terry Hole <hujie.ea...@gmail.com> wrote:
> 
> Hi,
> 
> Background: The Spark shell runs out of memory after doing a lot of Spark 
> work. 
> 
> Is there any way to reset the Spark shell to its startup state? I tried 
> ":reset", but it does not seem to work: I cannot create a SparkContext 
> anymore after the ":reset" (compile error below). (I have to restart the 
> shell after the OOM as a workaround.) 
> 
> == Expanded type of tree ==
> TypeRef(TypeSymbol(class $read extends Serializable))
> uncaught exception during compilation: java.lang.AssertionError
> java.lang.AssertionError: assertion failed: Tried to find '$line16' in 
> 'C:\Users\jhu\AppData\Local\Temp\spark-2ad09490-c0c6-41e2-addb-63087ce0ae63' 
> but it is not a directory
> That entry seems to have slain the compiler.  Shall I replay your session? I 
> can re-run each line except the last one. [y/n]
> Abandoning crashed session.
> 
> Thanks!
> -Terry
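
On the original question of getting back to a usable context without restarting the JVM: as a rough sketch (assuming a Spark 1.x shell with a local master, untested against your setup), you can stop the existing context and create a fresh one by hand instead of using ":reset". Note this will not reclaim PermGen already consumed by classes the REPL has compiled, so the launch option above is still the more durable fix:

  scala> sc.stop()   // shut down the context the shell created at startup
  scala> import org.apache.spark.{SparkConf, SparkContext}
  scala> val sc = new SparkContext(new SparkConf().setAppName("spark-shell").setMaster("local[*]"))
  // shadows the original `sc` binding; other startup bindings such as sqlContext still point at the old context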
