Hi Friends,
I need help installing PySpark for use in a Jupyter notebook. Can someone please help me?
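
In case it helps while waiting for a fuller answer, here is a minimal sketch of one common approach, assuming a pip-based Python environment and a purely local Spark session (no cluster); note that PySpark also needs a Java runtime installed, and the app name below is just a placeholder:

    # In a notebook cell: install PySpark into the running kernel's environment.
    %pip install pyspark

    # After the install (and a kernel restart), in a new cell:
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("local[*]")       # local mode, one worker thread per core
             .appName("notebook")      # placeholder app name
             .getOrCreate())

    spark.range(5).show()              # sanity check: prints a tiny DataFrame

If the import fails after installing, the kernel is probably running in a different environment than the one pip installed into; using %pip (rather than !pip) avoids most of that.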

Sent from my iPhone

> On Apr 24, 2021, at 2:09 PM, Austin Hernandez <[email protected]> wrote:
> 
> I was running high-throughput computations over thousands of items in a 
> Jupyter notebook. At some point the program crashed, and now I am unable to 
> open the notebook again; I get the 'SBOX_FATAL_MEMORY_EXCEEDED' error in 
> Chrome. I have cleared all outputs, restarted the kernel, restarted the 
> server, and made copies / downloaded and re-uploaded the notebook, all to no 
> avail. Upon downloading the notebook, I see that it is 143,000 KB! Obviously 
> that is far too large, but I have no idea how it ballooned in size after the 
> crash; it was working fine until now. Any help getting this notebook back is 
> appreciated, as I cannot even access the code, which is pretty important. 
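
On the quoted problem: a 143,000 KB .ipynb almost always means large cell outputs were saved into the file itself, and the browser dies trying to render them. One way to recover the code without opening the notebook in a browser is to strip the outputs on disk; a minimal sketch using nbformat, assuming the file is still valid JSON (the filenames here are placeholders):

    import nbformat

    # Read the oversized notebook directly from disk (no browser involved).
    nb = nbformat.read("huge.ipynb", as_version=4)   # placeholder filename

    # Drop stored outputs from every code cell; the source code is untouched.
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None

    # Write a stripped copy, which should be small enough to open normally.
    nbformat.write(nb, "huge_stripped.ipynb")

nbconvert can do much the same from the command line (jupyter nbconvert --ClearOutputPreprocessor.enabled=True --to notebook ...). If nbformat.read itself fails, the JSON is damaged, which is a different repair job.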