Re: Question about installing Apache Spark [PySpark] computer requirements

2024-07-29 Thread Meena Rajani
You probably have to increase the JVM/JDK heap size: https://stackoverflow.com/questions/1565388/increase-heap-size-in-java On Mon, Jul 29, 2024 at 9:36 PM mike Jadoo wrote: > Thanks. I just downloaded Corretto but I got this error message, > which was the same as before. [It was shared wi
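For a local PySpark job, the JVM heap is usually raised through Spark's own memory settings rather than `-Xmx` directly. A minimal sketch, assuming a `spark-submit` workflow; the `4g` value and the `my_app.py` script name are illustrative assumptions, not the poster's actual setup:

```python
# Sketch: building a spark-submit invocation with a larger driver heap.
# "--driver-memory" sets the heap of the JVM driver process; 4g is an
# assumed value - tune it to the machine's available RAM.
driver_mem = "4g"
cmd = ["spark-submit", "--driver-memory", driver_mem, "my_app.py"]
print(" ".join(cmd))
```

The same setting can be supplied in code via `SparkSession.builder.config("spark.driver.memory", "4g")`, but it must be set before the JVM starts, so `spark-submit` flags or `spark-defaults.conf` are the more reliable place for it.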

Re: Question about installing Apache Spark [PySpark] computer requirements

2024-07-29 Thread Sadha Chilukoori
Hi Mike, This appears to be an access issue on Windows + Python. Can you try setting up the PYTHON_PATH environment variable as described in this Stack Overflow post https://stackoverflow.com/questions/60414394/createprocess-error-5-access-is-denied-pyspark - Sadha On Mon, Jul 29, 2024 at 3:39 PM
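The "CreateProcess error=5, Access is denied" failure on Windows typically means the Spark JVM cannot launch Python worker processes. A hedged sketch of the usual workaround, pointing PySpark explicitly at the current interpreter (the variable names `PYSPARK_PYTHON` and `PYSPARK_DRIVER_PYTHON` are the standard PySpark ones; whether they match the exact fix in the linked post is an assumption):

```python
import os
import sys

# Point both the driver and the worker side of PySpark at a concrete
# Python executable, so the JVM spawns workers with a path it is
# allowed to execute. sys.executable is the interpreter running now.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable
```

These must be set before the SparkSession is created; on Windows they can also be set persistently via System Properties → Environment Variables.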

Re: Question about installing Apache Spark [PySpark] computer requirements

2024-07-29 Thread mike Jadoo
Thanks. I just downloaded Corretto but I got this error message, which was the same as before. [It was shared with me that this says I have limited resources, I think] ---Py4JJavaError

Re: Question about installing Apache Spark [PySpark] computer requirements

2024-07-29 Thread Sadha Chilukoori
Hi Mike, I'm not sure about the minimum requirements of a machine for running Spark. But to run some PySpark scripts (and Jupyter notebooks) on a local machine, I found the following steps the easiest. I installed Amazon Corretto and updated the JAVA_HOME variable as instructed here https:/
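The JAVA_HOME step above can be sketched in Python for a one-off session. The Corretto install path below is a hypothetical Windows location (the actual JDK version and folder will differ); prepending its `bin` directory to PATH is the standard companion step so `java` resolves to the new JDK:

```python
import os

# Hypothetical Corretto install path - replace with the real one.
corretto_home = r"C:\Program Files\Amazon Corretto\jdk17.0.12_7"

# Point JAVA_HOME at the JDK and put its bin directory first on PATH
# so Spark's launcher scripts pick up this java executable.
os.environ["JAVA_HOME"] = corretto_home
os.environ["PATH"] = (
    os.path.join(corretto_home, "bin") + os.pathsep + os.environ.get("PATH", "")
)
```

Setting these in the shell profile or Windows system environment variables makes the change persistent across sessions, which is usually what you want for local Spark development.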