Hello,
Does anyone have any comments or ideas regarding:
https://stackoverflow.com/questions/71610435/how-to-overwrite-pyspark-dataframe-schema-without-data-scan
please?
Cheers - Rafal
In Java 11+, you will need to tell the JVM to allow access to internal
packages in some cases, for any JVM application. You will need flags like
"--add-opens=java.base/sun.nio.ch=ALL-UNNAMED", which you can see in the
pom.xml file for the project.
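As a minimal sketch of passing such a flag (the launch command and file names here are illustrative placeholders, not taken from the project's pom.xml):

```shell
# Illustrative only: one of the --add-opens flags Spark needs on Java 17.
# Real deployments pass several such opens; see the project's pom.xml for the full list.
OPENS="--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"

# Build the spark-submit command. It is echoed rather than executed here,
# since spark-submit and app.py are placeholders for your environment.
echo spark-submit --driver-java-options "$OPENS" app.py
```

The same value can also go into spark.driver.extraJavaOptions / spark.executor.extraJavaOptions in spark-defaults.conf instead of the command line.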
Spark 3.2 does not necessarily work with Java 17.
Hi guys,
spark-sql_2.12:3.2.1 is used in our application.
It throws the following exception when the app runs on JRE 17:
java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x451f1bd4) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x451f1bd4