[ 
https://issues.apache.org/jira/browse/SPARK-37295?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon resolved SPARK-37295.
----------------------------------
    Resolution: Won't Fix

I'm resolving this - there isn't any easy way to avoid this other than upgrading 
to JDK 17.
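
For anyone who hits this on JDK 9-16 in the meantime: the warning comes from 
org.apache.spark.unsafe.Platform reflectively reaching into java.nio, and it can 
usually be silenced by opening that package to unnamed modules with --add-opens 
JVM flags (the kind of module flags newer Spark releases pass themselves for 
newer JDKs). A minimal sketch, assuming the driver JVM is launched from the 
Python process so the builder conf still reaches it; otherwise the same flags 
can go into spark-defaults.conf or --driver-java-options instead:

```
from pyspark.sql import SparkSession

# Open java.nio (and the sun.nio.ch internals Spark also touches) to unnamed
# modules so the reflective access is no longer reported as illegal.
# The exact set of flags needed may vary by Spark/JDK version.
add_opens = (
    "--add-opens=java.base/java.nio=ALL-UNNAMED "
    "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"
)

spark = (
    SparkSession.builder
    .appName("TestEstimatedScalingFactors")
    .config("spark.driver.extraJavaOptions", add_opens)
    .getOrCreate()
)
```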

> illegal reflective access operation has occurred; Please consider reporting 
> this to the maintainers
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-37295
>                 URL: https://issues.apache.org/jira/browse/SPARK-37295
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.1.2
>         Environment: MacBook Pro running macOS 11.6
> spark-3.1.2-bin-hadoop3.2
> It is not clear to me how Spark finds Java; I believe I also have Java 8 
> installed somewhere (see the sketch after the output below).
> ```
> $ which java
> ~/anaconda3/envs/extraCellularRNA/bin/java
> $ java -version
> openjdk version "11.0.6" 2020-01-14
> OpenJDK Runtime Environment (build 11.0.6+8-b765.1)
> OpenJDK 64-Bit Server VM (build 11.0.6+8-b765.1, mixed mode)
> ```
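> A minimal sketch, assuming the stock launcher behaviour (bin/spark-class uses 
> $JAVA_HOME/bin/java when JAVA_HOME is set, and otherwise whatever `java` is on 
> the PATH): exporting JAVA_HOME before the session is created controls which JVM 
> PySpark launches. The JDK path below is only an illustration.
> ```
> import os
> from pyspark.sql import SparkSession
>
> # Hypothetical JDK location; point this at a real install.
> os.environ["JAVA_HOME"] = "/Library/Java/JavaVirtualMachines/jdk-11.0.6.jdk/Contents/Home"
>
> spark = SparkSession.builder.appName("TestEstimatedScalingFactors").getOrCreate()
> # Report which JVM the driver actually started.
> print(spark.sparkContext._jvm.java.lang.System.getProperty("java.home"))
> ```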
>  
>            Reporter: Andrew Davidson
>            Priority: Major
>
> ```
> from pyspark.sql import SparkSession
>
> spark = SparkSession \
>     .builder \
>     .appName("TestEstimatedScalingFactors") \
>     .getOrCreate()
> ```
> generates the following warning
> ```
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform 
> (file:/Users/xxx/googleUCSC/kimLab/extraCellularRNA/terra/deseq/spark-3.1.2-bin-hadoop3.2/jars/spark-unsafe_2.12-3.1.2.jar)
>  to constructor java.nio.DirectByteBuffer(long,int)
> WARNING: Please consider reporting this to the maintainers of 
> org.apache.spark.unsafe.Platform
> WARNING: Use --illegal-access=warn to enable warnings of further illegal 
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> 21/11/11 12:51:02 WARN NativeCodeLoader: Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> ```
> I am using pyspark spark-3.1.2-bin-hadoop3.2 on a MacBook Pro running macOS 11.6.
>  
> My small unit test seems to work okay; however, it fails when I try to run on 
> 3.2.0.
>  
> Any idea how I can track down this issue? Kind regards,
>  
> Andy
>  



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
