Gotcha. Seeing as there are a lot of large projects that use the unsafe API,
either directly or indirectly (via Netty, etc.), it's a bit surprising that
it was so thoroughly closed off without an escape hatch, but I'm sure there
was a lively discussion around it...
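
For anyone who lands on this thread later: the flags Sean mentions below go
on the JVM that runs the driver and, in a real cluster, typically on the
executor JVMs as well. A rough sketch, assuming the app is started with a
plain java command or with spark-submit; app.jar and com.example.Main are
placeholders, and the single --add-opens entry shown is only the one from
the stack trace below, while the full list lives in Spark's pom.xml:

  java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED -cp app.jar com.example.Main

  spark-submit \
    --driver-java-options "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
    --conf spark.executor.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
    app.jar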

Cheers
Andrew

On Wed, Apr 13, 2022 at 09:07 Sean Owen <sro...@gmail.com> wrote:

> It is intentionally closed by the JVM going forward, as direct access is
> discouraged. But it's still necessary for Spark. In some cases, like direct
> memory access, there is a new API, but it's in Java 17 I think, and we can't
> assume Java 17 any time soon.
>
> On Wed, Apr 13, 2022 at 9:05 AM Andrew Melo <andrew.m...@gmail.com> wrote:
>
>> Hi Sean,
>>
>> Out of curiosity, will Java 11+ always require special flags to access
>> the unsafe direct memory interfaces, or is this something that will either
>> be addressed by the spec (by making an "approved" interface) or by
>> libraries (with some other workaround)?
>>
>> Thanks
>> Andrew
>>
>> On Tue, Apr 12, 2022 at 08:45 Sean Owen <sro...@gmail.com> wrote:
>>
>>> In Java 11+, you will need to tell the JVM to allow access to internal
>>> packages in some cases, for any JVM application. You will need flags like
>>> "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED", which you can see in
>>> the pom.xml file for the project.
>>>
>>> Spark 3.2 does not necessarily work with Java 17 (3.3 should have
>>> support), but it may well work once you add those flags.
>>>
>>> On Tue, Apr 12, 2022 at 7:05 AM Arunachalam Sibisakkaravarthi <
>>> arunacha...@mcruncher.com> wrote:
>>>
>>>> Hi guys,
>>>>
>>>> spark-sql_2.12:3.2.1 is used in our application.
>>>>
>>>> It throws the following exception when the app runs on JRE 17:
>>>>
>>>> java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$
>>>> (in unnamed module @0x451f1bd4) cannot access class sun.nio.ch.DirectBuffer
>>>> (in module java.base) because module java.base does not export sun.nio.ch
>>>> to unnamed module @0x451f1bd4
>>>>     at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
>>>>     at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
>>>>     at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
>>>>     at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
>>>>     at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
>>>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
>>>>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
>>>>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
>>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
>>>>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
>>>>     at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
>>>>     at scala.Option.getOrElse(Option.scala:189)
>>>>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
>>>>
>>>> How do we fix this?
>>>>
>>>>
>>>>
>>>>
>>>> Thanks And Regards
>>>> Sibi.Arunachalam
>>>> mCruncher
>>>>
>> --
>> It's dark in this basement.
>>
--
It's dark in this basement.
