It is not officially supported, yes. Try Spark 3.3 from the branch if you
want to try Java 17
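
For reference, a minimal sketch (not from this thread; the class name, app name,
and local[*] master are illustrative) of how the --add-opens workaround discussed
below is typically wired up when Spark 3.2.x is embedded in an application run on
Java 17. It only addresses the IllegalAccessError; as noted below, 3.2 is still
not officially supported on Java 17, and the full list of packages Spark needs
opened is in Spark's own pom.xml.

import org.apache.spark.sql.SparkSession

// Launch the application JVM with the flags already in place, e.g.:
//   java --add-opens=java.base/sun.nio.ch=ALL-UNNAMED ... -cp app.jar Java17Example
// (--add-opens cannot be applied to a JVM that is already running, so setting it
// from inside the application is too late for the driver process itself.)
object Java17Example {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("java17-add-opens-sketch")
      // Only relevant when executors run in their own JVMs (e.g. on a cluster);
      // in local[*] mode the executors share the driver JVM and its launch flags.
      .config("spark.executor.extraJavaOptions",
        "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED")
      .getOrCreate()

    spark.range(10).count()
    spark.stop()
  }
}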

On Wed, Apr 13, 2022, 9:36 PM Arunachalam Sibisakkaravarthi <
arunacha...@mcruncher.com> wrote:

> Thanks everyone for giving your feedback.
> The JVM option "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" resolved the
> "cannot access class sun.nio.ch.DirectBuffer" issue, but Spark still throws
> another exception:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (ldap executor driver): java.io.InvalidObjectException: ReflectiveOperationException during deserialization
> at java.base/java.lang.invoke.SerializedLambda.readResolve(Unknown Source)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>
> Caused by: java.lang.reflect.InvocationTargetException: null
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.base/java.lang.reflect.Method.invoke(Unknown Source)
> ... 86 common frames omitted
>
> Caused by: java.lang.IllegalArgumentException: too many arguments
> at java.base/java.lang.invoke.LambdaMetafactory.altMetafactory(Unknown Source)
> at scala.runtime.LambdaDeserializer$.makeCallSite$1(LambdaDeserializer.scala:105)
> at scala.runtime.LambdaDeserializer$.deserializeLambda(LambdaDeserializer.scala:114)
> at scala.runtime.LambdaDeserialize.deserializeLambda(LambdaDeserialize.java:38)
>
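> (Illustrative sketch, not the actual application code: the failure above occurs
> when a Scala closure, compiled to a Java lambda, is serialized with a task and
> rebuilt through scala.runtime.LambdaDeserialize on the executor side. Even a
> trivial Dataset.map like the one below exercises that path in local mode.)
>
> import org.apache.spark.sql.SparkSession
>
> object LambdaRepro {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder()
>       .master("local[*]")
>       .appName("lambda-repro")
>       .getOrCreate()
>     import spark.implicits._
>     // The closure passed to map is serialized on the driver and deserialized
>     // (via SerializedLambda.readResolve) when the task runs.
>     val doubled = spark.range(5).as[Long].map(_ * 2).collect()
>     println(doubled.mkString(", "))
>     spark.stop()
>   }
> }
>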
> Maybe we need to change the subject to say "spark-sql_2.12 doesn't work with
> JDK 17", or should I open another discussion?
>
> Thanks And Regards
> Sibi.Arunachalam
> mCruncher
>
>
> On Wed, Apr 13, 2022 at 10:16 PM Sean Owen <sro...@gmail.com> wrote:
>
>> Yes, I think that's a change that has caused difficulties, but these
>> internal APIs were always discouraged. Hey, one is even called 'unsafe'.
>> There is an escape hatch, the JVM arg below.
>>
>> On Wed, Apr 13, 2022, 9:09 AM Andrew Melo <andrew.m...@gmail.com> wrote:
>>
>>> Gotcha. Seeing as there are a lot of large projects that used the unsafe
>>> API either directly or indirectly (via netty, etc.), it's a bit surprising
>>> that it was so thoroughly closed off without an escape hatch, but I'm sure
>>> there was a lively discussion around it...
>>>
>>> Cheers
>>> Andrew
>>>
>>> On Wed, Apr 13, 2022 at 09:07 Sean Owen <sro...@gmail.com> wrote:
>>>
>>>> It is intentionally closed by the JVM going forward, as direct access
>>>> is discouraged. But it's still necessary for Spark. In some cases, like
>>>> direct mem access, there is a new API but it's in Java 17 I think, and we
>>>> can't assume Java 17 any time soon.
>>>>
>>>> On Wed, Apr 13, 2022 at 9:05 AM Andrew Melo <andrew.m...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi Sean,
>>>>>
>>>>> Out of curiosity, will Java 11+ always require special flags to access
>>>>> the unsafe direct memory interfaces, or is this something that will either
>>>>> be addressed by the spec (by making an "approved" interface) or by
>>>>> libraries (with some other workaround)?
>>>>>
>>>>> Thanks
>>>>> Andrew
>>>>>
>>>>> On Tue, Apr 12, 2022 at 08:45 Sean Owen <sro...@gmail.com> wrote:
>>>>>
>>>>>> In Java 11+, you will need to tell the JVM to allow access to
>>>>>> internal packages in some cases, for any JVM application. You will need
>>>>>> flags like "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED", which you
>>>>>> can see in the pom.xml file for the project.
>>>>>>
>>>>>> Spark 3.2 does not necessarily work with Java 17 (3.3 should have
>>>>>> support), but it may well work after you address those flags.
>>>>>>
>>>>>> On Tue, Apr 12, 2022 at 7:05 AM Arunachalam Sibisakkaravarthi <
>>>>>> arunacha...@mcruncher.com> wrote:
>>>>>>
>>>>>>> Hi guys,
>>>>>>>
>>>>>>> spark-sql_2.12:3.2.1 is used in our application.
>>>>>>>
>>>>>>> It throws the following exception when the app runs on JRE 17:
>>>>>>>
>>>>>>> java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x451f1bd4) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x451f1bd4
>>>>>>> at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
>>>>>>> at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
>>>>>>> at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
>>>>>>> at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
>>>>>>> at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
>>>>>>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
>>>>>>> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
>>>>>>> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
>>>>>>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
>>>>>>> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
>>>>>>> at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
>>>>>>> at scala.Option.getOrElse(Option.scala:189)
>>>>>>> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
>>>>>>>
>>>>>>> How do we fix this?
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Thanks And Regards
>>>>>>> Sibi.Arunachalam
>>>>>>> mCruncher
>>>>>>>
>>>>>> --
>>>>> It's dark in this basement.
>>>>>
>>>> --
>>> It's dark in this basement.
>>>
>>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
