[ https://issues.apache.org/jira/browse/SPARK-24421?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-24421:
------------------------------
    Docs Text: 
Provisional release notes text:

For user convenience, Spark 3 attempts to bypass the JVM's default limit on the 
total size of memory allocated by direct buffers, by accessing some internal 
JDK classes directly. On Java 9 and later this is no longer possible by 
default, because of the new module encapsulation system.
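
As an illustration of the encapsulation change (a hypothetical example, not 
Spark code), deep reflection into a {{java.base}} package such as 
{{java.lang}} is exactly the kind of access that now requires an explicit 
opt-in:

{code:java}
import java.lang.reflect.Field;

public class EncapsulationDemo {
    public static void main(String[] args) throws Exception {
        // String.value is a private field in java.base's java.lang package.
        Field value = String.class.getDeclaredField("value");
        // JDK 8: succeeds silently. JDK 9-15: succeeds with an "illegal
        // reflective access" warning. Under --illegal-access=deny (the
        // default from JDK 16) it throws InaccessibleObjectException unless
        // the JVM is started with --add-opens java.base/java.lang=ALL-UNNAMED.
        value.setAccessible(true);
        System.out.println("deep reflective access granted");
    }
}
{code}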

For many usages of Spark this will not matter, as the default 
{{MaxDirectMemorySize}} (roughly the maximum heap size) may be more than 
sufficient for all direct buffer allocations.

If it isn't, the old behavior can be restored by explicitly allowing the 
access with the JVM argument {{--add-opens java.base/java.lang=ALL-UNNAMED}}. 
Alternatively, the issue can be resolved by explicitly setting 
{{-XX:MaxDirectMemorySize=}} to a sufficiently large value.
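
To make the limit concrete, here is a minimal probe (a hypothetical example, 
not Spark code) that allocates direct buffers until the cap is hit; run it 
with e.g. {{-XX:MaxDirectMemorySize=64m}}:

{code:java}
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class DirectLimitProbe {
    public static void main(String[] args) {
        // Hold references so the GC cannot reclaim (and thus un-count) buffers.
        List<ByteBuffer> buffers = new ArrayList<>();
        long allocated = 0;
        try {
            while (true) {
                buffers.add(ByteBuffer.allocateDirect(1 << 20)); // 1 MiB each
                allocated += 1 << 20;
            }
        } catch (OutOfMemoryError e) {
            // "Direct buffer memory": the MaxDirectMemorySize cap was reached.
            System.out.printf("Limit hit after ~%d MiB%n", allocated >> 20);
        }
    }
}
{code}

In a Spark deployment, JVM arguments like these would typically be passed via 
{{spark.driver.extraJavaOptions}} and {{spark.executor.extraJavaOptions}}.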

> Accessing sun.misc.Cleaner in JDK11
> -----------------------------------
>
>                 Key: SPARK-24421
>                 URL: https://issues.apache.org/jira/browse/SPARK-24421
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.0.0
>            Reporter: DB Tsai
>            Priority: Major
>              Labels: release-notes
>
> Many internal APIs, such as Unsafe, are encapsulated in JDK 9+; see 
> http://openjdk.java.net/jeps/260 for details.
> To use Unsafe, we need to add *jdk.unsupported* to our code’s module 
> declaration:
> {code:java}
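> // module-info.java: declare a dependency on the jdk.unsupported module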
> module java9unsafe {
>     requires jdk.unsupported;
> }
> {code}
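>
> With that in place, {{sun.misc.Unsafe}} can be obtained via its private 
> {{theUnsafe}} field. The following is a sketch of the common access pattern 
> (not Spark's actual code):
> {code:java}
> import java.lang.reflect.Field;
> import sun.misc.Unsafe;
>
> public final class UnsafeAccess {
>     static final Unsafe UNSAFE;
>     static {
>         try {
>             // sun.misc.Unsafe is exported (and opened) by jdk.unsupported,
>             // but the instance is only reachable through this private field.
>             Field f = Unsafe.class.getDeclaredField("theUnsafe");
>             f.setAccessible(true);
>             UNSAFE = (Unsafe) f.get(null);
>         } catch (ReflectiveOperationException e) {
>             throw new ExceptionInInitializerError(e);
>         }
>     }
>
>     public static void main(String[] args) {
>         // Off-heap allocation; not counted against MaxDirectMemorySize.
>         long addr = UNSAFE.allocateMemory(16);
>         UNSAFE.putLong(addr, 42L);
>         System.out.println(UNSAFE.getLong(addr));
>         UNSAFE.freeMemory(addr);
>     }
> }
> {code}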


