[ https://issues.apache.org/jira/browse/SPARK-27704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-27704.
-------------------------------
    Resolution: Not A Problem

Do you mean garbage collector? The user can already choose the GC by setting 
JVM options. There is no default in Spark. I don't think there's anything to do 
here.
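
For reference, a minimal sketch of how this is already done today with standard 
HotSpot flags and spark-submit (the class and jar names below are placeholders):

    spark-submit \
      --driver-java-options "-XX:+UseParallelGC" \
      --conf "spark.executor.extraJavaOptions=-XX:+UseParallelGC" \
      --class com.example.MyJob myjob.jar

Note that in client mode the driver JVM is already running by the time 
spark.driver.extraJavaOptions would take effect, hence --driver-java-options 
(or an entry in spark-defaults.conf) for the driver side.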

> Change default class loader to ParallelGC
> -----------------------------------------
>
>                 Key: SPARK-27704
>                 URL: https://issues.apache.org/jira/browse/SPARK-27704
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.0.0
>            Reporter: Mihaly Toth
>            Priority: Major
>
> In JDK 11 the default garbage collector changed from ParallelGC to G1GC. Even 
> though this GC performs better on pause times and interactivity, most of the 
> tasks that need to be processed are more sensitive to throughput and to the 
> amount of memory. G1 sacrifices these to some extent to avoid the long 
> pauses. As a result the user may perceive a regression compared to JDK 8. 
> Even worse, the regression may not be limited to performance: some jobs may 
> start failing if they no longer fit into the memory they ran comfortably in 
> under the previous JDK.
> Other kinds of applications, such as streaming ones, may prefer G1 because of 
> their more interactive, near-real-time needs.
> This JIRA proposes a configurable default GC for all Spark applications, 
> overridable by the user through command-line parameters (a sketch follows 
> below). The default value of the default GC (in case it is not provided in 
> spark-defaults.conf) could be ParallelGC.
> I do not see this change as required, but I think it would benefit the user 
> experience.
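
For illustration, the proposal would amount to shipping something like the 
following in spark-defaults.conf. The property names are existing Spark 
settings; ParallelGC as the out-of-the-box value is the reporter's suggestion, 
not current behavior:

    # Proposed defaults; users could still override per application.
    spark.driver.extraJavaOptions   -XX:+UseParallelGC
    spark.executor.extraJavaOptions -XX:+UseParallelGC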


