[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2026-01-01 Thread Hyukjin Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-54753:
-
Fix Version/s: 4.1.1, 4.0.2

> memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset
> -
>
> Key: SPARK-54753
> URL: https://issues.apache.org/jira/browse/SPARK-54753
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0, 4.0.1
>Reporter: xihuan
>Assignee: xihuan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.2, 4.2.0, 4.1.1
>
> Attachments: image-2025-12-23-05-38-53-324.png, 
> image-2025-12-24-03-18-24-468.png, pom.xml, screenshot-1.png
>
>
> For Apache Spark *4.0.1* in local mode, memory is not released after the application runs for a long time; after downgrading to Spark {*}v3.5.6{*}, there is no issue.
> The issue can be reproduced with a simple test case:
> 
> {code:java}
> package spark;
> 
> import org.apache.spark.sql.Dataset;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SparkSession;
> import org.slf4j.Logger;
> import org.slf4j.LoggerFactory;
> 
> public class SparkApp {
>     private static final Logger log = LoggerFactory.getLogger(SparkApp.class);
>     private static final String SPARK_MASTER_URL = "local[4]";
>     private static final String SPARK_MEMORY = "500m";
> 
>     public static void main(String[] args) {
>         log.debug("Starting application...");
>         SparkSession sparkSession = SparkSession.builder()
>                 .appName("Test Application")
>                 .master(SPARK_MASTER_URL)
>                 .config("spark.driver.memory", SPARK_MEMORY)
>                 // Uncommenting the line below gets rid of the memory leak:
>                 //.config("spark.sql.sources.bucketing.autoBucketedScan.enabled", false)
>                 .getOrCreate();
>         processData(sparkSession);
>     }
> 
>     private static void processData(SparkSession sparkSession) {
>         while (true) {
>             // Load from a local CSV file
>             Dataset<Row> dataset = sparkSession.read().csv("c:/City.csv");
>             dataset.show(5);
> 
>             log.debug("Persist dataset...");
>             // This is the place where the memory leak occurs
>             dataset.persist();
> 
>             // Do something...
>             log.debug("Do something with the persisted dataset");
> 
>             // ...and unpersist the dataset
>             log.debug("Unpersist dataset...");
>             dataset.unpersist();
> 
>             log.debug("Processing data completed");
>         }
>     }
> }
> {code}
>  
> With *Apache Spark 4.0.0 or 4.0.1* and a JVM heap size of 1 GB, the application hits OOM after around 9 minutes.
> !screenshot-1.png|width=1070,height=504!
>  
> Meanwhile, {*}with Apache Spark 3.5.6 and a max heap size of 500 MB{*}, no such issue occurs.
> !image-2025-12-23-05-38-53-324.png|width=1075,height=415!
>  
>  
>  
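> The commented-out config line in the snippet above points at a workaround. A minimal sketch of applying it when launching the same app via spark-submit (the jar path is a placeholder; the main class follows the snippet):
> {code:bash}
> # Disable auto bucketed scan, which the reporter notes avoids the leak
> spark-submit \
>   --class spark.SparkApp \
>   --conf spark.sql.sources.bucketing.autoBucketedScan.enabled=false \
>   target/spark-app-1.0.jar
> {code}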



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]



[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-24 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Attachment: image-2025-12-24-03-18-24-468.png




[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-24 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated SPARK-54753:
---
Labels: pull-request-available  (was: )




[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-23 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Affects Version/s: 4.0.0




[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-23 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Description: (updated)

[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-23 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Description: (updated)

[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-23 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Description: (updated)

[jira] [Updated] (SPARK-54753) memory leak in Apache Spark 4.0.1 as we persist/unpersist the dataset

2025-12-23 Thread xihuan (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-54753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xihuan updated SPARK-54753:
---
Summary: memory leak in Apache Spark 4.0.1 as we persist/unpersist the 
dataset  (was: memory leak in Apache Spark 4.0.1)
