Here is an update.

The following issues have been retargeted from 4.0.0 (SPARK-44111) to 4.1.0 (SPARK-51166)
for now because they do not yet have proper PRs.

SPARK-47110     Re-enable AmmoniteTest tests in Maven builds
SPARK-48139     Re-enable `SparkSessionE2ESuite.interrupt tag`
SPARK-48163     Fix Flaky Test: `SparkConnectServiceSuite.SPARK-43923: ...`
SPARK-49586     Add addArtifact API to PySpark
SPARK-50205     Re-enable `SparkSessionJobTaggingAndCancellationSuite.Cancellation APIs...`
SPARK-50748     Fix a flaky test: `SparkSessionE2ESuite.interrupt all - background queries, ...`
SPARK-50771     Fix a flaky test: `BlockInfoManagerSuite.SPARK-38675 - concurrent unlock ...`
SPARK-50888     Fix Flaky Test: `SparkConnectServiceSuite.SPARK-44776: LocalTableScanExe`
SPARK-50889     Fix Flaky Test: `SparkSessionE2ESuite.interrupt operation`
SPARK-51019     Fix Flaky Test: `SPARK-47148: AQE should avoid to submit shuffle job on cancellation`
SPARK-51046     `SubExprEliminationBenchmark` fails at `CodeGenerator`

Dongjoon

On 2025/02/11 23:00:46 Dongjoon Hyun wrote:
> Hi, All.
> 
> While the whole community is very busy preparing the Apache Spark 4.0.0 RC,
> SPARK-51166 is created as a new umbrella JIRA issue to embrace
> - all re-targeted JIRA issues
> - upcoming feature links
> 
> For now, it includes the following, and I hope this can be another helpful
> resource for easily identifying what is not part of Apache Spark 4.0 but
> will arrive soon in 2025.
> 
> SPARK-51167 Build and Run Spark on Java 25
> SPARK-51169 Support Python 3.14
> SPARK-51168 Upgrade Hadoop to 3.4.2
> SPARK-51064 Enable `spark.sql.sources.v2.bucketing.enabled` by default
> SPARK-51165 Enable `spark.master.rest.enabled` by default
> SPARK-51073 Remove `Unstable` from `SparkSessionExtensionsProvider`
> SPARK-51155 Make SparkContext show total runtime after stopping
> SPARK-51140 Sort the params before saving
> SPARK-51148 Upgrade `zstd-jni` to 1.5.6-10
> SPARK-51133 Upgrade Apache `commons-pool2` to 2.12.1
> SPARK-51048 Support stop java spark context with exit code
> SPARK-51023 log remote address on RPC exception
> SPARK-50582 Add quote builtin function
> 
> According to the Apache Spark versioning policy, Apache Spark 4.1.0 is
> scheduled roughly six months after the Apache Spark 4.0.0 release.
> 
> https://spark.apache.org/versioning-policy.html
> 
> Please add any missed items to this umbrella JIRA issue
> so they are not forgotten.
> 
> Best Regards,
> Dongjoon.
> 
