This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new f0d9f993fc3e [SPARK-55386][INFRA] Run `Java 17/25` Maven install tests
on PR build only
f0d9f993fc3e is described below
commit f0d9f993fc3e584e28db7c03eaaa494d4b8ce8e1
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Feb 5 22:01:13 2026 -0800
[SPARK-55386][INFRA] Run `Java 17/25` Maven install tests on PR build only
### What changes were proposed in this pull request?
This PR aims to run the `Java 17` and `Java 25` Maven **build and install**
tests on the PR builder only. In other words, they will run only on repositories
other than `apache/spark`.
Since we still have daily Maven jobs, it is okay to skip these **build and
install** tests on commit builders.
### Why are the changes needed?
To meet the **ASF Policy**:
**1. 20 JOB RULE**
> All workflows MUST have a job concurrency level less than or equal to 20.
This means a workflow cannot have more than 20 jobs running at the same time
across all matrices.
Currently, our CI triggers more than 20 jobs per workflow run, although not
all of them run at the same time.
<img width="552" height="67" alt="Screenshot 2026-02-05 at 20 49 58"
src="https://github.com/user-attachments/assets/38b6733f-7cd4-46ad-948f-25b1dcfcd492"
/>
<img width="787" height="70" alt="Screenshot 2026-02-05 at 20 38 47"
src="https://github.com/user-attachments/assets/2d6d01bc-c522-4d6f-be7c-5cad59520d83"
/>
**2. RUNNING TIME RULE**
> The average number of minutes a project uses per calendar week MUST NOT
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
https://infra-reports.apache.org/#ghactions&project=spark&hours=168
This PR also reduces the CI cost by skipping the `Java 17` and `Java 25` Maven
builds on commit builders.
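As a rough sanity check on the quoted limit (my own arithmetic, not taken from the policy text): 25 full-time runners over a 168-hour calendar week come to 4,200 runner-hours, i.e. 252,000 minutes, which matches the "250,000 minutes, or 4,200 hours" figure cited above up to rounding.

```python
# Arithmetic behind the ASF running-time limit quoted above:
# 25 full-time runners, each busy for a full calendar week.
runners = 25
hours_per_week = 7 * 24                  # 168 hours in a calendar week
runner_hours = runners * hours_per_week  # weekly budget in runner-hours
runner_minutes = runner_hours * 60       # same budget in runner-minutes

print(runner_hours)    # 4200 runner-hours per week
print(runner_minutes)  # 252000 minutes, ~ the 250,000 figure cited
```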
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manual review.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #54170 from dongjoon-hyun/SPARK-55386.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.github/workflows/build_and_test.yml | 8 ++++++--
1 file changed, 6 insertions(+), 2 deletions(-)
diff --git a/.github/workflows/build_and_test.yml
b/.github/workflows/build_and_test.yml
index 0d17919a0b54..96740e1f0e9e 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -105,6 +105,8 @@ jobs:
buf=true
ui=true
docs=true
+ java17=true
+ java25=true
else
pyspark_install=false
pandas=false
@@ -116,6 +118,8 @@ jobs:
buf=false
ui=false
docs=false
+ java17=false
+ java25=false
fi
build=`./dev/is-changed.py -m
"core,unsafe,kvstore,avro,utils,utils-java,network-common,network-shuffle,repl,launcher,examples,sketch,variant,api,catalyst,hive-thriftserver,mllib-local,mllib,graphx,streaming,sql-kafka-0-10,streaming-kafka-0-10,streaming-kinesis-asl,kubernetes,hadoop-cloud,spark-ganglia-lgpl,profiler,protobuf,yarn,connect,sql,hive,pipelines"`
precondition="
@@ -128,8 +132,8 @@ jobs:
\"tpcds-1g\": \"$tpcds\",
\"docker-integration-tests\": \"$docker\",
\"lint\" : \"true\",
- \"java17\" : \"$build\",
- \"java25\" : \"$build\",
+ \"java17\" : \"$java17\",
+ \"java25\" : \"$java25\",
\"docs\" : \"$docs\",
\"yarn\" : \"$yarn\",
\"k8s-integration-tests\" : \"$kubernetes\",
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]