This is an automated email from the ASF dual-hosted git repository.
yangjie01 pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.5 by this push:
new acccf53b5793 [SPARK-50212][BUILD][3.5] Fix the conditional check for executing the `build` task in `build_and_test.yml`
acccf53b5793 is described below
commit acccf53b579348f84d342eb26bc083b21ca25a3d
Author: yangjie01 <[email protected]>
AuthorDate: Tue Nov 5 14:13:53 2024 +0800
[SPARK-50212][BUILD][3.5] Fix the conditional check for executing the `build` task in `build_and_test.yml`
### What changes were proposed in this pull request?
Comparing
https://github.com/apache/spark/blob/c53dac05058c48ae1edad7912e8cc82533839ca0/.github/workflows/build_and_test.yml#L102
with
https://github.com/apache/spark/blob/9d472661daad4703628e9fbf0ba9922abeed7354/.github/workflows/build_and_test.yml#L97
shows that the master branch uses `dev/is-changed.py` to check for changes in additional
modules: `variant`, `api`, `streaming-kinesis-asl`, `protobuf`, and `connect`.
Among these, the `api`, `protobuf`, and `connect` modules also exist in the `b
[...]
Therefore, this PR includes the following changes:
1. Adds `is-changed` checks for the `api`, `protobuf`, and `connect`
modules.
2. In `dev/sparktestsupport/modules.py`, adds the definition for the `api`
module, aligning with the master branch.
3. In `dev/sparktestsupport/modules.py`, adds the definition for the
`utils` module, aligning with the master branch. Prior to this PR, although
`dev/is-changed.py` was used to check for changes in the `utils` module, its
definition was missing from `dev/sparktestsupport/modules.py`.
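Each entry in `dev/sparktestsupport/modules.py` declares a module's dependencies and the source-path regexes it owns, and `dev/is-changed.py` uses that registry to decide whether changed files affect a given module. The following is a simplified, hypothetical Python sketch of that mechanism (names and logic are illustrative, not the actual Spark implementation): a module is affected either directly, through its own source regexes, or transitively, through a dependency.

```python
import re
from dataclasses import dataclass, field


@dataclass
class Module:
    name: str
    dependencies: list = field(default_factory=list)
    source_file_regexes: list = field(default_factory=list)

    def owns(self, path: str) -> bool:
        # A module owns a file when any of its source regexes matches the path.
        return any(re.match(r, path) for r in self.source_file_regexes)


def affected(module, changed_files):
    # A module is affected when it owns a changed file, or when any module it
    # depends on is affected (so editing common/utils/ also triggers `api`,
    # which depends on `utils`).
    return any(module.owns(f) for f in changed_files) or any(
        affected(dep, changed_files) for dep in module.dependencies
    )


utils = Module("utils", source_file_regexes=["common/utils/"])
api = Module("api", dependencies=[utils], source_file_regexes=["sql/api/"])

print(affected(api, ["sql/api/src/main/scala/Foo.scala"]))  # True (direct)
print(affected(api, ["common/utils/src/Bar.scala"]))        # True (via utils)
print(affected(utils, ["sql/api/src/Baz.scala"]))           # False
```

This is why the missing `utils` definition mattered: without it in the registry, dependents such as `api` could not be linked back to changes under `common/utils/`.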
### Why are the changes needed?
Fix the conditional check for executing the `build` task in
`build_and_test.yml`
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- Pass GitHub Actions
- Manually verified the effectiveness of this pull request:
Before:
Set the reference commit by executing `export
APACHE_SPARK_REF=9d472661daad4703628e9fbf0ba9922abeed7354`. Then manually edit
and commit files in the `api`, `protobuf`, or `connect` modules, and run the
following command:
```
./dev/is-changed.py -m "core,unsafe,kvstore,avro,utils,network-common,network-shuffle,repl,launcher,examples,sketch,graphx,catalyst,hive-thriftserver,streaming,sql-kafka-0-10,streaming-kafka-0-10,mllib-local,mllib,yarn,mesos,kubernetes,hadoop-cloud,spark-ganglia-lgpl,sql,hive,connect,protobuf,api"
```
The console will print `false`.
After:
Set the reference commit by executing `export
APACHE_SPARK_REF=41446b3d98cfccf5c6f6ddb8bc3c7c6c1b1c3f54`. Then manually edit
and commit files in the `api`, `protobuf`, or `connect` modules, and run the
same command as before:
```
./dev/is-changed.py -m "core,unsafe,kvstore,avro,utils,network-common,network-shuffle,repl,launcher,examples,sketch,graphx,catalyst,hive-thriftserver,streaming,sql-kafka-0-10,streaming-kafka-0-10,mllib-local,mllib,yarn,mesos,kubernetes,hadoop-cloud,spark-ganglia-lgpl,sql,hive,connect,protobuf,api"
```
The console will now print `true`.
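The before/after difference can be mimicked with a small, hypothetical sketch (module names and regexes are illustrative, not the real `dev/is-changed.py` logic): when `api` is absent from the checked module list, a change under `sql/api/` goes undetected.

```python
import re

# Illustrative mapping from module name to owned source-path regexes; in Spark
# this information lives in dev/sparktestsupport/modules.py.
MODULE_REGEXES = {
    "sql": ["sql/core/"],
    "api": ["sql/api/"],
}


def is_changed(module_names, changed_files):
    # True when any changed file falls under a regex of any listed module.
    regexes = [r for m in module_names for r in MODULE_REGEXES.get(m, [])]
    return any(re.match(r, f) for r in regexes for f in changed_files)


changed = ["sql/api/src/main/scala/ColumnName.scala"]
print(is_changed(["sql"], changed))         # `api` not in the list -> False
print(is_changed(["sql", "api"], changed))  # `api` checked as well -> True
```

This mirrors the fix: adding `connect,protobuf,api` to the `-m` argument in `build_and_test.yml` makes changes in those modules flip the `build` precondition to `true`.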
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #48744 from LuciferYang/is-change-3.5.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
---
.github/workflows/build_and_test.yml | 2 +-
dev/sparktestsupport/modules.py | 24 ++++++++++++++++++++----
2 files changed, 21 insertions(+), 5 deletions(-)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index b016a29a86be..7ec8f94a6292 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -94,7 +94,7 @@ jobs:
             tpcds=false
             docker=false
           fi
-          build=`./dev/is-changed.py -m "core,unsafe,kvstore,avro,utils,network-common,network-shuffle,repl,launcher,examples,sketch,graphx,catalyst,hive-thriftserver,streaming,sql-kafka-0-10,streaming-kafka-0-10,mllib-local,mllib,yarn,mesos,kubernetes,hadoop-cloud,spark-ganglia-lgpl,sql,hive"`
+          build=`./dev/is-changed.py -m "core,unsafe,kvstore,avro,utils,network-common,network-shuffle,repl,launcher,examples,sketch,graphx,catalyst,hive-thriftserver,streaming,sql-kafka-0-10,streaming-kafka-0-10,mllib-local,mllib,yarn,mesos,kubernetes,hadoop-cloud,spark-ganglia-lgpl,sql,hive,connect,protobuf,api"`
           precondition="
             {
               \"build\": \"$build\",
diff --git a/dev/sparktestsupport/modules.py b/dev/sparktestsupport/modules.py
index d29fc8726018..5df59476007a 100644
--- a/dev/sparktestsupport/modules.py
+++ b/dev/sparktestsupport/modules.py
@@ -113,6 +113,14 @@ tags = Module(
     ],
 )
 
+utils = Module(
+    name="utils",
+    dependencies=[tags],
+    source_file_regexes=[
+        "common/utils/",
+    ],
+)
+
 kvstore = Module(
     name="kvstore",
     dependencies=[tags],
@@ -126,7 +134,7 @@ kvstore = Module(
 
 network_common = Module(
     name="network-common",
-    dependencies=[tags],
+    dependencies=[tags, utils],
     source_file_regexes=[
         "common/network-common/",
     ],
@@ -148,7 +156,7 @@ network_shuffle = Module(
 
 unsafe = Module(
     name="unsafe",
-    dependencies=[tags],
+    dependencies=[tags, utils],
     source_file_regexes=[
         "common/unsafe",
     ],
@@ -179,7 +187,7 @@ sketch = Module(
 
 core = Module(
     name="core",
-    dependencies=[kvstore, network_common, network_shuffle, unsafe, launcher],
+    dependencies=[kvstore, network_common, network_shuffle, unsafe, launcher, utils],
     source_file_regexes=[
         "core/",
     ],
@@ -188,9 +196,17 @@ core = Module(
     ],
 )
 
+api = Module(
+    name="api",
+    dependencies=[utils, unsafe],
+    source_file_regexes=[
+        "sql/api/",
+    ],
+)
+
 catalyst = Module(
     name="catalyst",
-    dependencies=[tags, sketch, core],
+    dependencies=[tags, sketch, core, api],
     source_file_regexes=[
         "sql/catalyst/",
     ],
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]