This is an automated email from the ASF dual-hosted git repository.
ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 78c2c2674b46 [SPARK-46436][INFRA] Clean up compatibility configurations related to branch-3.3 daily testing in `build_and_test.yml`
78c2c2674b46 is described below
commit 78c2c2674b46fcc60e781919550bf1735afc1b85
Author: yangjie01 <[email protected]>
AuthorDate: Mon Dec 18 14:07:02 2023 +0800
[SPARK-46436][INFRA] Clean up compatibility configurations related to branch-3.3 daily testing in `build_and_test.yml`
### What changes were proposed in this pull request?
This PR aims to clean up the compatibility configurations related to branch-3.3 daily testing in `build_and_test.yml`. Since Apache Spark 3.3 has reached EOL, daily testing of that branch is no longer needed.
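For context, the branch-specific steps in `build_and_test.yml` are keyed off a `branch` input that the per-branch daily jobs pass in when they reuse this workflow. The sketch below illustrates that wiring; the exact input declaration, the caller file name, and the cron schedule are illustrative assumptions and are not taken from this commit.

```yaml
# Illustrative sketch only: how `inputs.branch` (checked throughout
# build_and_test.yml) is supplied by a per-branch daily job.
# The input declaration, caller file name, and schedule are assumptions
# for illustration, not part of this commit.

# .github/workflows/build_and_test.yml (reusable workflow)
on:
  workflow_call:
    inputs:
      branch:
        description: Branch to run the build against.
        required: false
        type: string
        default: master
---
# .github/workflows/build_branch34.yml (hypothetical daily caller)
name: "Build (branch-3.4, daily)"
on:
  schedule:
    - cron: '0 4 * * *'
jobs:
  run-build:
    uses: ./.github/workflows/build_and_test.yml
    with:
      branch: branch-3.4
```

With the branch-3.3 daily job gone, the remaining `inputs.branch` checks only need to distinguish branch-3.4 and branch-3.5, which is what the diff below trims them down to.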
### Why are the changes needed?
Apache Spark 3.3 has reached EOL, so there is no need for its daily testing anymore.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Pass GitHub Actions
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #44392 from LuciferYang/3-3-daily.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
---
.github/workflows/build_and_test.yml | 33 ++++++++++-----------------------
1 file changed, 10 insertions(+), 23 deletions(-)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 27d3c86686bb..bdcb1dd1ea5c 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -57,11 +57,7 @@ jobs:
GITHUB_PREV_SHA: ${{ github.event.before }}
outputs:
required: ${{ steps.set-outputs.outputs.required }}
- image_url: >-
- ${{
- (inputs.branch == 'branch-3.3' && 'dongjoon/apache-spark-github-action-image:20220207')
- || steps.infra-image-outputs.outputs.image_url
- }}
+ image_url: ${{ steps.infra-image-outputs.outputs.image_url }}
steps:
- name: Checkout Spark repository
uses: actions/checkout@v4
@@ -292,10 +288,9 @@ jobs:
needs: precondition
# Currently, enable docker build from cache for `master` and branch (since 3.4) jobs
if: >-
- (fromJson(needs.precondition.outputs.required).pyspark == 'true' ||
+ fromJson(needs.precondition.outputs.required).pyspark == 'true' ||
fromJson(needs.precondition.outputs.required).lint == 'true' ||
- fromJson(needs.precondition.outputs.required).sparkr == 'true') &&
- (inputs.branch != 'branch-3.3')
+ fromJson(needs.precondition.outputs.required).sparkr == 'true'
runs-on: ubuntu-latest
permissions:
packages: write
@@ -684,15 +679,7 @@ jobs:
- name: Java linter
run: ./dev/lint-java
- name: Spark connect jvm client mima check
- if: inputs.branch != 'branch-3.3'
run: ./dev/connect-jvm-client-mima-check
- - name: Install Python linter dependencies for branch-3.3
- if: inputs.branch == 'branch-3.3'
- run: |
- # SPARK-44554: Copy from https://github.com/apache/spark/blob/073d0b60d31bf68ebacdc005f59b928a5902670f/.github/workflows/build_and_test.yml#L501-L508
- # Should delete this section after SPARK 3.3 EOL.
- python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.920' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc 'jinja2<3.0.0' 'black==21.12b0'
- python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython
- name: Install Python linter dependencies for branch-3.4
if: inputs.branch == 'branch-3.4'
run: |
@@ -708,7 +695,7 @@ jobs:
python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.982' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc 'jinja2<3.0.0' 'black==22.6.0'
python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython 'grpcio==1.59.3' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0'
- name: Install Python linter dependencies
- if: inputs.branch != 'branch-3.3' && inputs.branch != 'branch-3.4' && inputs.branch != 'branch-3.5'
+ if: inputs.branch != 'branch-3.4' && inputs.branch != 'branch-3.5'
run: |
python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.982' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc jinja2 'black==23.9.1'
python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython 'grpcio==1.59.3' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0'
@@ -729,16 +716,16 @@ jobs:
if: inputs.branch == 'branch-3.5'
run: if test -f ./dev/connect-check-protos.py; then PATH=$PATH:$HOME/buf/bin PYTHON_EXECUTABLE=python3.9 ./dev/connect-check-protos.py; fi
# Should delete this section after SPARK 3.5 EOL.
- - name: Install JavaScript linter dependencies for branch-3.3, branch-3.4, branch-3.5
- if: inputs.branch == 'branch-3.3' || inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
+ - name: Install JavaScript linter dependencies for branch-3.4, branch-3.5
+ if: inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
run: |
apt update
apt-get install -y nodejs npm
- name: JS linter
run: ./dev/lint-js
# Should delete this section after SPARK 3.5 EOL.
- - name: Install R linter dependencies for branch-3.3, branch-3.4, branch-3.5
- if: inputs.branch == 'branch-3.3' || inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
+ - name: Install R linter dependencies for branch-3.4, branch-3.5
+ if: inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
run: |
apt update
apt-get install -y libcurl4-openssl-dev libgit2-dev libssl-dev libxml2-dev \
@@ -749,8 +736,8 @@ jobs:
- name: Install R linter dependencies and SparkR
run: ./R/install-dev.sh
# Should delete this section after SPARK 3.5 EOL.
- - name: Install dependencies for documentation generation for branch-3.3, branch-3.4, branch-3.5
- if: inputs.branch == 'branch-3.3' || inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
+ - name: Install dependencies for documentation generation for branch-3.4, branch-3.5
+ if: inputs.branch == 'branch-3.4' || inputs.branch == 'branch-3.5'
run: |
# pandoc is required to generate PySpark APIs as well in nbsphinx.
apt-get update -y