This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 32ba5c1db62c [SPARK-48133][INFRA] Run `sparkr` only in PR builders and Daily CIs
32ba5c1db62c is described below

commit 32ba5c1db62caaaa2674e8acced56f89ed840bf9
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Sun May 5 13:19:23 2024 -0700

    [SPARK-48133][INFRA] Run `sparkr` only in PR builders and Daily CIs

    ### What changes were proposed in this pull request?

    This PR aims to run `sparkr` only in PR builders and Daily Python CIs. In other words, only the commit builder will skip it by default.

    ### Why are the changes needed?

    To reduce GitHub Action usage to meet ASF INFRA policy.
    - https://infra.apache.org/github-actions-policy.html

    > All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices.

    ### Does this PR introduce _any_ user-facing change?

    No.

    ### How was this patch tested?

    Manual review.

    ### Was this patch authored or co-authored using generative AI tooling?

    No.

    Closes #46389 from dongjoon-hyun/SPARK-48133.
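The gating logic this change introduces can be sketched as a small standalone shell script. This is a simplified illustration, not the actual workflow step: the real logic lives in the `set-outputs` step of `.github/workflows/build_and_test.yml` and calls `./dev/is-changed.py -m sparkr` on forks, which is stubbed out here as a plain `true`.

```shell
# Sketch of the `sparkr` gating added by this change (assumptions:
# GITHUB_REPOSITORY is set as in GitHub Actions; the fork default below
# is a placeholder).
GITHUB_REPOSITORY="${GITHUB_REPOSITORY:-some-fork/spark}"

if [ "$GITHUB_REPOSITORY" != "apache/spark" ]; then
  # PR builder (fork): in the real workflow this is
  # `./dev/is-changed.py -m sparkr`, stubbed to `true` here.
  sparkr=true
else
  # Commit builder on apache/spark: skip sparkr by default to help
  # stay under the ASF 20-concurrent-job policy.
  sparkr=false
fi

echo "sparkr=$sparkr"
```

Running it without `GITHUB_REPOSITORY` set (i.e. as a fork) prints `sparkr=true`; with `GITHUB_REPOSITORY=apache/spark` it prints `sparkr=false`, matching the new `else` branch in the diff.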
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .github/workflows/build_and_test.yml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index c87e8921b48e..f626cd72be15 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -76,17 +76,17 @@ jobs:
         id: set-outputs
         run: |
           if [ -z "${{ inputs.jobs }}" ]; then
-            pyspark=true; sparkr=true;
             pyspark_modules=`cd dev && python -c "import sparktestsupport.modules as m; print(','.join(m.name for m in m.all_modules if m.name.startswith('pyspark')))"`
             pyspark=`./dev/is-changed.py -m $pyspark_modules`
             if [[ "${{ github.repository }}" != 'apache/spark' ]]; then
               pandas=$pyspark
               kubernetes=`./dev/is-changed.py -m kubernetes`
+              sparkr=`./dev/is-changed.py -m sparkr`
             else
               pandas=false
               kubernetes=false
+              sparkr=false
             fi
-            sparkr=`./dev/is-changed.py -m sparkr`
             # 'build' is always true for now.
             # It does not save significant time and most of PRs trigger the build.
             precondition="

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org