Yikun commented on code in PR #36285:
URL: https://github.com/apache/spark/pull/36285#discussion_r858527449
##########
.github/workflows/build_and_test.yml:
##########
@@ -393,9 +397,11 @@ jobs:
sparkr:
needs: [configure-jobs, precondition]
+ # Run scheduled jobs only with JDK 17
+ # Run regular jobs only if sparkr changes exist
Review Comment:
Better to also mention `Apache Spark only` and `for commit in both Apache
Spark and forked repository` here.
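For illustration, the comments might read something like this (hypothetical wording, combining the new notes with the repository scope):

```yaml
sparkr:
  needs: [configure-jobs, precondition]
  # Run scheduled jobs with JDK 17 (Apache Spark only)
  # Run regular jobs (commits in both Apache Spark and forked repositories)
  # only if sparkr changes exist
```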
##########
.github/workflows/build_and_test.yml:
##########
@@ -140,11 +145,10 @@ jobs:
build:
name: "Build modules (${{ format('{0}, {1} job', needs.configure-jobs.outputs.branch, needs.configure-jobs.outputs.type) }}): ${{ matrix.modules }} ${{ matrix.comment }} (JDK ${{ matrix.java }}, ${{ matrix.hadoop }}, ${{ matrix.hive }})"
needs: [configure-jobs, precondition]
- # Run scheduled jobs for Apache Spark only
- # Run regular jobs for commit in both Apache Spark and forked repository
+ # Run for all scheduled jobs
+ # Run for all other jobs only when changes exist
Review Comment:
I think the original note is still valid here, and more useful. WDYT?
##########
.github/workflows/build_and_test.yml:
##########
@@ -140,11 +145,10 @@ jobs:
build:
name: "Build modules (${{ format('{0}, {1} job', needs.configure-jobs.outputs.branch, needs.configure-jobs.outputs.type) }}): ${{ matrix.modules }} ${{ matrix.comment }} (JDK ${{ matrix.java }}, ${{ matrix.hadoop }}, ${{ matrix.hive }})"
needs: [configure-jobs, precondition]
- # Run scheduled jobs for Apache Spark only
- # Run regular jobs for commit in both Apache Spark and forked repository
+ # Run for all scheduled jobs
+ # Run for all other jobs only when changes exist
if: >-
- (github.repository == 'apache/spark' && needs.configure-jobs.outputs.type == 'scheduled')
- || (needs.configure-jobs.outputs.type == 'regular' && fromJson(needs.precondition.outputs.required).build == 'true')
+ needs.configure-jobs.outputs.type == 'scheduled' || fromJson(needs.precondition.outputs.required).build == 'true'
Review Comment:
Better to keep the `regular` condition here (even though it's redundant now);
it would serve as a double check if somebody adds a new type to
`configure-jobs` in the future.
##########
.github/workflows/build_and_test.yml:
##########
@@ -576,6 +582,7 @@ jobs:
java-11-17:
needs: [configure-jobs, precondition]
+ # Run regular jobs only if changes exist
Review Comment:
Better to also mention `commit in both Apache Spark and forked repository` here.
##########
.github/workflows/build_and_test.yml:
##########
@@ -48,6 +48,11 @@ jobs:
configure-jobs:
name: Configure jobs
runs-on: ubuntu-20.04
+ # All other jobs in this workflow depend on this job,
+ # so the entire workflow is skipped when these conditions evaluate to false:
+ # Run all jobs for Apache Spark repository
+ # Run only non-scheduled jobs for forked repositories
Review Comment:
Good notes, but we'd better still keep the notes below mentioning the
fork/upstream trigger scope.
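A possible combined version (hypothetical wording, merging the new notes with the original fork/upstream scope):

```yaml
configure-jobs:
  name: Configure jobs
  runs-on: ubuntu-20.04
  # All other jobs in this workflow depend on this job,
  # so the entire workflow is skipped when these conditions evaluate to false:
  #   - Run all jobs for the Apache Spark (apache/spark) repository
  #   - Run only non-scheduled (regular) jobs for forked repositories
```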
##########
.github/workflows/build_and_test.yml:
##########
@@ -284,12 +288,12 @@ jobs:
pyspark:
needs: [configure-jobs, precondition]
- # Run PySpark coverage scheduled jobs for Apache Spark only
- # Run scheduled jobs with JDK 17 in Apache Spark
- # Run regular jobs for commit in both Apache Spark and forked repository
+ # Run PySpark coverage scheduled jobs
+ # Run scheduled jobs only with JDK 17
+ # Run regular jobs only if pyspark changes exist
Review Comment:
same
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]