HyukjinKwon commented on a change in pull request #29057:
URL: https://github.com/apache/spark/pull/29057#discussion_r452566671
##########
File path: .github/workflows/master.yml
##########
@@ -9,148 +9,233 @@ on:
     - master

 jobs:
+  # TODO(SPARK-32248): Recover JDK 11 builds
+  # Build: build Spark and run the tests for specified modules.
   build:
-
+    name: "Build modules: ${{ matrix.modules }} ${{ matrix.comment }} (JDK ${{ matrix.java }}, ${{ matrix.hadoop }}, ${{ matrix.hive }})"
     runs-on: ubuntu-latest
     strategy:
+      fail-fast: false
       matrix:
-        java: [ '1.8', '11' ]
-        hadoop: [ 'hadoop-2.7', 'hadoop-3.2' ]
-        hive: [ 'hive-1.2', 'hive-2.3' ]
-        exclude:
-        - java: '11'
-          hive: 'hive-1.2'
-        - hadoop: 'hadoop-3.2'
-          hive: 'hive-1.2'
-    name: Build Spark - JDK${{ matrix.java }}/${{ matrix.hadoop }}/${{ matrix.hive }}
-
+        java:
+          - 1.8
+        hadoop:
+          - hadoop3.2
+        hive:
+          - hive2.3
Review comment:
Yeah, I think this isn't possible at the moment. It would be great if we could figure out a way to do it.
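
For context, the lines removed in this hunk used a GitHub Actions build matrix with an `exclude` list to drop unsupported combinations. A minimal sketch of that pattern (job name, step, and echo output here are illustrative, not from the Spark workflow):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      # keep the other matrix jobs running even if one combination fails
      fail-fast: false
      matrix:
        java: [ '1.8', '11' ]
        hive: [ 'hive-1.2', 'hive-2.3' ]
        # drop the unsupported pairing; all other combinations still run
        exclude:
          - java: '11'
            hive: 'hive-1.2'
    steps:
      - run: echo "Building with JDK ${{ matrix.java }} / ${{ matrix.hive }}"
```

With the exclude in place, the matrix expands to three jobs instead of four.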
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]