rdblue commented on a change in pull request #3335:
URL: https://github.com/apache/iceberg/pull/3335#discussion_r734101056
##########
File path: .github/workflows/spark-ci.yml
##########
@@ -98,3 +98,30 @@ jobs:
name: test logs
path: |
**/build/testlogs
+
+ spark32-tests:
+ runs-on: ubuntu-latest
+ strategy:
+ matrix:
+ jvm: [11]
+ spark: ['3.2']
+ env:
+ SPARK_LOCAL_IP: localhost
+ steps:
+ - uses: actions/checkout@v2
+ - uses: actions/setup-java@v1
+ with:
+ java-version: ${{ matrix.jvm }}
+ - uses: actions/cache@v2
+ with:
+ path: ~/.gradle/caches
+ key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle') }}
+ restore-keys: ${{ runner.os }}-gradle
+ - run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
+ - run: ./gradlew -DsparkVersions=${{ matrix.spark }} -DhiveVersions= -DflinkVersions= :iceberg-spark:iceberg-spark_3.2:check :iceberg-spark:iceberg-spark_3.2-extensions:check :iceberg-spark:iceberg-spark_3.2-runtime:check -Pquick=true -x javadoc
Review comment:
It might be? The default is to include Flink 1.13 so that we can run
things like `assemble` to trigger compilation of all modules. We then remove
Flink by explicitly setting the versions to an empty string wherever we don't
want to load the project at all, like in the "core" tests job where we want to
run `check` but exclude Flink. Otherwise I think we'd have to exclude specific
Flink targets or call `check` for each module.
Here, since we are already calling `check` for each module individually, we
may not need it.
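
For context, the gating happens at project-inclusion time in `settings.gradle`, not per-task. A minimal sketch of the idea (the property handling and module names below are simplified and illustrative, not the exact build script):

```groovy
// settings.gradle (illustrative sketch)
// The engine version properties decide which subprojects are loaded at all.
// Leaving -DflinkVersions unset falls back to the default (Flink 1.13), while
// passing -DflinkVersions= (empty) yields an empty list and skips the modules.
String flinkProp = System.getProperty("flinkVersions", "1.13")
List<String> flinkVersions = flinkProp.isEmpty() ? [] : flinkProp.split(",").toList()

if (!flinkVersions.isEmpty()) {
  include ':iceberg-flink'
  // ...additional Flink subprojects would be included here
}
```

With a setup like this, the flag mainly matters for whole-build invocations such as a bare `check` or `assemble`; when the job names each Spark module's `check` task explicitly, as this one does, the Flink projects are never requested, which is why the flag may be unnecessary here.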