Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/22441#discussion_r218458224
--- Diff: dev/create-release/release-build.sh ---
@@ -414,15 +437,15 @@ if [[ "$1" == "publish-release" ]]; then
-DskipTests $PUBLISH_PROFILES $SCALA_2_10_PROFILES clean install
fi
- #./dev/change-scala-version.sh 2.12
- #$MVN -DzincPort=$ZINC_PORT -Dmaven.repo.local=$tmp_repo \
- # -DskipTests $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean install
+ if ! is_dry_run && [[ $PUBLISH_SCALA_2_12 = 1 ]]; then
+ ./dev/change-scala-version.sh 2.12
+ $MVN -DzincPort=$((ZINC_PORT + 2)) -Dmaven.repo.local=$tmp_repo -Dscala-2.12 \
+ -DskipTests $PUBLISH_PROFILES $SCALA_2_12_PROFILES clean install
+ fi
--- End diff ---
I still think we want a cleanup step that always changes the Scala version
back to 2.11, which is the 'default' right now.
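Just as a sketch of what I mean (the function name and the trap-based placement
are illustrative only, not a concrete proposal for where it goes in the script):

  # Always restore the repo to the default Scala version (2.11),
  # even if the 2.12 publish block above fails or is skipped.
  restore_default_scala_version() {
    ./dev/change-scala-version.sh 2.11
  }
  trap restore_default_scala_version EXIT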
---