Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/23118#discussion_r235674477
--- Diff: build/mvn ---
@@ -116,7 +116,8 @@ install_zinc() {
# the build/ folder
install_scala() {
# determine the Scala version used in Spark
-  local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+  local scala_binary_version=`grep "scala.binary.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+  local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | grep ${scala_binary_version} | head -n1 | awk -F '[<>]' '{print $3}'`
--- End diff ---
Thank you for review, @viirya .
Previously, the two values could disagree: `scala.binary.version=2.11` but
`scala.version=2.12.7`.
Now they are kept consistent: `scala.binary.version=2.11` and
`scala.version=2.11.12`.
By default (without `change-scala-version.sh 2.11`), it's
`scala.binary.version=2.12` and `scala.version=2.12.7`.
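To illustrate the change, here is a minimal sketch using a toy POM fragment (a hypothetical temp file, not Spark's actual `pom.xml`, which declares `scala.version` twice: once for the default profile and once for the alternate one). The old pipeline blindly takes the first `scala.version` match; the new one first reads `scala.binary.version` and uses it to filter the candidates:

```shell
#!/bin/sh
# Hypothetical fragment mimicking a pom.xml after `change-scala-version.sh 2.11`:
# scala.binary.version was flipped to 2.11, but the first scala.version entry
# still belongs to the 2.12 profile.
pom=$(mktemp)
cat > "$pom" <<'EOF'
<scala.version>2.12.7</scala.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.12</scala.version>
EOF

# Old extraction: first scala.version match, ignoring the binary version.
old=$(grep "scala.version" "$pom" | head -n1 | awk -F '[<>]' '{print $3}')

# New extraction: read the binary version first, then keep only the
# scala.version entry that starts with it.
binary=$(grep "scala.binary.version" "$pom" | head -n1 | awk -F '[<>]' '{print $3}')
new=$(grep "scala.version" "$pom" | grep "${binary}" | head -n1 | awk -F '[<>]' '{print $3}')

echo "old=$old binary=$binary new=$new"   # old=2.12.7 binary=2.11 new=2.11.12
rm -f "$pom"
```

Note that `grep "scala.version"` does not accidentally match the `scala.binary.version` line, because the regex requires `version` to follow immediately after a single character past `scala`.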
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]