Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/23118#discussion_r235799947
--- Diff: build/mvn ---
@@ -116,7 +116,8 @@ install_zinc() {
# the build/ folder
install_scala() {
# determine the Scala version used in Spark
- local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+ local scala_binary_version=`grep "scala.binary.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+ local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | grep ${scala_binary_version} | head -n1 | awk -F '[<>]' '{print $3}'`
--- End diff ---
@viirya. I think you are confused about the meaning of `scala.version`.
In maven/sbt, `scala.version` is controlled by the profile `-Pscala-2.11`,
but `build/mvn` doesn't know about that profile. That's the
problem this PR wants to fix.
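To make the difference concrete, here is a minimal sketch of the old versus the patched extraction logic. The pom fragment and the version numbers in it are hypothetical, chosen only to mimic a tree where `scala.binary.version` has been switched to 2.11 while the first `<scala.version>` property in the file still carries a 2.12 default; the actual Spark `pom.xml` is larger.

```shell
#!/bin/sh
# Hypothetical pom.xml fragment (assumed values, for illustration only):
# the default properties block still says Scala 2.12, while the
# scala-2.11 profile carries the 2.11 full version.
cat > /tmp/pom_demo.xml <<'EOF'
<project>
  <properties>
    <scala.version>2.12.7</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>
  <profiles>
    <profile>
      <id>scala-2.11</id>
      <properties>
        <scala.version>2.11.12</scala.version>
      </properties>
    </profile>
  </profiles>
</project>
EOF

# Old logic: take the first <scala.version> in the file. This ignores
# the profile entirely and returns the 2.12 default.
old=`grep "scala.version" /tmp/pom_demo.xml | head -n1 | awk -F '[<>]' '{print $3}'`
echo "old: $old"    # old: 2.12.7

# Patched logic: read scala.binary.version first, then keep only the
# <scala.version> line whose value matches that binary version.
binary=`grep "scala.binary.version" /tmp/pom_demo.xml | head -n1 | awk -F '[<>]' '{print $3}'`
new=`grep "scala.version" /tmp/pom_demo.xml | grep ${binary} | head -n1 | awk -F '[<>]' '{print $3}'`
echo "new: $new"    # new: 2.11.12
```

The second `grep ${binary}` is what ties the extracted full version to the binary version actually recorded in the pom, instead of to whichever `<scala.version>` happens to appear first.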
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]