Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/23118#discussion_r235679113
--- Diff: build/mvn ---
@@ -116,7 +116,8 @@ install_zinc() {
# the build/ folder
install_scala() {
# determine the Scala version used in Spark
- local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+ local scala_binary_version=`grep "scala.binary.version" "${_DIR}/../pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+ local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | grep ${scala_binary_version} | head -n1 | awk -F '[<>]' '{print $3}'`
--- End diff ---
I asked the question above because I think that, even after the current change, `scala.version` and `scala.binary.version` can still be mismatched in pom.xml, can't they?
Or do we just need the current change to download and install the correct Scala, and that is enough?
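For reference, the two-step extraction in the patch can be sketched against a minimal, hypothetical pom.xml fragment (the version values below are illustrative, not Spark's actual ones):

```shell
#!/bin/sh
# Sketch of the patch's version-extraction pipeline, run against a
# hypothetical pom.xml fragment written to a temp file.
pom=$(mktemp)
cat > "$pom" <<'EOF'
<project>
  <properties>
    <scala.version>2.12.7</scala.version>
    <scala.binary.version>2.12</scala.binary.version>
  </properties>
</project>
EOF

# First extract the binary version (text between < and > is field 3 for awk).
scala_binary_version=$(grep "scala.binary.version" "$pom" | head -n1 | awk -F '[<>]' '{print $3}')

# Then pick the scala.version line whose value contains the binary version,
# so the full version agrees with the binary version (e.g. 2.12.x for 2.12).
scala_version=$(grep "scala.version" "$pom" | grep "${scala_binary_version}" | head -n1 | awk -F '[<>]' '{print $3}')

echo "binary=${scala_binary_version} full=${scala_version}"
# prints: binary=2.12 full=2.12.7
rm -f "$pom"
```

Note the second `grep "${scala_binary_version}"` is what ties the two properties together: if `scala.version` did not start with the binary version, the pipeline would find no matching line at all, which is the mismatch scenario the comment asks about.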
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]