Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23098#discussion_r235198775
  
    --- Diff: dev/create-release/release-build.sh ---
    @@ -110,16 +110,18 @@ fi
     # Depending on the version being built, certain extra profiles need to be activated, and
     # different versions of Scala are supported.
     BASE_PROFILES="-Pmesos -Pyarn"
    -PUBLISH_SCALA_2_10=0
    -SCALA_2_10_PROFILES="-Pscala-2.10"
    -SCALA_2_11_PROFILES=
    +
    +# TODO: revisit for Scala 2.13
    +
    +PUBLISH_SCALA_2_11=1
    --- End diff ---
    
    @vanzin @cloud-fan you may want to look at this. It's getting a little hairy in this script.
    
    I recall that the goal was to use this script to create older Spark releases, so it needs logic for older versions. But looking at it, I don't think it actually creates quite the same release as older versions anyway. Is it OK to clean house here and assume only Spark 3 will be built from this script? I already deleted some really old logic here (Spark < 2.2).
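    
    For context, a minimal sketch of the kind of version-gated profile selection this script accumulates; the SPARK_VERSION check and exact profile lists below are assumptions for illustration, not the actual release-build.sh logic:
    
        # Sketch only: per-version branching of Scala publish flags and profiles.
        # SPARK_VERSION and the profile values here are illustrative assumptions.
        case "$SPARK_VERSION" in
          2.*)
            # Spark 2.x: Scala 2.11 is the default publish target.
            PUBLISH_SCALA_2_11=1
            SCALA_2_12_PROFILES="-Pscala-2.12"
            ;;
          3.*)
            # Spark 3.x: only Scala 2.12 is published.
            # TODO: revisit for Scala 2.13
            PUBLISH_SCALA_2_11=0
            SCALA_2_12_PROFILES="-Pscala-2.12"
            ;;
        esac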


---
