This is an automated email from the ASF dual-hosted git repository.

prashant pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new a693960  [SPARK-32556][INFRA][2.4] Fix release script to URL encode the user provided passwords
a693960 is described below

commit a69396079db77ae0383b83963674f8213e2b0460
Author: Prashant Sharma <prash...@in.ibm.com>
AuthorDate: Fri Aug 7 11:58:46 2020 +0530

    [SPARK-32556][INFRA][2.4] Fix release script to URL encode the user provided passwords
    
    ### What changes were proposed in this pull request?
    1. URL encode the `ASF_PASSWORD` of the release manager.
    2. Force delete the `.gitignore` file, as it may be absent, especially on subsequent runs of the script, which otherwise causes the release script to fail.
    3. Update the Docker image to install the qpdf and jq dependencies.
    4. Increase the JVM heap memory for the Maven build.
    
    ### Why are the changes needed?
    The release script takes hours to run, and if a single failure happens somewhere midway, one either has to finish the remaining steps manually or re-run the entire script. So this change fixes a few of the failures discovered so far.
    
    1. If the release manager's password contains a character that is not allowed in a URL, the build fails at the clone-spark step:
    `git clone "https://$ASF_USERNAME:$ASF_PASSWORD@$ASF_SPARK_REPO" -b $GIT_BRANCH`
    
              ^^^ Fails with bad URL
    
    `ASF_USERNAME` may not need URL encoding, but `ASF_PASSWORD` does.
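
    For illustration, the jq-based percent-encoder added by this patch behaves as follows (the password value here is a made-up example, not a real credential):

    ```shell
    # Helper mirroring the patch: percent-encode a value with jq's @uri filter.
    uriencode() { jq -nSRr --arg v "$1" '$v|@uri'; }

    # A made-up password containing URL-reserved characters:
    uriencode 'p@ss:word/1'   # -> p%40ss%3Aword%2F1
    ```

    Note that jq must be installed for this to work, hence the Dockerfile change in this patch.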
    
    2. If the `.gitignore` file is missing, the build fails at the `rm .gitignore` step.
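
    As a quick check of the `-f` behavior (in a throwaway directory, purely illustrative):

    ```shell
    # `rm` exits non-zero when the target is absent; `rm -f` treats it as a no-op.
    tmpdir="$(mktemp -d)"
    rm -f "$tmpdir/.gitignore" && echo "ok: rm -f succeeded on a missing file"
    rmdir "$tmpdir"
    ```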
    
    3. The Maven build ran into out-of-memory issues:
    
    ```
    [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [ 57.801 s]
    [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [01:13 min]
    [INFO] Spark Kinesis Integration .......................... SUCCESS [ 59.862 s]
    [INFO] Spark Project Examples ............................. SUCCESS [02:10 min]
    [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 45.454 s]
    [INFO] Spark Avro ......................................... SUCCESS [01:58 min]
    [INFO] Spark Project External Flume Sink .................. SUCCESS [01:34 min]
    [INFO] Spark Project External Flume ....................... FAILURE [20:04 min]
    [INFO] Spark Project External Flume Assembly .............. SKIPPED
    [INFO] Spark Project Kinesis Assembly 2.4.7 ............... SKIPPED
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 54:10 min
    [INFO] Finished at: 2020-08-06T10:10:12Z
    [INFO] ------------------------------------------------------------------------
    [ERROR] Java heap space -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
    
    ```
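
    The heap failure above is what the larger JVM settings address; as a sketch, the extra Maven flags amount to (values taken from the diff below):

    ```shell
    # Sketch: give the Maven JVM more heap and code cache, as the patch does.
    MVN_EXTRA_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=1g"
    MVN="build/mvn $MVN_EXTRA_OPTS"   # build/mvn is Spark's Maven wrapper
    echo "$MVN"
    ```
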
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    
    By running the release for branch-2.4, using both types of passwords, i.e. with and without special characters.
    
    For other branches, follow-up PRs will target those branches.
    
    Closes #29371 from ScrapCodes/release-script-fixs.
    
    Lead-authored-by: Prashant Sharma <prash...@in.ibm.com>
    Co-authored-by: Prashant Sharma <prash...@apache.org>
    Signed-off-by: Prashant Sharma <prash...@apache.org>
---
 dev/create-release/release-build.sh    | 2 +-
 dev/create-release/release-tag.sh      | 6 +++++-
 dev/create-release/release-util.sh     | 2 +-
 dev/create-release/spark-rm/Dockerfile | 2 ++
 4 files changed, 9 insertions(+), 3 deletions(-)

diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
index 3c287fd..db9bd5e 100755
--- a/dev/create-release/release-build.sh
+++ b/dev/create-release/release-build.sh
@@ -172,7 +172,7 @@ fi
 DEST_DIR_NAME="$SPARK_PACKAGE_VERSION"
 
 git clean -d -f -x
-rm .gitignore
+rm -f .gitignore
 cd ..
 
 if [[ "$1" == "package" ]]; then
diff --git a/dev/create-release/release-tag.sh b/dev/create-release/release-tag.sh
index 517c7f7..630c549 100755
--- a/dev/create-release/release-tag.sh
+++ b/dev/create-release/release-tag.sh
@@ -64,8 +64,12 @@ init_maven_sbt
 
 ASF_SPARK_REPO="gitbox.apache.org/repos/asf/spark.git"
 
+function uriencode { jq -nSRr --arg v "$1" '$v|@uri'; }
+
+declare -r ENCODED_ASF_PASSWORD=$(uriencode "$ASF_PASSWORD")
+
 rm -rf spark
-git clone "https://$ASF_USERNAME:$ASF_PASSWORD@$ASF_SPARK_REPO" -b $GIT_BRANCH
+git clone "https://$ASF_USERNAME:$ENCODED_ASF_PASSWORD@$ASF_SPARK_REPO" -b $GIT_BRANCH
 cd spark
 
 git config user.name "$GIT_NAME"
diff --git a/dev/create-release/release-util.sh b/dev/create-release/release-util.sh
index 7a2ffa7..cec6a8c 100755
--- a/dev/create-release/release-util.sh
+++ b/dev/create-release/release-util.sh
@@ -225,7 +225,7 @@ function init_maven_sbt {
   if [[ $JAVA_VERSION < "1.8." ]]; then
     # Needed for maven central when using Java 7.
     SBT_OPTS="-Dhttps.protocols=TLSv1.1,TLSv1.2"
-    MVN_EXTRA_OPTS="-Dhttps.protocols=TLSv1.1,TLSv1.2"
+    MVN_EXTRA_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=1g -Dhttps.protocols=TLSv1.1,TLSv1.2"
     MVN="$MVN $MVN_EXTRA_OPTS"
   fi
   export MVN MVN_EXTRA_OPTS SBT_OPTS
diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
index dc8b10d..951bb51 100644
--- a/dev/create-release/spark-rm/Dockerfile
+++ b/dev/create-release/spark-rm/Dockerfile
@@ -60,6 +60,8 @@ RUN apt-get clean && apt-get update && $APT_INSTALL gnupg ca-certificates apt-tr
   $APT_INSTALL nodejs && \
  # Install needed python packages. Use pip for installing packages (for consistency).
   $APT_INSTALL libpython2.7-dev libpython3-dev python-pip python3-pip && \
+  # qpdf is required for CRAN checks to pass.
+  $APT_INSTALL qpdf jq && \
   pip install --upgrade pip && hash -r pip && \
   pip install setuptools && \
   pip install $BASE_PIP_PKGS && \

