[ https://issues.apache.org/jira/browse/HIVE-19562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16478336#comment-16478336 ]

Hive QA commented on HIVE-19562:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923824/HIVE-19562.3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/11017/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11017/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11017/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:22.204
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11017/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:22.208
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:23.334
+ rm -rf ../yetus_PreCommit-HIVE-Build-11017
+ mkdir ../yetus_PreCommit-HIVE-Build-11017
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11017
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11017/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/data/conf/spark/local/hive-site.xml: does not exist in index
error: a/data/conf/spark/standalone/hive-site.xml: does not exist in index
error: a/data/conf/spark/yarn-cluster/hive-site.xml: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5358732397349962615.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5358732397349962615.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2392:5: Decision can match input such as "KW_CHECK {KW_EXISTS, KW_TINYINT}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2392:5: Decision can match input such as "KW_CHECK KW_DATETIME" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2392:5: Decision can match input such as "KW_CHECK KW_DATE {LPAREN, StringLiteral}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2392:5: Decision can match input such as "KW_CHECK KW_STRUCT LESSTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2392:5: Decision can match input such as "KW_CHECK KW_UNIONTYPE LESSTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
[ERROR] Failed to execute goal on project hive-llap-server: Could not resolve dependencies for project org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT: Failed to collect dependencies for [org.apache.hive:hive-exec:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-common:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-llap-common:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-llap-client:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-llap-tez:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-shims:jar:3.1.0-SNAPSHOT (compile), org.apache.hive:hive-serde:jar:3.1.0-SNAPSHOT (compile), commons-codec:commons-codec:jar:1.7 (compile), commons-lang:commons-lang:jar:2.6 (compile), io.netty:netty-all:jar:4.1.17.Final (compile), io.netty:netty:jar:3.10.5.Final (compile), org.apache.avro:avro:jar:1.7.7 (compile), org.apache.thrift:libthrift:jar:0.9.3 (compile), com.tdunning:json:jar:1.8 (compile), org.apache.hadoop:hadoop-common:jar:3.1.0 (compile?), org.apache.hadoop:hadoop-yarn-services-core:jar:3.1.0 (compile?), org.apache.hadoop:hadoop-mapreduce-client-core:jar:3.1.0 (compile?), org.apache.orc:orc-core:jar:1.4.3 (compile), org.apache.tez:tez-runtime-internals:jar:0.9.1 (compile?), org.apache.tez:tez-runtime-library:jar:0.9.1 (compile?), org.codehaus.jettison:jettison:jar:1.1 (compile), org.eclipse.jetty:jetty-server:jar:9.3.8.v20160314 (compile), org.eclipse.jetty:jetty-util:jar:9.3.8.v20160314 (compile), org.apache.hive:hive-standalone-metastore:jar:tests:3.1.0-SNAPSHOT (test), org.apache.hadoop:hadoop-common:jar:tests:3.1.0 (test), org.apache.hadoop:hadoop-hdfs:jar:3.1.0 (test), org.apache.hive:hive-llap-common:jar:tests:3.1.0-SNAPSHOT (compile), org.apache.hadoop:hadoop-hdfs:jar:tests:3.1.0 (test), junit:junit:jar:4.11 (test), org.mockito:mockito-all:jar:1.10.19 (test), com.sun.jersey:jersey-servlet:jar:1.19 (test), org.apache.hbase:hbase-hadoop2-compat:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-client:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-server:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-mapreduce:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-common:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-hadoop-compat:jar:2.0.0-alpha4 (compile), org.slf4j:slf4j-api:jar:1.7.10 (compile)]: Failed to read artifact descriptor for org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots (https://maven.java.net/content/repositories/snapshots): Failed to transfer file: https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom. Return code is: 402 , ReasonPhrase:Payment Required. -> [Help 1]
[ERROR] Failed to execute goal on project hive-hbase-handler: Could not resolve dependencies for project org.apache.hive:hive-hbase-handler:jar:3.1.0-SNAPSHOT: Failed to collect dependencies for [org.apache.hive:hive-exec:jar:3.1.0-SNAPSHOT (compile), commons-lang:commons-lang:jar:2.6 (compile), org.apache.hadoop:hadoop-common:jar:3.1.0 (compile?), org.apache.hadoop:hadoop-mapreduce-client-core:jar:3.1.0 (compile?), org.apache.hbase:hbase-hadoop2-compat:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-client:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-server:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-mapreduce:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-common:jar:2.0.0-alpha4 (compile), org.apache.hbase:hbase-hadoop-compat:jar:2.0.0-alpha4 (compile), org.apache.hadoop:hadoop-hdfs:jar:tests:3.1.0 (test), org.apache.hbase:hbase-hadoop2-compat:jar:tests:2.0.0-alpha4 (compile), org.apache.hbase:hbase-common:jar:tests:2.0.0-alpha4 (test), org.apache.hbase:hbase-server:jar:tests:2.0.0-alpha4 (test), org.apache.hbase:hbase-mapreduce:jar:tests:2.0.0-alpha4 (test), org.apache.hbase:hbase-hadoop-compat:jar:tests:2.0.0-alpha4 (test), org.eclipse.jetty:jetty-runner:jar:9.3.8.v20160314 (test), com.sun.jersey:jersey-servlet:jar:1.19 (test), org.apache.hadoop:hadoop-common:jar:tests:3.1.0 (test), junit:junit:jar:4.11 (test), org.apache.avro:avro:jar:1.7.6 (compile), org.slf4j:slf4j-api:jar:1.7.10 (compile), org.mockito:mockito-all:jar:1.10.19 (test)]: Failed to read artifact descriptor for org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots (https://maven.java.net/content/repositories/snapshots): Failed to transfer file: https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom. Return code is: 402 , ReasonPhrase:Payment Required. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-llap-server
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923824 - PreCommit-HIVE-Build
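
Note: the -1 above looks unrelated to the patch itself. The build dies while resolving org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT, whose descriptor can no longer be fetched from the retired jvnet-nexus-snapshots repository (HTTP 402 Payment Required). One hypothetical workaround, sketched here on the assumption that a released javax.el artifact (e.g. 3.0.1-b11 on Maven Central) is an acceptable substitute, would be to pin the release in the parent pom's dependencyManagement so Maven never tries to resolve the SNAPSHOT descriptor:

{code:xml}
<!-- Sketch only, not part of HIVE-19562.3.patch: pinning a released
     javax.el overrides the transitive 3.0.1-b06-SNAPSHOT version, so
     the dead jvnet-nexus-snapshots repository is never contacted. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.glassfish</groupId>
      <artifactId>javax.el</artifactId>
      <version>3.0.1-b11</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}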

> Flaky test: TestMiniSparkOnYarn FileNotFoundException in spark-submit
> ---------------------------------------------------------------------
>
>                 Key: HIVE-19562
>                 URL: https://issues.apache.org/jira/browse/HIVE-19562
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-19562.1.patch, HIVE-19562.3.patch
>
>
> Seeing sporadic failures during test setup. Specifically, when spark-submit runs, this error (or a similar one) gets thrown:
> {code}
> 2018-05-15T10:55:02,112  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient: Exception in thread "main" java.io.FileNotFoundException: File file:/tmp/spark-56e217f7-b8a5-4c63-9a6b-d737a64f2820/__spark_libs__7371510645900072447.zip does not exist
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:316)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.copyFileToRemote(Client.scala:356)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:478)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:565)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:863)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.Client.run(Client.scala:1146)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1518)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
> 2018-05-15T10:55:02,113  INFO [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] client.SparkSubmitSparkClient:      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
> Essentially, Spark is writing some files for container localization to a tmp
> dir, and that tmp dir is getting deleted. We have seen a lot of issues with
> writing files to {{/tmp/}} in the past, so it's probably best to write these
> files to a test-specific dir, as sketched below.
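>
> A minimal sketch of that direction, assuming the fix goes into the test
> hive-site.xml files the patch touches (data/conf/spark/*/hive-site.xml) and
> that the standard {{test.tmp.dir}} property is available there, would be to
> point {{spark.local.dir}} at a test-owned scratch directory:
> {code:xml}
> <!-- Illustrative sketch: spark.local.dir controls where Spark stages
>      files such as __spark_libs__*.zip before uploading them to YARN,
>      so moving it out of the shared /tmp keeps external tmp cleaners
>      from deleting them mid-submit. -->
> <property>
>   <name>spark.local.dir</name>
>   <value>${test.tmp.dir}/spark-local-dir</value>
> </property>
> {code}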



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
