Jira: https://issues.apache.org/jira/browse/OOZIE-3112
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/197/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1.61 MB...]
+1 There are no new bugs found in [sharelib/oozie].
+1 There are no new bugs found in [sharelib/hcatalog].
+1 There are no new bugs found in [sharelib/streaming].
+1 There are no new bugs found in [sharelib/hive].
+1 There are no new bugs found in [sharelib/sqoop].
-1 There are [3] new bugs found below threshold in [sharelib/spark] that must be fixed.
You can find the FindBugs diff here (look for the red and orange ones):
sharelib/spark/findbugs-new.html
The most important FindBugs errors are:
At SparkArgsExtractor.java:[line 376]: Found reliance on default encoding in org.apache.oozie.action.hadoop.SparkArgsExtractor.loadProperties(File, Properties): new java.io.FileReader(File)
At SparkArgsExtractor.java:[line 364]: Found reliance on default encoding in org.apache.oozie.action.hadoop.SparkArgsExtractor.mergeAndAddPropetiesFile(List, String): new java.io.FileWriter(String)
At SparkArgsExtractor.java:[line 359]: File(...) reads a file whose location might be specified by user input
At SparkArgsExtractor.java:[line 147]: At SparkArgsExtractor.java:[line 144]
At SparkArgsExtractor.java:[line 298]: At SparkArgsExtractor.java:[line 190]
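
Note: the two default-encoding findings above (FindBugs pattern DM_DEFAULT_ENCODING) are normally resolved by passing an explicit charset instead of relying on new FileReader(File) / new FileWriter(String). A minimal sketch of that fix, not taken from the patch (class and method names here are hypothetical):

    import java.io.File;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.Writer;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.util.Properties;

    // Hypothetical helper showing explicit UTF-8 instead of the platform default encoding.
    public class PropertiesIoSketch {

        // Counterpart of the flagged new FileReader(File) call.
        static Properties load(final File file) throws IOException {
            final Properties props = new Properties();
            try (Reader reader = Files.newBufferedReader(file.toPath(), StandardCharsets.UTF_8)) {
                props.load(reader);
            }
            return props;
        }

        // Counterpart of the flagged new FileWriter(String) call.
        static void store(final Properties props, final String path) throws IOException {
            try (Writer writer = Files.newBufferedWriter(new File(path).toPath(), StandardCharsets.UTF_8)) {
                props.store(writer, null);
            }
        }
    }
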
+1 There are no new bugs found in [sharelib/hive2].
+1 There are no new bugs found in [sharelib/pig].
+1 BACKWARDS_COMPATIBILITY
+1 the patch does not change any JPA Entity/Column/Basic/Lob/Transient annotations
+1 the patch does not modify JPA files
-1 TESTS
Tests run: 2055
Tests failed: 2
Tests errors: 1
The patch failed the following testcases:
testSetupMethodsWithSparkConfiguration(org.apache.oozie.action.hadoop.TestSparkActionExecutor)
testQuotedDriverAndExecutorExtraJavaOptionsParsing(org.apache.oozie.action.hadoop.TestSparkArgsExtractor)
Tests failing with errors:
testJMXInstrumentation(org.apache.oozie.util.TestMetricsInstrumentation)
+1 DISTRO
+1 distro tarball builds with the patch
----------------------------
-1 Overall result, please check the reported -1(s)
There is at least one warning, please check
The full output of the test-patch run is available at https://builds.apache.org/job/PreCommit-OOZIE-Build/197/
Adding comment to JIRA
Comment added.
test-patch exit code: 1
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-3112
Archiving artifacts
[Fast Archiver] Compressed 1.82 MB of artifacts by 24.1% relative to #164
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED: org.apache.oozie.util.TestMetricsInstrumentation.testJMXInstrumentation
Error Message:
Unable to open socket file: target process not responding or HotSpot VM not loaded
Stack Trace:
com.sun.tools.attach.AttachNotSupportedException: Unable to open socket file: target process not responding or HotSpot VM not loaded
at sun.tools.attach.LinuxVirtualMachine.<init>(LinuxVirtualMachine.java:106)
at sun.tools.attach.LinuxAttachProvider.attachVirtualMachine(LinuxAttachProvider.java:78)
at com.sun.tools.attach.VirtualMachine.attach(VirtualMachine.java:250)
at org.apache.oozie.util.TestMetricsInstrumentation.testJMXInstrumentation(TestMetricsInstrumentation.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at junit.framework.TestCase.runTest(TestCase.java:176)
at junit.framework.TestCase.runBare(TestCase.java:141)
at junit.framework.TestResult$1.protect(TestResult.java:122)
at junit.framework.TestResult.runProtected(TestResult.java:142)
at junit.framework.TestResult.run(TestResult.java:125)
at junit.framework.TestCase.run(TestCase.java:129)
at junit.framework.TestSuite.runTest(TestSuite.java:255)
at junit.framework.TestSuite.run(TestSuite.java:250)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
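
For context, the trace shows testJMXInstrumentation attaching to a JVM through the Attach API (the VirtualMachine.attach frame above). A minimal self-attach sketch, assuming JDK 8 with tools.jar on the classpath (class name hypothetical, not the test's code), showing where AttachNotSupportedException surfaces:

    import java.lang.management.ManagementFactory;

    import com.sun.tools.attach.AttachNotSupportedException;
    import com.sun.tools.attach.VirtualMachine;

    // Hypothetical self-attach sketch; requires tools.jar on a JDK 8 classpath.
    public class AttachSketch {
        public static void main(final String[] args) throws Exception {
            // RuntimeMXBean name is "<pid>@<host>" on HotSpot.
            final String pid = ManagementFactory.getRuntimeMXBean().getName().split("@")[0];
            try {
                final VirtualMachine vm = VirtualMachine.attach(pid);
                System.out.println("Attached to VM " + vm.id());
                vm.detach();
            } catch (final AttachNotSupportedException e) {
                // The failure mode reported above: the attach socket file of the target
                // process cannot be opened (unresponsive process, cleaned temp dir, etc.).
                e.printStackTrace();
            }
        }
    }
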
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethodsWithSparkConfiguration
Error Message:
expected:<3> but was:<1>
Stack Trace:
junit.framework.AssertionFailedError: expected:<3> but was:<1>
at junit.framework.Assert.fail(Assert.java:57)
at junit.framework.Assert.failNotEquals(Assert.java:329)
at junit.framework.Assert.assertEquals(Assert.java:78)
at junit.framework.Assert.assertEquals(Assert.java:234)
at junit.framework.Assert.assertEquals(Assert.java:241)
at junit.framework.TestCase.assertEquals(TestCase.java:409)
at org.apache.oozie.action.hadoop.TestSparkActionExecutor._testSetupMethods(TestSparkActionExecutor.java:149)
at org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethodsWithSparkConfiguration(TestSparkActionExecutor.java:102)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at junit.framework.TestCase.runTest(TestCase.java:176)
at junit.framework.TestCase.runBare(TestCase.java:141)
at junit.framework.TestResult$1.protect(TestResult.java:122)
at junit.framework.TestResult.runProtected(TestResult.java:142)
at junit.framework.TestResult.run(TestResult.java:125)
at junit.framework.TestCase.run(TestCase.java:129)
at junit.framework.TestSuite.runTest(TestSuite.java:255)
at junit.framework.TestSuite.run(TestSuite.java:250)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
FAILED: org.apache.oozie.action.hadoop.TestSparkArgsExtractor.testQuotedDriverAndExecutorExtraJavaOptionsParsing
Error Message:
Spark args mismatch expected:<[--master, yarn, --deploy-mode, client, --name,
Spark Copy File, --class, org.apache.oozie.example.SparkFileCopy, --conf,
spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.driver.extraJavaOptions=-Xmn2703m -XX:SurvivorRatio=2
-XX:ParallelGCThreads=20 -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.executor.extraClassPath=$PWD/*, --conf,
spark.driver.extraClassPath=$PWD/*, --conf,
spark.yarn.security.tokens.hadoopfs.enabled=false, --conf,
spark.yarn.security.tokens.hive.enabled=false, --conf,
spark.yarn.security.tokens.hbase.enabled=false, --conf,
spark.yarn.security.credentials.hadoopfs.enabled=false, --conf,
spark.yarn.security.credentials.hive.enabled=false, --conf,
spark.yarn.security.credentials.hbase.enabled=false, --files,
spark-log4j.properties,hive-site.xml, --conf, spark.yarn.jar=null, --verbose,
/lib/test.jar, arg0, arg1]> but was:<[--master, yarn, --deploy-mode, client,
--name, Spark Copy File, --class, org.apache.oozie.example.SparkFileCopy,
--conf, spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.driver.extraJavaOptions=-Xmn2703m -XX:SurvivorRatio=2
-XX:ParallelGCThreads=20 -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.executor.extraClassPath=$PWD/*, --conf,
spark.driver.extraClassPath=$PWD/*, --conf,
spark.yarn.security.tokens.hadoopfs.enabled=false, --conf,
spark.yarn.security.tokens.hive.enabled=false, --conf,
spark.yarn.security.tokens.hbase.enabled=false, --conf,
spark.yarn.security.credentials.hadoopfs.enabled=false, --conf,
spark.yarn.security.credentials.hive.enabled=false, --conf,
spark.yarn.security.credentials.hbase.enabled=false, --properties-file,
spark-defaults-generated.properties, --files,
spark-log4j.properties,hive-site.xml, --conf, spark.yarn.jar=null, --verbose,
/lib/test.jar, arg0, arg1]>
Stack Trace:
java.lang.AssertionError: Spark args mismatch expected:<[--master, yarn,
--deploy-mode, client, --name, Spark Copy File, --class,
org.apache.oozie.example.SparkFileCopy, --conf,
spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.driver.extraJavaOptions=-Xmn2703m -XX:SurvivorRatio=2
-XX:ParallelGCThreads=20 -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.executor.extraClassPath=$PWD/*, --conf,
spark.driver.extraClassPath=$PWD/*, --conf,
spark.yarn.security.tokens.hadoopfs.enabled=false, --conf,
spark.yarn.security.tokens.hive.enabled=false, --conf,
spark.yarn.security.tokens.hbase.enabled=false, --conf,
spark.yarn.security.credentials.hadoopfs.enabled=false, --conf,
spark.yarn.security.credentials.hive.enabled=false, --conf,
spark.yarn.security.credentials.hbase.enabled=false, --files,
spark-log4j.properties,hive-site.xml, --conf, spark.yarn.jar=null, --verbose,
/lib/test.jar, arg0, arg1]> but was:<[--master, yarn, --deploy-mode, client,
--name, Spark Copy File, --class, org.apache.oozie.example.SparkFileCopy,
--conf, spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.driver.extraJavaOptions=-Xmn2703m -XX:SurvivorRatio=2
-XX:ParallelGCThreads=20 -Dlog4j.configuration=spark-log4j.properties, --conf,
spark.executor.extraClassPath=$PWD/*, --conf,
spark.driver.extraClassPath=$PWD/*, --conf,
spark.yarn.security.tokens.hadoopfs.enabled=false, --conf,
spark.yarn.security.tokens.hive.enabled=false, --conf,
spark.yarn.security.tokens.hbase.enabled=false, --conf,
spark.yarn.security.credentials.hadoopfs.enabled=false, --conf,
spark.yarn.security.credentials.hive.enabled=false, --conf,
spark.yarn.security.credentials.hbase.enabled=false, --properties-file,
spark-defaults-generated.properties, --files,
spark-log4j.properties,hive-site.xml, --conf, spark.yarn.jar=null, --verbose,
/lib/test.jar, arg0, arg1]>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.apache.oozie.action.hadoop.TestSparkArgsExtractor.testQuotedDriverAndExecutorExtraJavaOptionsParsing(TestSparkArgsExtractor.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
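
The long expected/actual dump above appears to differ in exactly one place: the actual argument list additionally carries a "--properties-file spark-defaults-generated.properties" pair, presumably appended by the patched SparkArgsExtractor. A hypothetical sketch with trimmed-down stand-in lists (not the test's code) that isolates that surplus pair:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Hypothetical sketch; the lists are shortened stand-ins for the ones in the report.
    public class SparkArgsDiffSketch {
        public static void main(final String[] args) {
            final List<String> expected = new ArrayList<>(Arrays.asList(
                    "--conf", "spark.yarn.security.credentials.hbase.enabled=false",
                    "--files", "spark-log4j.properties,hive-site.xml"));
            final List<String> actual = new ArrayList<>(Arrays.asList(
                    "--conf", "spark.yarn.security.credentials.hbase.enabled=false",
                    "--properties-file", "spark-defaults-generated.properties",
                    "--files", "spark-log4j.properties,hive-site.xml"));

            // Dropping every element that also occurs in the expectation leaves only the surplus pair.
            actual.removeAll(expected);
            System.out.println(actual); // [--properties-file, spark-defaults-generated.properties]
        }
    }
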