[ https://issues.apache.org/jira/browse/HADOOP-15711?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16763241#comment-16763241 ]

Jonathan Hung commented on HADOOP-15711:
----------------------------------------

Attached 002 patch:
 * Install openjdk8 in the Dockerfile
 * Set the default to openjdk7
 * Add -Dhttps.protocols=TLSv1.2 to MAVEN_OPTS (hit an issue similar to 
HBASE-21074; see the sketch after the error output below):
{noformat}
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.0 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.0 @
@
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]   
[ERROR]   The project org.apache.hadoop:hadoop-main:2.10.0-SNAPSHOT 
(/build/source/pom.xml) has 1 error
[ERROR]     Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.0 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.0: Could not transfer artifact 
org.apache.felix:maven-bundle-plugin:pom:2.5.0 from/to central 
(https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version 
-> [Help 2]{noformat}
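
For illustration, a rough sketch of what the Dockerfile and MAVEN_OPTS changes could look like (the package names, paths, and MAVEN_OPTS wiring here are my assumptions for readability; the actual change is in the attached 002 patch):
{noformat}
# Sketch only. See HADOOP-15711-branch-2.002.patch for the real change.
RUN apt-get -q update && apt-get -q install -y openjdk-8-jdk openjdk-7-jdk
# Keep JDK7 as the default for branch-2 builds.
ENV JAVA_HOME /usr/lib/jvm/java-7-openjdk-amd64
# Force TLSv1.2 so the JDK7 Maven run can reach repo.maven.apache.org.
ENV MAVEN_OPTS "-Dhttps.protocols=TLSv1.2"
{noformat}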
Also set these configs in the 
[precommit-HADOOP|https://builds.apache.org/view/H-L/view/Hadoop/job/PreCommit-HADOOP-Build/]
 Jenkins job:
{noformat}
YETUS_ARGS+=("--java-home=/usr/lib/jvm/java-8-openjdk-amd64")
YETUS_ARGS+=("--multijdkdirs=/usr/lib/jvm/java-7-openjdk-amd64")
YETUS_ARGS+=("--multijdktests=compile"){noformat}
Assuming all goes well, I will set these configs in 
[precommit-HDFS|https://builds.apache.org/view/H-L/view/Hadoop/job/PreCommit-HDFS-Build/],
 
[precommit-YARN|https://builds.apache.org/view/H-L/view/Hadoop/job/PreCommit-YARN-Build/],
 
[precommit-MAPREDUCE|https://builds.apache.org/view/H-L/view/Hadoop/job/PreCommit-MAPREDUCE-Build/],
 and the nightly [branch-2 
build|https://builds.apache.org/view/H-L/view/Hadoop/job/hadoop-qbt-branch2-java7-linux-x86]
 (and re-enable the latter).

> Fix branch-2 builds
> -------------------
>
>                 Key: HADOOP-15711
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15711
>             Project: Hadoop Common
>          Issue Type: Task
>            Reporter: Jonathan Hung
>            Priority: Critical
>         Attachments: HADOOP-15711-branch-2.002.patch, 
> HADOOP-15711.001.branch-2.patch
>
>
> Branch-2 builds have been disabled for a while: 
> https://builds.apache.org/view/H-L/view/Hadoop/job/hadoop-qbt-branch2-java7-linux-x86/
> A test run here causes hdfs tests to hang: 
> https://builds.apache.org/view/H-L/view/Hadoop/job/hadoop-qbt-branch2-java7-linux-x86-jhung/4/
> Running hadoop-hdfs tests locally reveals some errors such as:
> {noformat}[ERROR] 
> testComplexAppend2(org.apache.hadoop.hdfs.TestFileAppend2)  Time elapsed: 
> 0.059 s  <<< ERROR!
> java.lang.OutOfMemoryError: unable to create new native thread
>         at java.lang.Thread.start0(Native Method)
>         at java.lang.Thread.start(Thread.java:714)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSImage.saveFSImageInAllDirs(FSImage.java:1164)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSImage.saveFSImageInAllDirs(FSImage.java:1128)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:174)
>         at 
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1172)
>         at 
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:403)
>         at 
> org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:234)
>         at 
> org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:1080)
>         at 
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:883)
>         at 
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:514)
>         at 
> org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:473)
>         at 
> org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend(TestFileAppend2.java:489)
>         at 
> org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend2(TestFileAppend2.java:543)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43){noformat}
> I was able to get more tests passing locally by increasing the max user 
> process count on my machine. But the error suggests that there's an issue in 
> the tests themselves. I'm not sure whether the error seen locally is the same 
> reason the Jenkins builds are failing; I wasn't able to confirm, given the 
> Jenkins builds' lack of output.
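
A hedged aside on the quoted note about "increasing the max user process count": locally that usually means raising the nproc ulimit before running the HDFS tests. The value and Maven invocation below are illustrative assumptions, not taken from the issue:
{noformat}
# Illustrative only; the limit value and module path are assumptions.
ulimit -u 16384              # raise the max user processes/threads limit
cd hadoop-hdfs-project/hadoop-hdfs
mvn test -Dtest=TestFileAppend2
{noformat}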


