[jira] [Created] (HADOOP-17151) Need to upgrade the version of jetty

2020-07-23 Thread liusheng (Jira)
liusheng created HADOOP-17151:
-

 Summary: Need to upgrade the version of jetty
 Key: HADOOP-17151
 URL: https://issues.apache.org/jira/browse/HADOOP-17151
 Project: Hadoop Common
  Issue Type: Bug
Reporter: liusheng


I have tried to configure and start the Hadoop KMS service, but it failed to 
start. The error log messages are:
{noformat}
2020-07-23 10:57:31,872 INFO  Server - jetty-9.4.20.v20190813; built: 
2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 
1.8.0_252-8u252-b09-1~18.04-b09
2020-07-23 10:57:31,899 INFO  session - DefaultSessionIdManager workerName=node0
2020-07-23 10:57:31,899 INFO  session - No SessionScavenger set, using defaults
2020-07-23 10:57:31,901 INFO  session - node0 Scavenging every 66ms
2020-07-23 10:57:31,912 INFO  ContextHandler - Started 
o.e.j.s.ServletContextHandler@5bf0d49{logs,/logs,file:///opt/hadoop-3.4.0-SNAPSHOT/logs/,AVAILABLE}
2020-07-23 10:57:31,913 INFO  ContextHandler - Started 
o.e.j.s.ServletContextHandler@7c7a06ec{static,/static,jar:file:/opt/hadoop-3.4.0-SNAPSHOT/share/hadoop/common/hadoop-kms-3.4.0-SNAPSHOT.jar!/webapps/static,AVAILABLE}
2020-07-23 10:57:31,986 INFO  TypeUtil - JVM Runtime does not support Modules
2020-07-23 10:57:32,015 INFO  KMSWebApp - 
-
2020-07-23 10:57:32,015 INFO  KMSWebApp -   Java runtime version : 
1.8.0_252-8u252-b09-1~18.04-b09
2020-07-23 10:57:32,015 INFO  KMSWebApp -   User: hadoop
2020-07-23 10:57:32,015 INFO  KMSWebApp -   KMS Hadoop Version: 3.4.0-SNAPSHOT
2020-07-23 10:57:32,015 INFO  KMSWebApp - 
-
2020-07-23 10:57:32,023 INFO  KMSACLs - 'CREATE' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'DELETE' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'ROLLOVER' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'GET' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'GET_KEYS' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'GET_METADATA' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'SET_KEY_MATERIAL' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'GENERATE_EEK' ACL '*'
2020-07-23 10:57:32,024 INFO  KMSACLs - 'DECRYPT_EEK' ACL '*'
2020-07-23 10:57:32,025 INFO  KMSACLs - default.key.acl. for KEY_OP 'READ' is 
set to '*'
2020-07-23 10:57:32,025 INFO  KMSACLs - default.key.acl. for KEY_OP 
'MANAGEMENT' is set to '*'
2020-07-23 10:57:32,025 INFO  KMSACLs - default.key.acl. for KEY_OP 
'GENERATE_EEK' is set to '*'
2020-07-23 10:57:32,025 INFO  KMSACLs - default.key.acl. for KEY_OP 
'DECRYPT_EEK' is set to '*'
2020-07-23 10:57:32,080 INFO  KMSAudit - Initializing audit logger class 
org.apache.hadoop.crypto.key.kms.server.SimpleKMSAuditLogger
2020-07-23 10:57:32,537 INFO  KMSWebServer - SHUTDOWN_MSG:
/
SHUTDOWN_MSG: Shutting down KMSWebServer at 
hadoop-benchmark/172.17.0.2{noformat}
I have googled the error and found a similar issue: 
[https://github.com/eclipse/jetty.project/issues/4064]

It looks like a Jetty bug that has been fixed in Jetty >= 9.4.21; Hadoop 
currently uses Jetty 9.4.20, see hadoop-project/pom.xml.
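
As an illustration only, the change this implies would be a bump of the Jetty 
version managed in hadoop-project/pom.xml. A minimal sketch, assuming the 
version is controlled there via a jetty.version property (the target release 
below is just an example of a release at or above 9.4.21):
{code:xml}
<!-- hadoop-project/pom.xml (sketch): bump the managed Jetty version.
     9.4.21.v20190926 is only an example of a >= 9.4.21 release; a newer
     9.4.x patch release may be preferable. -->
<properties>
  <jetty.version>9.4.21.v20190926</jetty.version>
</properties>
{code}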






[jira] [Resolved] (HADOOP-16780) Track unstable tests according to aarch CI due to OOM

2019-12-29 Thread liusheng (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-16780?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

liusheng resolved HADOOP-16780.
---
Resolution: Fixed

> Track unstable tests according to aarch CI due to OOM
> -
>
> Key: HADOOP-16780
> URL: https://issues.apache.org/jira/browse/HADOOP-16780
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: liusheng
>Priority: Major
>
> ||Test||Duration||Build||
> |[org.apache.hadoop.hdfs.TestDFSClientRetries.testLeaseRenewSocketTimeout|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestDFSClientRetries/testLeaseRenewSocketTimeout/]|1.9 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksum1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksum1/]|2 min 53 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksum3|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksum3/]|2 min 13 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery4|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery4/]|1 min 32 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery5|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery5/]|11 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery6|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery6/]|3.8 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery7|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery7/]|7.7 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery8|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery8/]|3.7 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksum.testStripedFileChecksumWithMissedDataBlocksRangeQuery9|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksum/testStripedFileChecksumWithMissedDataBlocksRangeQuery9/]|4 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> |[org.apache.hadoop.hdfs.TestFileChecksumCompositeCrc.testStripedFileChecksumWithMissedDataBlocksRangeQuery6|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/testReport/org.apache.hadoop.hdfs/TestFileChecksumCompositeCrc/testStripedFileChecksumWithMissedDataBlocksRangeQuery6/]|10 sec|[1|https://builds.apache.org/job/Hadoop-qbt-linux-ARM-trunk/55/]|
> 

[jira] [Created] (HADOOP-16614) Missing leveldbjni package of aarch64 platform

2019-09-28 Thread liusheng (Jira)
liusheng created HADOOP-16614:
-

 Summary: Missing leveldbjni package of aarch64 platform
 Key: HADOOP-16614
 URL: https://issues.apache.org/jira/browse/HADOOP-16614
 Project: Hadoop Common
  Issue Type: Bug
Reporter: liusheng


Currently, Hadoop depends on the *leveldbjni-all:1.8* package, but it does not 
support the ARM (aarch64) platform.
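
As one possible direction (a sketch, not a confirmed fix), a Maven profile 
could select an aarch64-capable fork of leveldbjni when building on ARM by 
resolving the dependency's groupId through a property. The 
org.openlabtesting.leveldbjni coordinates below are an assumption of a fork 
that publishes aarch64 native binaries for the same 1.8 API:
{code:xml}
<!-- Sketch only: pick the leveldbjni groupId via a property so aarch64
     builds can swap in a fork that ships ARM64 native libraries. -->
<properties>
  <leveldbjni.group>org.fusesource.leveldbjni</leveldbjni.group>
</properties>

<profiles>
  <profile>
    <id>aarch64</id>
    <activation>
      <os>
        <arch>aarch64</arch>
      </os>
    </activation>
    <properties>
      <!-- assumed fork coordinates; not a confirmed project decision -->
      <leveldbjni.group>org.openlabtesting.leveldbjni</leveldbjni.group>
    </properties>
  </profile>
</profiles>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>${leveldbjni.group}</groupId>
      <artifactId>leveldbjni-all</artifactId>
      <version>1.8</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}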






[jira] [Created] (HADOOP-16603) Lack of aarch64 platform support of dependent PhantomJS

2019-09-25 Thread liusheng (Jira)
liusheng created HADOOP-16603:
-

 Summary: Lack of aarch64 platform support of dependent PhantomJS
 Key: HADOOP-16603
 URL: https://issues.apache.org/jira/browse/HADOOP-16603
 Project: Hadoop Common
  Issue Type: Bug
Reporter: liusheng


Hadoop depends on the "PhantomJS-2.1.1"[1] library and imports it via 
"phantomjs-maven-plugin:0.7", but there is no aarch64 artifact of phantomjs in 
the "com.github.klieber" group used by Hadoop[2].

[1] [https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L1703-L1707]

[2] [https://search.maven.org/artifact/com.github.klieber/phantomjs/2.1.1/N%2FA]
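
As a rough stop-gap sketch (an assumption, not the actual resolution), the 
phantomjs-maven-plugin declaration could be wrapped in a Maven profile that 
activates on every architecture except aarch64, so ARM builds do not try to 
resolve the missing binary. The profile layout is hypothetical; the plugin 
coordinates are the ones referenced above, and its existing executions and 
configuration would be carried over unchanged:
{code:xml}
<!-- Sketch only: skip the PhantomJS download on aarch64 by activating the
     plugin on all other architectures (Maven OS activation supports "!"
     negation). Executions/configuration are omitted here; they would move
     in unchanged from the current pom.xml. -->
<profiles>
  <profile>
    <id>phantomjs-supported</id>
    <activation>
      <os>
        <arch>!aarch64</arch>
      </os>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>com.github.klieber</groupId>
          <artifactId>phantomjs-maven-plugin</artifactId>
          <version>0.7</version>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
{code}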


