hadoop-yetus commented on PR #6535:
URL: https://github.com/apache/hadoop/pull/6535#issuecomment-1933237018

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  |
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  |
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 13 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 59s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   8m  7s |  |  trunk passed with JDK Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   7m 23s |  |  trunk passed with JDK Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   2m  0s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  12m  9s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   4m 33s |  |  trunk passed with JDK Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 50s |  |  trunk passed with JDK Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |  16m 29s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  34m 51s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 21s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  17m  0s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   7m 47s |  |  the patch passed with JDK Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   7m 47s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   7m 16s |  |  the patch passed with JDK Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   7m 16s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/blanks-eol.txt) |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   1m 56s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   6m 38s |  |  the patch passed  |
   | -1 :x: |  shellcheck  |   0m  0s | [/results-shellcheck.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/results-shellcheck.txt) |  The patch generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0)  |
   | +1 :green_heart: |  javadoc  |   4m 22s |  |  the patch passed with JDK Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 57s |  |  the patch passed with JDK Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | -1 :x: |  spotbugs  |  16m 31s | [/new-spotbugs-root.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/new-spotbugs-root.html) |  root generated 19 new + 0 unchanged - 0 fixed = 19 total (was 0)  |
   | +1 :green_heart: |  shadedclient  |  19m 16s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | -1 :x: |  unit  | 628m 32s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/patch-unit-root.txt) |  root in the patch failed.  |
   | -1 :x: |  asflicense  |   0m 48s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/results-asflicense.txt) |  The patch generated 18 ASF License warnings.  |
   |  |   | 826m 36s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | SpotBugs | module:root |
   |  |  Random object created and used only once in org.apache.hadoop.compat.AbstractHdfsCompatCase.getUniquePath(Path)  At AbstractHdfsCompatCase.java:only once in org.apache.hadoop.compat.AbstractHdfsCompatCase.getUniquePath(Path)  At AbstractHdfsCompatCase.java:[line 60] |
   |  |  org.apache.hadoop.compat.HdfsCompatEnvironment.getStoragePolicyNames() may expose internal representation by returning HdfsCompatEnvironment.defaultStoragePolicyNames  At HdfsCompatEnvironment.java:by returning HdfsCompatEnvironment.defaultStoragePolicyNames  At HdfsCompatEnvironment.java:[line 115] |
   |  |  Call to method of static java.text.DateFormat in org.apache.hadoop.compat.HdfsCompatEnvironment.init()  At HdfsCompatEnvironment.java:java.text.DateFormat in org.apache.hadoop.compat.HdfsCompatEnvironment.init()  At HdfsCompatEnvironment.java:[line 64] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.HdfsCompatShellScope.readLines(File):in org.apache.hadoop.compat.HdfsCompatShellScope.readLines(File): new java.io.FileReader(File)  At HdfsCompatShellScope.java:[line 356] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.HdfsCompatibility.printReport(HdfsCompatReport, OutputStream):in org.apache.hadoop.compat.HdfsCompatibility.printReport(HdfsCompatReport, OutputStream): new java.io.OutputStreamWriter(OutputStream)  At HdfsCompatibility.java:[line 209] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.HdfsCompatibility.printReport(HdfsCompatReport, OutputStream):in org.apache.hadoop.compat.HdfsCompatibility.printReport(HdfsCompatReport, OutputStream): String.getBytes()  At HdfsCompatibility.java:[line 208] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatCreate.createFile():in org.apache.hadoop.compat.cases.function.HdfsCompatCreate.createFile(): String.getBytes()  At HdfsCompatCreate.java:[line 100] |
   |  |  Random object created and used only once in org.apache.hadoop.compat.cases.function.HdfsCompatLocal.prepare()  At HdfsCompatLocal.java:only once in org.apache.hadoop.compat.cases.function.HdfsCompatLocal.prepare()  At HdfsCompatLocal.java:[line 54] |
   |  |  Random object created and used only once in org.apache.hadoop.compat.cases.function.HdfsCompatTpcds.create()  At HdfsCompatTpcds.java:only once in org.apache.hadoop.compat.cases.function.HdfsCompatTpcds.create()  At HdfsCompatTpcds.java:[line 56] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.getXAttr():in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.getXAttr(): String.getBytes()  At HdfsCompatXAttr.java:[line 57] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.getXAttrs():in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.getXAttrs(): String.getBytes()  At HdfsCompatXAttr.java:[line 65] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.listXAttrs():in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.listXAttrs(): String.getBytes()  At HdfsCompatXAttr.java:[line 77] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.removeXAttr():in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.removeXAttr(): String.getBytes()  At HdfsCompatXAttr.java:[line 87] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.setXAttr():in org.apache.hadoop.compat.cases.function.HdfsCompatXAttr.setXAttr(): String.getBytes()  At HdfsCompatXAttr.java:[line 48] |
   |  |  Found reliance on default encoding in org.apache.hadoop.compat.cases.implement.HdfsCompatFileSystemImpl.lambda$setXAttr$77():in org.apache.hadoop.compat.cases.implement.HdfsCompatFileSystemImpl.lambda$setXAttr$77(): String.getBytes()  At HdfsCompatFileSystemImpl.java:[line 611] |
   |  |  org.apache.hadoop.compat.suites.HdfsCompatSuiteForAll.getApiCases() may expose internal representation by returning HdfsCompatSuiteForAll.API_CASES  At HdfsCompatSuiteForAll.java:by returning HdfsCompatSuiteForAll.API_CASES  At HdfsCompatSuiteForAll.java:[line 61] |
   |  |  org.apache.hadoop.compat.suites.HdfsCompatSuiteForAll.getShellCases() may expose internal representation by returning HdfsCompatSuiteForAll.SHELL_CASES  At HdfsCompatSuiteForAll.java:by returning HdfsCompatSuiteForAll.SHELL_CASES  At HdfsCompatSuiteForAll.java:[line 66] |
   |  |  org.apache.hadoop.compat.suites.HdfsCompatSuiteForShell.getShellCases() may expose internal representation by returning HdfsCompatSuiteForShell.SHELL_CASES  At HdfsCompatSuiteForShell.java:by returning HdfsCompatSuiteForShell.SHELL_CASES  At HdfsCompatSuiteForShell.java:[line 50] |
   |  |  org.apache.hadoop.compat.suites.HdfsCompatSuiteForTpcds.getApiCases() may expose internal representation by returning HdfsCompatSuiteForTpcds.API_CASES  At HdfsCompatSuiteForTpcds.java:by returning HdfsCompatSuiteForTpcds.API_CASES  At HdfsCompatSuiteForTpcds.java:[line 37] |
   | Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |
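
   Editor's note: the 19 new SpotBugs findings above fall into a few well-known bug patterns: reliance on the platform default encoding (`String.getBytes()`, `new FileReader(File)`, `new OutputStreamWriter(OutputStream)`), a `java.util.Random` created and used only once, getters returning an internal array, and calls through a static `java.text.DateFormat`. The sketch below is not code from this patch; it only illustrates, with hypothetical class and method names, how these patterns are commonly resolved.

   ```java
   import java.io.File;
   import java.io.IOException;
   import java.nio.charset.StandardCharsets;
   import java.nio.file.Files;
   import java.time.LocalDateTime;
   import java.time.format.DateTimeFormatter;
   import java.util.Arrays;
   import java.util.List;
   import java.util.concurrent.ThreadLocalRandom;

   /** Hypothetical examples of the usual remediations, not code from PR #6535. */
   public class SpotBugsFixSketch {

     private static final String[] API_CASES = {"create", "mkdir", "rename"};

     // DM_DEFAULT_ENCODING: pass an explicit charset instead of the platform default.
     static byte[] toBytes(String s) {
       return s.getBytes(StandardCharsets.UTF_8);
     }

     // DM_DEFAULT_ENCODING: read text files with a declared charset as well.
     static List<String> readLines(File file) throws IOException {
       return Files.readAllLines(file.toPath(), StandardCharsets.UTF_8);
     }

     // DMI_RANDOM_USED_ONLY_ONCE: reuse a shared generator (ThreadLocalRandom here)
     // rather than constructing a new java.util.Random for a single call.
     static String uniqueSuffix() {
       return Long.toHexString(ThreadLocalRandom.current().nextLong());
     }

     // EI_EXPOSE_REP: hand out a defensive copy instead of the internal array.
     static String[] getApiCases() {
       return Arrays.copyOf(API_CASES, API_CASES.length);
     }

     // STCAL: java.text.DateFormat is not thread-safe when shared statically;
     // the immutable java.time.format.DateTimeFormatter avoids the issue.
     static String timestamp() {
       return LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
     }
   }
   ```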
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6535 |
   | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient xmllint shellcheck shelldocs spotbugs checkstyle |
   | uname | Linux 3c19973f82c5 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / d822ff5dae91e367d3bcae21c0e8bb03d38a8ca6 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/testReport/ |
   | Max. process+thread count | 4784 (vs. ulimit of 5500) |
   | modules | C: . hadoop-compat-bench U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 shellcheck=0.7.0 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   



