[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17788119#comment-17788119 ] Ayush Saxena commented on HADOOP-18957:
---
Committed to trunk. Thanx [~pj.fanning] for the contribution!!!

> Use StandardCharsets.UTF_8 constant
> ---
>
> Key: HADOOP-18957
> URL: https://issues.apache.org/jira/browse/HADOOP-18957
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: PJ Fanning
> Assignee: PJ Fanning
> Priority: Major
> Labels: pull-request-available
>
> * there are some places in the code that have to check for
> UnsupportedCharsetException when explicitly using the charset name "UTF-8"
> * using StandardCharsets.UTF_8 is more efficient because the Java libs
> usually have to look up the charsets when you provide it as String param
> instead
> * also stop using Guava Charsets and use StandardCharsets

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
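The rationale in the description can be sketched as a before/after; the class and method names below are illustrative only, not taken from the Hadoop patch:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class CharsetExample {
    // Before: resolves the charset by its String name on each call and can,
    // in principle, fail with UnsupportedCharsetException at runtime.
    static byte[] encodeOld(String s) {
        return s.getBytes(Charset.forName("UTF-8"));
    }

    // After: references the compile-time constant; no name lookup and no
    // runtime failure path for callers to handle.
    static byte[] encodeNew(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Both paths produce identical bytes for the same input.
        System.out.println(Arrays.equals(encodeOld("h\u00e9llo"), encodeNew("h\u00e9llo")));
    }
}
```

The same reasoning covers the Guava `Charsets.UTF_8` references the issue mentions: since Java 7, `java.nio.charset.StandardCharsets` makes the Guava constants redundant.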
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17788118#comment-17788118 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1819576205

Failed test passes locally
```
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.9 s - in org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0
```
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17788117#comment-17788117 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn merged PR #6231: URL: https://github.com/apache/hadoop/pull/6231
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787587#comment-17787587 ] ASF GitHub Bot commented on HADOOP-18957:
---
hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817828183

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 33s | | Docker mode activated. |
| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 4s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 90 new or modified test files. |
| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 2s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 31m 0s | | trunk passed |
| +1 :green_heart: | compile | 16m 4s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 14m 50s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 35s | | trunk passed |
| +1 :green_heart: | mvnsite | 18m 36s | | trunk passed |
| +1 :green_heart: | javadoc | 8m 26s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 35s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 67m 35s | | trunk passed |
| +1 :green_heart: | shadedclient | 60m 14s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 44s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 47m 21s | | the patch passed |
| +1 :green_heart: | compile | 15m 45s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 15m 45s | | the patch passed |
| +1 :green_heart: | compile | 14m 48s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 14m 48s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 25s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/6/artifact/out/results-checkstyle-root.txt) | root: The patch generated 9 new + 3004 unchanged - 26 fixed = 3013 total (was 3030) |
| +1 :green_heart: | mvnsite | 13m 24s | | the patch passed |
| +1 :green_heart: | javadoc | 8m 18s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 31s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 76m 32s | | the patch passed |
| +1 :green_heart: | shadedclient | 60m 31s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ |
| -1 :x: | unit | 738m 57s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/6/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 1m 20s | | The patch does not generate ASF License warnings. |
| | | | 1191m 37s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/6/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6231 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 3d952d5678a7 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 218321b665716aa887f83693540e4bc017ad8a50 |
| Default Java | Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| Multi-JDK versions |
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787501#comment-17787501 ] ASF GitHub Bot commented on HADOOP-18957:
---
pjfanning commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817543679

The list of 11 checkstyle issues is largely pre-existing issues.

With the indentation issues, the highlighted lines match the surrounding lines in terms of style. With the `charsetUTF8` variables, those var names were pre-existing and I think it is better not to change the names.

In https://github.com/apache/hadoop/pull/6231/commits/218321b665716aa887f83693540e4bc017ad8a50, I have tried to fix one line length issue and a pre-existing indentation issue in a test.
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787498#comment-17787498 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817535488

There is a link with the checkstyle warnings, just click on it.

Some 11 lines: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/5/artifact/out/results-checkstyle-root.txt
```
./hadoop-common-project/hadoop-registry/src/main/java/org/apache/hadoop/registry/client/impl/zk/RegistrySecurity.java:299: digestAuthData = authPair.getBytes(StandardCharsets.UTF_8);: 'block' child has incorrect indentation level 10, expected level should be 8. [Indentation]
./hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/protocol/datatransfer/sasl/TestSaslDataTransfer.java:204: DFSTestUtil.readFile(fs, PATH).getBytes(StandardCharsets.UTF_8));: 'DFSTestUtil' has incorrect indentation level 6, expected level should be 8. [Indentation]
./hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/TaskLog.java:117: StandardCharsets.UTF_8));: 'StandardCharsets' has incorrect indentation level 6, expected level should be 8. [Indentation]
./hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/test/java/org/apache/hadoop/mapred/TestConcatenatedCompressedInput.java:299: new String(uncompressedBuf, 0, numBytesUncompressed, StandardCharsets.UTF_8);: 'new' has incorrect indentation level 8, expected level should be 10. [Indentation]
./hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples/dancing/DistributedPentomino.java:144: (fs.create(input), 64*1024), StandardCharsets.UTF_8));: 'operator new lparen' has incorrect indentation level 22, expected level should be 6. [Indentation]
./hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples/dancing/DistributedPentomino.java:144: (fs.create(input), 64*1024), StandardCharsets.UTF_8));:23: '(' should be on the previous line. [MethodParamPad]
./hadoop-tools/hadoop-gridmix/src/main/java/org/apache/hadoop/mapred/gridmix/CompressionEmulationUtil.java:103: private static final Charset charsetUTF8 = StandardCharsets.UTF_8;:32: Name 'charsetUTF8' must match pattern '^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$'. [ConstantName]
./hadoop-tools/hadoop-gridmix/src/main/java/org/apache/hadoop/mapred/gridmix/DistributedCacheEmulator.java:117: private static final Charset charsetUTF8 = StandardCharsets.UTF_8;:32: Name 'charsetUTF8' must match pattern '^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$'. [ConstantName]
./hadoop-tools/hadoop-gridmix/src/main/java/org/apache/hadoop/mapred/gridmix/GenerateDistCacheData.java:100: private static final Charset charsetUTF8 = StandardCharsets.UTF_8;:32: Name 'charsetUTF8' must match pattern '^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$'. [ConstantName]
./hadoop-tools/hadoop-streaming/src/main/java/org/apache/hadoop/streaming/StreamBaseRecordReader.java:107: String recordStr = new String(record, start, Math.min(len, statusMaxRecordChars_), StandardCharsets.UTF_8);: Line is longer than 100 characters (found 113). [LineLength]
./hadoop-tools/hadoop-streaming/src/test/java/org/apache/hadoop/streaming/TestUnconsumedInput.java:61: out.write(input.getBytes(StandardCharsets.UTF_8));: 'for' child has incorrect indentation level 8, expected level should be 6. [Indentation]
```
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787496#comment-17787496 ] ASF GitHub Bot commented on HADOOP-18957:
---
pjfanning commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817533605

> yetus reports some checkstyle warnings

@ayushtkn do you have tips and tricks for how to find checkstyle issues caused by a PR? This PR is pretty big, touching 100s of files, and there are 1000s of pre-existing checkstyle issues. I tried concentrating on going through LineLength and UnusedImports issues, but there are 100s of them, and after spending a while on them I hadn't yet reached one that was caused by me.

To be honest, I'm not really in a good position time-wise to read through such a large checkstyle output. I code mainly in Scala, and tools like scalafmt will not only spot checkstyle issues, they will fix them too.
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787484#comment-17787484 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817500126

yetus reports some checkstyle warnings
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787472#comment-17787472 ] ASF GitHub Bot commented on HADOOP-18957:
---
hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1817472686

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 51s | | Docker mode activated. |
| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 90 new or modified test files. |
| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 28s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 35m 18s | | trunk passed |
| +1 :green_heart: | compile | 18m 7s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 16m 30s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 52s | | trunk passed |
| +1 :green_heart: | mvnsite | 18m 52s | | trunk passed |
| +1 :green_heart: | javadoc | 8m 44s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 29s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 68m 53s | | trunk passed |
| +1 :green_heart: | shadedclient | 66m 58s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 43s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 52m 24s | | the patch passed |
| +1 :green_heart: | compile | 17m 46s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 17m 46s | | the patch passed |
| +1 :green_heart: | compile | 16m 34s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 16m 34s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 56s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/5/artifact/out/results-checkstyle-root.txt) | root: The patch generated 11 new + 3012 unchanged - 15 fixed = 3023 total (was 3027) |
| +1 :green_heart: | mvnsite | 14m 21s | | the patch passed |
| +1 :green_heart: | javadoc | 8m 46s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 31s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 77m 29s | | the patch passed |
| +1 :green_heart: | shadedclient | 67m 5s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ |
| -1 :x: | unit | 779m 14s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/5/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 1m 26s | | The patch does not generate ASF License warnings. |
| | | | 1264m 12s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |
| | hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/5/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6231 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 61d7e9963ebe 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 1f89c99ec6ae61e9af91d5a6c06999ea448f616a |
| Default Java | Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| Multi-JDK versions |
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787206#comment-17787206 ] ASF GitHub Bot commented on HADOOP-18957:
---
pjfanning commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397322918

## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/web/ServletUtils.java:
```diff
@@ -24,14 +24,15 @@
 import javax.servlet.http.HttpServletRequest;
 import java.io.IOException;
 import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
 import java.util.List;

 /**
  * Servlet utility methods.
  */
 @InterfaceAudience.Private
 class ServletUtils {
-  private static final Charset UTF8_CHARSET = Charset.forName("UTF-8");
+  private static final Charset UTF8_CHARSET = StandardCharsets.UTF_8;
```

Review Comment: added https://github.com/apache/hadoop/pull/6231/commits/1f89c99ec6ae61e9af91d5a6c06999ea448f616a
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787197#comment-17787197 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397307869

## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/web/ServletUtils.java:
```diff
@@ -24,14 +24,15 @@
 import javax.servlet.http.HttpServletRequest;
 import java.io.IOException;
 import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
 import java.util.List;

 /**
  * Servlet utility methods.
  */
 @InterfaceAudience.Private
 class ServletUtils {
-  private static final Charset UTF8_CHARSET = Charset.forName("UTF-8");
+  private static final Charset UTF8_CHARSET = StandardCharsets.UTF_8;
```

Review Comment: Yep, that was what I thought
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787195#comment-17787195 ] ASF GitHub Bot commented on HADOOP-18957:
---
pjfanning commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397293839

## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/web/ServletUtils.java:
```diff
@@ -24,14 +24,15 @@
 import javax.servlet.http.HttpServletRequest;
 import java.io.IOException;
 import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
 import java.util.List;

 /**
  * Servlet utility methods.
  */
 @InterfaceAudience.Private
 class ServletUtils {
-  private static final Charset UTF8_CHARSET = Charset.forName("UTF-8");
+  private static final Charset UTF8_CHARSET = StandardCharsets.UTF_8;
```

Review Comment: it is used in one place. Are you suggesting that I remove this and make the code that uses it use StandardCharsets.UTF_8 directly instead?
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787191#comment-17787191 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397278741

## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/web/ServletUtils.java:
```diff
@@ -24,14 +24,15 @@
 import javax.servlet.http.HttpServletRequest;
 import java.io.IOException;
 import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
 import java.util.List;

 /**
  * Servlet utility methods.
  */
 @InterfaceAudience.Private
 class ServletUtils {
-  private static final Charset UTF8_CHARSET = Charset.forName("UTF-8");
+  private static final Charset UTF8_CHARSET = StandardCharsets.UTF_8;
```

Review Comment: Not required
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787190#comment-17787190 ] ASF GitHub Bot commented on HADOOP-18957:
---
ayushtkn commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397278368

## pom.xml:
```diff
@@ -225,6 +225,14 @@ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/x
 org.apache.hadoop.thirdparty.com.google.common.io.BaseEncoding.**
+
+true
+Use java.nio.charset.StandardCharsets rather than Guava provided Charsets
```

Review Comment: ok, then we are done here
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787188#comment-17787188 ] ASF GitHub Bot commented on HADOOP-18957:
---
pjfanning commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397275697

## pom.xml:
```diff
@@ -225,6 +225,14 @@ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/x
 org.apache.hadoop.thirdparty.com.google.common.io.BaseEncoding.**
+
+true
+Use java.nio.charset.StandardCharsets rather than Guava provided Charsets
```

Review Comment: Yes - we still need java.nio.charset.Charset. The StandardCharsets constants are instances of the Charset class, and there are cases where we need to call Charset.forName when the charset is not hardcoded to UTF-8.
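The point in that reply can be sketched with a small, hypothetical helper (not Hadoop code): when the charset name arrives as data, `Charset.forName` is still required, and `UnsupportedCharsetException` is the failure mode the caller must handle.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.charset.UnsupportedCharsetException;

public class CharsetLookup {
    // Hypothetical helper: the charset name comes from configuration or user
    // input, so it cannot be replaced with a StandardCharsets constant.
    static Charset resolve(String name) {
        try {
            return Charset.forName(name);
        } catch (UnsupportedCharsetException e) {
            // Illustrative fallback policy only; a real caller might rethrow.
            return StandardCharsets.UTF_8;
        }
    }

    public static void main(String[] args) {
        System.out.println(resolve("ISO-8859-1"));
        System.out.println(resolve("definitely-not-a-charset"));
    }
}
```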
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17787187#comment-17787187 ]

ASF GitHub Bot commented on HADOOP-18957:

ayushtkn commented on code in PR #6231: URL: https://github.com/apache/hadoop/pull/6231#discussion_r1397266601

## pom.xml ##

@@ -225,6 +225,14 @@ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/x
     org.apache.hadoop.thirdparty.com.google.common.io.BaseEncoding.**
+
+    true
+    Use java.nio.charset.StandardCharsets rather than Guava provided Charsets

Review Comment: Are we ok having ``java.nio.charset.Charset``?

## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/web/ServletUtils.java ##

@@ -24,14 +24,15 @@
 import javax.servlet.http.HttpServletRequest;
 import java.io.IOException;
 import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
 import java.util.List;

 /**
  * Servlet utility methods.
  */
 @InterfaceAudience.Private
 class ServletUtils {
-  private static final Charset UTF8_CHARSET = Charset.forName("UTF-8");
+  private static final Charset UTF8_CHARSET = StandardCharsets.UTF_8;

Review Comment: I don't think we need this constant; we can directly use ``StandardCharsets.UTF_8`` below and eliminate ``import java.nio.charset.Charset``.
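Ayush's second comment suggests going one step further than the diff: instead of keeping a `UTF8_CHARSET` field, use the constant directly at the call sites. A hypothetical sketch of the resulting shape (the `decode` helper below is illustrative, not the actual ServletUtils code):

```java
import java.nio.charset.StandardCharsets;

public class InlineCharsetDemo {
    // With the constant used directly at the call site, the class needs
    // neither a UTF8_CHARSET field nor an import of java.nio.charset.Charset.
    static String decode(byte[] raw) {
        return new String(raw, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] raw = "param=value".getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(raw));  // param=value
    }
}
```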
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17786260#comment-17786260 ]

ASF GitHub Bot commented on HADOOP-18957:

hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1812093013

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 34m 27s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 4s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 1s | | The patch appears to include 90 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 14m 30s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 33m 27s | | trunk passed |
| +1 :green_heart: | compile | 17m 8s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 15m 12s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 33s | | trunk passed |
| +1 :green_heart: | mvnsite | 17m 57s | | trunk passed |
| +1 :green_heart: | javadoc | 8m 19s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 32s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 67m 23s | | trunk passed |
| +1 :green_heart: | shadedclient | 60m 18s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 43s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 47m 17s | | the patch passed |
| +1 :green_heart: | compile | 15m 40s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 15m 40s | | the patch passed |
| +1 :green_heart: | compile | 14m 39s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 14m 39s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 24s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/4/artifact/out/results-checkstyle-root.txt) | root: The patch generated 11 new + 3016 unchanged - 15 fixed = 3027 total (was 3031) |
| +1 :green_heart: | mvnsite | 13m 23s | | the patch passed |
| +1 :green_heart: | javadoc | 8m 17s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 32s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 76m 23s | | the patch passed |
| +1 :green_heart: | shadedclient | 60m 27s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 730m 58s | | root in the patch passed. |
| +1 :green_heart: | asflicense | 1m 33s | | The patch does not generate ASF License warnings. |
| | | 1219m 49s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/4/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6231 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 0315c2901c8e 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 146b30280539b79bb5c47ffc657c658eeece841d |
| Default Java | Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/4/testReport/ |
| Max. process+thread count | 4007 (vs. ulimit of 5500) |
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17785884#comment-17785884 ]

ASF GitHub Bot commented on HADOOP-18957:

pjfanning commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1810173668

> Quick Pass & mostly looks good, can we have some restrictions to prevent future usage of these? Something like this or better? https://github.com/apache/hadoop/blob/trunk/pom.xml#L194

Good idea. I've committed a banned import for the Guava Charsets class.
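The fragment quoted in the pom.xml diff above (an `includeTestCode` flag of `true` plus the reason text) reads like a restrict-imports enforcer rule. A minimal sketch of what such a ban might look like; the plugin (`de.skuzzle.enforcer:restrict-imports-enforcer-rule`), the banned package names, and the placement within Hadoop's pom.xml are assumptions here, not copied from the actual commit:

```xml
<!-- Sketch only: coordinates, rule shape, and banned names are
     illustrative, not taken verbatim from Hadoop's pom.xml. -->
<rules>
  <restrictImports
      implementation="de.skuzzle.enforcer.restrictimports.rule.RestrictImports">
    <includeTestCode>true</includeTestCode>
    <reason>Use java.nio.charset.StandardCharsets rather than Guava provided Charsets</reason>
    <bannedImports>
      <bannedImport>com.google.common.base.Charsets</bannedImport>
      <bannedImport>org.apache.hadoop.thirdparty.com.google.common.base.Charsets</bannedImport>
    </bannedImports>
  </restrictImports>
</rules>
```

With a rule like this in place, any future `import ... Charsets` fails the build during `mvn verify`, which is how the "prevent future usage" request above is typically satisfied.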
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17785132#comment-17785132 ]

ASF GitHub Bot commented on HADOOP-18957:

ayushtkn commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1806669003

I will be back by the end of next week; if nobody volunteers by then, I can take care of this :-)
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17784750#comment-17784750 ]

ASF GitHub Bot commented on HADOOP-18957:

pjfanning commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1805425049

@steveloughran @ayushtkn @slfan1989 Would any of you have time to look at this PR?
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17781271#comment-17781271 ]

ASF GitHub Bot commented on HADOOP-18957:

hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1786746913

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 37s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 6s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 1s | | The patch appears to include 90 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 16m 7s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 32m 25s | | trunk passed |
| +1 :green_heart: | compile | 16m 40s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 16m 3s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 55s | | trunk passed |
| +1 :green_heart: | mvnsite | 39m 0s | | trunk passed |
| +1 :green_heart: | javadoc | 35m 35s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 33m 40s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 53m 29s | | trunk passed |
| +1 :green_heart: | shadedclient | 35m 27s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 20m 59s | | the patch passed |
| +1 :green_heart: | compile | 16m 33s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 16m 33s | | the patch passed |
| +1 :green_heart: | compile | 15m 49s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 15m 49s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 51s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 11 new + 3012 unchanged - 15 fixed = 3023 total (was 3027) |
| +1 :green_heart: | mvnsite | 38m 54s | | the patch passed |
| +1 :green_heart: | javadoc | 35m 30s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 33m 57s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 61m 45s | | the patch passed |
| +1 :green_heart: | shadedclient | 34m 11s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 0m 51s | | hadoop-maven-plugins in the patch passed. |
| +1 :green_heart: | unit | 3m 34s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 0m 48s | | hadoop-auth-examples in the patch passed. |
| +1 :green_heart: | unit | 19m 22s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 52s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 1m 33s | | hadoop-registry in the patch passed. |
| +1 :green_heart: | unit | 2m 52s | | hadoop-hdfs-client in the patch passed. |
| +1 :green_heart: | unit | 217m 3s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | unit | 6m 43s | | hadoop-hdfs-httpfs in the patch passed. |
| +1 :green_heart: | unit | 3m 27s | | hadoop-hdfs-nfs in the patch passed. |
| +1 :green_heart: | unit | 1m 33s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 6m 15s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 4m 2s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit | 4m 55s | | hadoop-yarn-server-applicationhistoryservice in the patch passed. |
| +1 :green_heart: | unit | 2m 20s | | hadoop-yarn-server-timelineservice in the patch passed. |
| +1
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17780694#comment-17780694 ]

ASF GitHub Bot commented on HADOOP-18957:

hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1784022857

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 37s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 6s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 1s | | The patch appears to include 90 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 16m 17s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 32m 9s | | trunk passed |
| +1 :green_heart: | compile | 17m 11s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 16m 0s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 53s | | trunk passed |
| +1 :green_heart: | mvnsite | 38m 57s | | trunk passed |
| +1 :green_heart: | javadoc | 35m 27s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 33m 48s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 53m 34s | | trunk passed |
| +1 :green_heart: | shadedclient | 35m 9s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 20m 57s | | the patch passed |
| +1 :green_heart: | compile | 16m 29s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 16m 29s | | the patch passed |
| +1 :green_heart: | compile | 15m 23s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 15m 23s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 46s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 11 new + 3012 unchanged - 15 fixed = 3023 total (was 3027) |
| +1 :green_heart: | mvnsite | 38m 45s | | the patch passed |
| +1 :green_heart: | javadoc | 35m 20s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 33m 42s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 61m 46s | | the patch passed |
| +1 :green_heart: | shadedclient | 33m 55s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 0m 46s | | hadoop-maven-plugins in the patch passed. |
| +1 :green_heart: | unit | 3m 28s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 0m 48s | | hadoop-auth-examples in the patch passed. |
| +1 :green_heart: | unit | 19m 24s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 55s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 1m 33s | | hadoop-registry in the patch passed. |
| +1 :green_heart: | unit | 2m 49s | | hadoop-hdfs-client in the patch passed. |
| +1 :green_heart: | unit | 219m 15s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | unit | 6m 36s | | hadoop-hdfs-httpfs in the patch passed. |
| +1 :green_heart: | unit | 3m 25s | | hadoop-hdfs-nfs in the patch passed. |
| +1 :green_heart: | unit | 1m 34s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 6m 13s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 4m 4s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit | 4m 52s | | hadoop-yarn-server-applicationhistoryservice in the patch passed. |
| +1 :green_heart: | unit | 2m 20s | | hadoop-yarn-server-timelineservice in the patch passed. |
| +1
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17780607#comment-17780607 ]

ASF GitHub Bot commented on HADOOP-18957:

hadoop-yetus commented on PR #6231: URL: https://github.com/apache/hadoop/pull/6231#issuecomment-1783767669

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 22s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 4s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 77 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 15m 52s | | Maven dependency ordering for branch |
| -1 :x: | mvninstall | 39m 17s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/1/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. |
| +1 :green_heart: | compile | 19m 24s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 16m 35s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | checkstyle | 4m 51s | | trunk passed |
| +1 :green_heart: | mvnsite | 29m 47s | | trunk passed |
| +1 :green_heart: | javadoc | 26m 40s | | trunk passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 25m 27s | | trunk passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 44m 59s | | trunk passed |
| +1 :green_heart: | shadedclient | 39m 16s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 17m 51s | | the patch passed |
| +1 :green_heart: | compile | 17m 39s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 17m 39s | | the patch passed |
| +1 :green_heart: | compile | 16m 36s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | javac | 16m 36s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 4m 47s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/1/artifact/out/results-checkstyle-root.txt) | root: The patch generated 27 new + 2833 unchanged - 15 fixed = 2860 total (was 2848) |
| +1 :green_heart: | mvnsite | 29m 46s | | the patch passed |
| +1 :green_heart: | javadoc | 26m 36s | | the patch passed with JDK Ubuntu-11.0.20.1+1-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 28m 29s | | the patch passed with JDK Private Build-1.8.0_382-8u382-ga-1~20.04.1-b05 |
| +1 :green_heart: | spotbugs | 52m 52s | | the patch passed |
| +1 :green_heart: | shadedclient | 38m 50s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 0m 38s | | hadoop-maven-plugins in the patch passed. |
| +1 :green_heart: | unit | 3m 24s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 0m 32s | | hadoop-auth-examples in the patch passed. |
| +1 :green_heart: | unit | 19m 12s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 43s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 1m 21s | | hadoop-registry in the patch passed. |
| +1 :green_heart: | unit | 2m 37s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 240m 16s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6231/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | unit | 6m 10s | | hadoop-hdfs-httpfs in the patch passed. |
| +1 :green_heart: | unit | 3m 13s | | hadoop-hdfs-nfs in the patch passed. |
| +1 :green_heart: |
[jira] [Commented] (HADOOP-18957) Use StandardCharsets.UTF_8 constant
[ https://issues.apache.org/jira/browse/HADOOP-18957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17780378#comment-17780378 ]

ASF GitHub Bot commented on HADOOP-18957:

pjfanning opened a new pull request, #6231: URL: https://github.com/apache/hadoop/pull/6231

### Description of PR

HADOOP-18957

### How was this patch tested?

### For code changes:

- [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
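The issue description's first motivation (eliminating charset-name exception handling) can be seen side by side in a small sketch; the helper class and method names below are illustrative, not code from the PR:

```java
import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;

public class Utf8BytesDemo {
    // Before: the String-name overload declares the checked
    // UnsupportedEncodingException, so every caller needs a try/catch
    // (or a throws clause) for a condition that can never occur for "UTF-8".
    static byte[] toBytesOld(String s) {
        try {
            return s.getBytes("UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new AssertionError("UTF-8 is always supported", e);
        }
    }

    // After: the Charset overload throws no checked exception and skips
    // the per-call charset-name lookup.
    static byte[] toBytesNew(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Both overloads produce the same bytes; only the error handling differs.
        System.out.println(toBytesOld("héllo").length == toBytesNew("héllo").length);  // true
    }
}
```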