[ https://issues.apache.org/jira/browse/HADOOP-19167?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17844616#comment-17844616 ]
ASF GitHub Bot commented on HADOOP-19167:
-----------------------------------------

hadoop-yetus commented on PR #6798:
URL: https://github.com/apache/hadoop/pull/6798#issuecomment-2100264516

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 20s |  | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s |  | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s |  | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s |  | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s |  | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s |  | The patch appears to include 1 new or modified test file. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 59s |  | trunk passed |
| +1 :green_heart: | compile | 9m 30s |  | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | compile | 8m 56s |  | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | checkstyle | 0m 44s |  | trunk passed |
| +1 :green_heart: | mvnsite | 0m 56s |  | trunk passed |
| +1 :green_heart: | javadoc | 0m 47s |  | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 0m 30s |  | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 1m 24s |  | trunk passed |
| +1 :green_heart: | shadedclient | 23m 44s |  | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 35s |  | the patch passed |
| +1 :green_heart: | compile | 11m 11s |  | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javac | 11m 11s |  | the patch passed |
| +1 :green_heart: | compile | 10m 42s |  | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | javac | 10m 42s |  | the patch passed |
| +1 :green_heart: | blanks | 0m 0s |  | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 44s |  | the patch passed |
| +1 :green_heart: | mvnsite | 1m 9s |  | the patch passed |
| +1 :green_heart: | javadoc | 0m 40s |  | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 0m 27s |  | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 1m 48s |  | the patch passed |
| +1 :green_heart: | shadedclient | 29m 2s |  | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 18m 30s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6798/3/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. |
| +1 :green_heart: | asflicense | 0m 42s |  | The patch does not generate ASF License warnings. |
|  |  | 159m 52s |  |  |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.metrics2.source.TestJvmMetrics |
|  | hadoop.net.TestSocketIOWithTimeout |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6798/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6798 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux ee36b7172d7f 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 0c231ab8661c638138d9c402f2dffc493cde1aef |
| Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6798/3/testReport/ |
| Max. process+thread count | 1269 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6798/3/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

> Change of Codec configuration does not work
> -------------------------------------------
>
>              Key: HADOOP-19167
>              URL: https://issues.apache.org/jira/browse/HADOOP-19167
>          Project: Hadoop Common
>       Issue Type: Bug
>       Components: compress
>         Reporter: Zhikai Hu
>         Priority: Minor
>           Labels: pull-request-available
>
> In one of my projects, I need to adjust the compression level dynamically for different files. However, I found that in most cases the new compression level does not take effect as expected; the old compression level continues to be used.
> Here is the relevant code snippet:
>
>     ZStandardCodec zStandardCodec = new ZStandardCodec();
>     zStandardCodec.setConf(conf);
>     conf.set("io.compression.codec.zstd.level", "5"); // level may change dynamically
>     conf.set("io.compression.codec.zstd", zStandardCodec.getClass().getName());
>     writer = SequenceFile.createWriter(conf,
>         SequenceFile.Writer.file(sequenceFilePath),
>         SequenceFile.Writer.keyClass(LongWritable.class),
>         SequenceFile.Writer.valueClass(BytesWritable.class),
>         SequenceFile.Writer.compression(CompressionType.BLOCK));
>
> The reason is that the SequenceFile.Writer.init() method calls CodecPool.getCompressor(codec, null) to obtain a compressor. If the compressor is a reused instance, the configuration is never applied, because it is passed as null:
>
>     public static Compressor getCompressor(CompressionCodec codec, Configuration conf) {
>       Compressor compressor = borrow(compressorPool, codec.getCompressorType());
>       if (compressor == null) {
>         compressor = codec.createCompressor();
>         LOG.info("Got brand-new compressor [" + codec.getDefaultExtension() + "]");
>       } else {
>         compressor.reinit(conf); // conf is null here
>         ......
>
> Please also refer to my unit test to reproduce the bug.
> To address this bug, I modified the code to ensure that the configuration is read back from the codec when a compressor is reused.
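A minimal sketch of the reuse-path fix the reporter describes, not the committed patch: when the caller passes a null configuration, fall back to the codec's own Configuration before reinitializing a pooled compressor. It assumes the codec implements org.apache.hadoop.conf.Configurable (ZStandardCodec does, via setConf/getConf); borrow, compressorPool, and LOG are the CodecPool internals shown in the snippet above.

```java
public static Compressor getCompressor(CompressionCodec codec, Configuration conf) {
  Compressor compressor = borrow(compressorPool, codec.getCompressorType());
  if (compressor == null) {
    compressor = codec.createCompressor();
    LOG.info("Got brand-new compressor [" + codec.getDefaultExtension() + "]");
  } else {
    // Sketch of the fix: if no configuration was supplied (SequenceFile
    // passes null), read it back from the codec so the reused compressor
    // picks up current settings such as io.compression.codec.zstd.level.
    if (conf == null && codec instanceof Configurable) {
      conf = ((Configurable) codec).getConf();
    }
    compressor.reinit(conf); // re-applies the (possibly updated) configuration
  }
  return compressor;
}
```

Under this sketch, the reporter's snippet works unchanged: SequenceFile.Writer.init() may still call CodecPool.getCompressor(codec, null), and a recycled compressor is reinitialized from the codec's Configuration, so setting io.compression.codec.zstd.level before creating the writer takes effect.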