JDK17

2024-02-04 Thread bilwa st
Hi folks,

This is regarding the pending JDK17 work. Can we have a new pipeline
for JDK17 builds? I can see that the JIRAs under HADOOP-17177 which have
been merged are not being run on JDK17 as of now. It would be helpful to
have a dedicated pipeline.

Thanks,
Bilwa


[jira] [Created] (HADOOP-19066) AWS SDK V2 - Enabling FIPS should be allowed with central endpoint

2024-02-04 Thread Viraj Jasani (Jira)
Viraj Jasani created HADOOP-19066:
-

 Summary: AWS SDK V2 - Enabling FIPS should be allowed with central 
endpoint
 Key: HADOOP-19066
 URL: https://issues.apache.org/jira/browse/HADOOP-19066
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Affects Versions: 3.5.0, 3.4.1
Reporter: Viraj Jasani


FIPS support can be enabled by setting "fs.s3a.endpoint.fips". Since the SDK
treats overriding the endpoint and enabling FIPS as mutually exclusive, we fail
fast if fs.s3a.endpoint is set while FIPS support is enabled (details in
HADOOP-18975).

We no longer override the SDK endpoint for the central endpoint, since we
enable cross-region access instead (details in HADOOP-19044), but we would
still fail fast if the endpoint is the central one and FIPS is enabled.
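
For illustration, the combination this change proposes to allow would look
roughly like the following; this is only a sketch, the bucket name is a
placeholder, and fs.s3a.endpoint is deliberately left unset so that the
central endpoint with cross-region access applies:

  hadoop fs -D fs.s3a.endpoint.fips=true -ls s3a://example-bucket/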

Changes proposed:
 * S3A to fail fast only if FIPS is enabled and a non-central endpoint is
configured.
 * Tests to ensure the S3 bucket is accessible with default region us-east-2
via cross-region access (expected with the central endpoint).
 * Document FIPS support with the central endpoint on connecting.html.






Re: [VOTE] Release Apache Hadoop Thirdparty 1.2.0 (RC1)

2024-02-04 Thread Xiaoqiao He
+1 (binding).

I checked the following items (a command sketch follows the list):
- [X] Download links are valid.
- [X] Checksums and PGP signatures are valid.
- [X] LICENSE and NOTICE files are correct for the repository.
- [X] Source code artifacts have correct names matching the current release.
- [X] All files have license headers if necessary.
- [X] Building is OK using `mvn clean install` on JDK 1.8.0_202.
- [X] Built Hadoop trunk successfully with the updated thirdparty (including
the updated protobuf shaded path).
- [X] No difference between tag and release src tar.
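
For reference, a minimal sketch of how such checks can be run locally; the
artifact file names here are assumptions (based on the hadoop-thirdparty-1.2.0-src
directory name quoted later in this thread), so adjust them to the actual
files under the dist URL:

  svn co https://dist.apache.org/repos/dist/dev/hadoop/hadoop-thirdparty-1.2.0-RC1
  cd hadoop-thirdparty-1.2.0-RC1
  curl -O https://dist.apache.org/repos/dist/release/hadoop/common/KEYS
  gpg --import KEYS
  gpg --verify hadoop-thirdparty-1.2.0-src.tar.gz.asc hadoop-thirdparty-1.2.0-src.tar.gz
  sha512sum -c hadoop-thirdparty-1.2.0-src.tar.gz.sha512   # assumes a GNU-format checksum file
  tar xzf hadoop-thirdparty-1.2.0-src.tar.gz
  git clone --depth 1 --branch release-1.2.0-RC1 https://github.com/apache/hadoop-thirdparty.git
  diff -r hadoop-thirdparty hadoop-thirdparty-1.2.0-src   # tag vs. release src tar
  (cd hadoop-thirdparty-1.2.0-src && mvn clean install)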

Good Luck!

Best Regards,
- He Xiaoqiao


On Sun, Feb 4, 2024 at 10:29 PM slfan1989  wrote:

> Hi folks,
>
> Xiaoqiao He and I have put together a release candidate (RC1) for Hadoop
> Thirdparty 1.2.0.
>
> The RC is available at:
> https://dist.apache.org/repos/dist/dev/hadoop/hadoop-thirdparty-1.2.0-RC1
>
> The RC tag is
> https://github.com/apache/hadoop-thirdparty/releases/tag/release-1.2.0-RC1
>
> The maven artifacts are staged at
> https://repository.apache.org/content/repositories/orgapachehadoop-1401
>
> Compared to 1.1.1, there are three additional fixes:
>
> HADOOP-18197. Upgrade Protobuf-Java to 3.21.12
> https://github.com/apache/hadoop-thirdparty/pull/26
>
> HADOOP-18921. Upgrade to avro 1.11.3
> https://github.com/apache/hadoop-thirdparty/pull/24
>
> HADOOP-18843. Guava version 32.0.1 bump to fix CVE-2023-2976 (#23)
> https://github.com/apache/hadoop-thirdparty/pull/23
>
> You can find my public key at:
> https://dist.apache.org/repos/dist/release/hadoop/common/KEYS
>
> Best Regards,
> Shilun Fan.
>
>


Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86_64

2024-02-04 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/

[Feb 3, 2024, 11:20:04 AM] (github) HDFS-17369. Add uuid into datanode info for 
NameNodeMXBean (#6521) Contributed by Haiyang Hu.
[Feb 3, 2024, 11:26:30 AM] (github) HDFS-17353. Fix failing RBF module tests. 
(#6491) Contributed by Alexander Bogdanov
[Feb 3, 2024, 11:34:42 AM] (github) YARN-11362: Fix several typos in YARN 
codebase of misspelled resource (#6474) Contributed by EremenkoValentin.
[Feb 3, 2024, 2:48:52 PM] (github) HADOOP-19049. Fix 
StatisticsDataReferenceCleaner classloader leak (#6488)




-1 overall


The following subsystems voted -1:
blanks hadolint pathlen xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-common-project/hadoop-common/src/test/resources/xml/external-dtd.xml 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml
 
  

   cc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-compile-cc-root.txt
 [96K]

   javac:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-compile-javac-root.txt
 [12K]

   blanks:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/blanks-eol.txt
 [15M]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/blanks-tabs.txt
 [2.0M]

   checkstyle:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-checkstyle-root.txt
 [13M]

   hadolint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-hadolint.txt
 [24K]

   pathlen:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-pathlen.txt
 [16K]

   pylint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-pylint.txt
 [20K]

   shellcheck:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-shellcheck.txt
 [24K]

   xml:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/xml.txt
 [24K]

   javadoc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1492/artifact/out/results-javadoc-javadoc-root.txt
 [244K]

Powered by Apache Yetus 0.14.0-SNAPSHOT   https://yetus.apache.org


[VOTE] Release Apache Hadoop Thirdparty 1.2.0 (RC1)

2024-02-04 Thread slfan1989
Hi folks,

Xiaoqiao He and I have put together a release candidate (RC1) for Hadoop
Thirdparty 1.2.0.

The RC is available at:
https://dist.apache.org/repos/dist/dev/hadoop/hadoop-thirdparty-1.2.0-RC1

The RC tag is
https://github.com/apache/hadoop-thirdparty/releases/tag/release-1.2.0-RC1

The maven artifacts are staged at
https://repository.apache.org/content/repositories/orgapachehadoop-1401

Compared to 1.1.1, there are three additional fixes:

HADOOP-18197. Upgrade Protobuf-Java to 3.21.12
https://github.com/apache/hadoop-thirdparty/pull/26

HADOOP-18921. Upgrade to avro 1.11.3
https://github.com/apache/hadoop-thirdparty/pull/24

HADOOP-18843. Guava version 32.0.1 bump to fix CVE-2023-2976 (#23)
https://github.com/apache/hadoop-thirdparty/pull/23

You can find my public key at:
https://dist.apache.org/repos/dist/release/hadoop/common/KEYS

Best Regards,
Shilun Fan.


Re: [VOTE] Release Apache Hadoop Thirdparty 1.2.0 RC0

2024-02-04 Thread slfan1989
Thank you all very much for helping with the review and vote!

I will conclude the voting for Hadoop-thirdparty-1.2.0-RC0 and open the
voting for Hadoop-thirdparty-1.2.0-RC1.

Thank you for the feedback on the code diff, Ayush. I have already
submitted the changes to the Dockerfile and create-release files to the
Hadoop-thirdparty trunk branch and backported them to branch-1.2 and
branch-1.2.0. The issues you provided feedback on will be addressed in
Hadoop-thirdparty-1.2.0-RC1.

Once again, thank you Xiaoqiao He, Ayush Saxena, Takanobu Asanuma, PJ
Fanning, and Shuyan Zhang for the review and vote.

Best Regards,
Shilun Fan.

On Sun, Feb 4, 2024 at 4:18 PM Shuyan Zhang  wrote:

> +1 (non-binding)
>
> - Verified hashes
> - LICENSE and NOTICE are included.
> - Rat check is ok. `mvn clean apache-rat:check`
> - `mvn clean install` works well
>
>
> On Fri, Feb 2, 2024 at 11:11, slfan1989 wrote:
>
> > Thank you very much for the review! I will avoid the diff.
> >
> > Best Regards,
> > Shilun Fan.
> >
> > On Fri, Feb 2, 2024 at 9:59 AM Takanobu Asanuma 
> > wrote:
> >
> > > It also looks good to me, except for the diff.
> > >
> > > * Verified signatures and hashes
> > > * Reviewed the documents
> > > * Successfully built from source with `mvn clean install`
> > > * Successfully compiled Hadoop trunk and branch-3.4 using the Hadoop
> > > thirdparty 1.2.0
> > >
> > > Anyway, since hadoop-thirdparty-1.1.1 has some high vulnerabilities,
> > > hadoop-thirdparty-1.2.0 would be required for Hadoop-3.4.0.
> > >
> > > Thanks,
> > > - Takanobu
> > >
> > > On Fri, Feb 2, 2024 at 4:45, slfan1989 wrote:
> > >
> > > > Thank you for helping to review Hadoop-Thirdparty-1.2.0-RC0 and
> > providing
> > > > feedback!
> > > >
> > > > I followed the "how to release" documentation and tried to package it
> > > using
> > > > create-release and Dockerfile, but I couldn't successfully package it
> > > > directly. Some modifications are required before compilation. I had to
> > > > submit a pull request to fix this issue before the
> > > > Hadoop-Thirdparty-1.2.0-RC0 build.
> > > >
> > > > This is an area that needs improvement. We should ensure that the code
> > > > of src is consistent with the tag.
> > > >
> > > > On Fri, Feb 2, 2024 at 2:25 AM Ayush Saxena 
> > wrote:
> > > >
> > > > >
> > > > > There is some diff b/w the git tag & the src tar, the Dockerfile &
> > the
> > > > > create-release are different, Why?
> > > > >
> > > > > Files hadoop-thirdparty/dev-support/bin/create-release and
> > > > > hadoop-thirdparty-1.2.0-src/dev-support/bin/create-release differ
> > > > >
> > > > > Files hadoop-thirdparty/dev-support/docker/Dockerfile and
> > > > > hadoop-thirdparty-1.2.0-src/dev-support/docker/Dockerfile differ
> > > > >
> > > > >
> > > > > ayushsaxena@ayushsaxena hadoop-thirdparty-1.2.0-RC0 % diff
> > > > > hadoop-thirdparty/dev-support/bin/create-release
> > > > > hadoop-thirdparty-1.2.0-src/dev-support/bin/create-release
> > > > >
> > > > > 444,446c444,446
> > > > >
> > > > > < echo "RUN groupadd --non-unique -g ${group_id} ${user_name}"
> > > > >
> > > > > < echo "RUN useradd -g ${group_id} -u ${user_id} -m
> ${user_name}"
> > > > >
> > > > > < echo "RUN chown -R ${user_name} /home/${user_name}"
> > > > >
> > > > > ---
> > > > >
> > > > > > echo "RUN groupadd --non-unique -g ${group_id} ${user_name};
> > exit
> > > > > 0;"
> > > > >
> > > > > > echo "RUN useradd -g ${group_id} -u ${user_id} -m
> ${user_name};
> > > > > exit 0;"
> > > > >
> > > > > > echo "RUN chown -R ${user_name} /home/${user_name}; exit 0;"
> > > > >
> > > > > ayushsaxena@ayushsaxena hadoop-thirdparty-1.2.0-RC0 % diff
> > > > > hadoop-thirdparty/dev-support/docker/Dockerfile
> > > > > hadoop-thirdparty-1.2.0-src/dev-support/docker/Dockerfile
> > > > >
> > > > > 103a104,105
> > > > >
> > > > > > RUN rm -f /etc/maven/settings.xml && ln -s
> > > /home/root/.m2/settings.xml
> > > > > /etc/maven/settings.xml
> > > > >
> > > > > >
> > > > >
> > > > > 126a129,130
> > > > >
> > > > > > RUN pip2 install setuptools-scm==5.0.2
> > > > >
> > > > > > RUN pip2 install lazy-object-proxy==1.5.0
> > > > >
> > > > > 159d162
> > > > >
> > > > > <
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > Other things look Ok,
> > > > > * Built from source
> > > > > * Verified Checksums
> > > > > * Verified Signatures
> > > > > * Validated files have ASF header
> > > > >
> > > > > Not sure if having diff b/w the git tag & src tar is ok, this
> doesn't
> > > > look
> > > > > like core code change though, can anybody check & confirm?
> > > > >
> > > > > -Ayush
> > > > >
> > > > >
> > > > > On Thu, 1 Feb 2024 at 13:39, Xiaoqiao He 
> > > wrote:
> > > > >
> > > > >> Gentle ping. @Ayush Saxena  @Steve Loughran
> > > > >>  @inigo...@apache.org 
> > > > >> @Masatake
> > > > >> Iwasaki  and some other folks.
> > > > >>
> > > > >> On Wed, Jan 31, 2024 at 10:17 AM slfan1989 
> > > > wrote:
> > > > >>
> > > > >> > Thank you for the review and vote! Looking forward to other folks
> > > > >> > helping
>

Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64

2024-02-04 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/

No changes




-1 overall


The following subsystems voted -1:
asflicense hadolint mvnsite pathlen unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

Failed junit tests :

   hadoop.net.TestClusterTopology 
   hadoop.ipc.TestIPC 
   hadoop.fs.TestFileUtil 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.hdfs.TestLeaseRecovery2 
   
hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain 
   hadoop.hdfs.server.datanode.TestDirectoryScanner 
   hadoop.hdfs.TestFileLengthOnClusterRestart 
   hadoop.hdfs.TestDFSInotifyEventInputStream 
   hadoop.hdfs.server.namenode.snapshot.TestSnapshotBlocksMap 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.fs.viewfs.TestViewFileSystemHdfs 
   hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes 
   hadoop.hdfs.server.federation.router.TestRouterQuota 
   hadoop.hdfs.server.federation.router.TestRouterNamenodeHeartbeat 
   hadoop.hdfs.server.federation.resolver.order.TestLocalResolver 
   hadoop.hdfs.server.federation.resolver.TestMultipleDestinationResolver 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.mapreduce.v2.app.TestRuntimeEstimators 
   hadoop.mapreduce.lib.input.TestLineRecordReader 
   hadoop.mapred.TestLineRecordReader 
   hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter 
   hadoop.resourceestimator.service.TestResourceEstimatorService 
   hadoop.resourceestimator.solver.impl.TestLpSolver 
   hadoop.yarn.sls.TestSLSRunner 
   
hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceAllocator
 
   
hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceHandlerImpl
 
   hadoop.yarn.server.resourcemanager.TestClientRMService 
   hadoop.yarn.server.resourcemanager.recovery.TestFSRMStateStore 
   hadoop.yarn.server.resourcemanager.recovery.TestZKRMStateStore 
   
hadoop.yarn.server.resourcemanager.monitor.invariants.TestMetricsInvariantChecker
 
  

   cc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-compile-javac-root.txt
  [488K]

   checkstyle:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-checkstyle-root.txt
  [14M]

   hadolint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   mvnsite:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-mvnsite-root.txt
  [572K]

   pathlen:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/diff-patch-shellcheck.txt
  [72K]

   whitespace:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/whitespace-eol.txt
  [12M]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/whitespace-tabs.txt
  [1.3M]

   javadoc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-javadoc-root.txt
  [36K]

   unit:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [224K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [1.8M]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt
  [36K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt
  [16K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-app.txt
  [44K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1292/artifact/out/patch-unit-hadoop-m

Re: [VOTE] Release Apache Hadoop Thirdparty 1.2.0 RC0

2024-02-04 Thread Shuyan Zhang
+1 (non-binding)

- Verified hashes
- LICENSE and NOTICE are included.
- Rat check is ok. `mvn clean apache-rat:check`
- `mvn clean install` works well


On Fri, Feb 2, 2024 at 11:11, slfan1989 wrote:

> Thank you very much for the review! I will avoid the diff.
>
> Best Regards,
> Shilun Fan.
>
> On Fri, Feb 2, 2024 at 9:59 AM Takanobu Asanuma 
> wrote:
>
> > It also looks good to me, except for the diff.
> >
> > * Verified signatures and hashes
> > * Reviewed the documents
> > * Successfully built from source with `mvn clean install`
> > * Successfully compiled Hadoop trunk and branch-3.4 using the Hadoop
> > thirdparty 1.2.0
> >
> > Anyway, since hadoop-thirdparty-1.1.1 has some high vulnerabilities,
> > hadoop-thirdparty-1.2.0 would be required for Hadoop-3.4.0.
> >
> > Thanks,
> > - Takanobu
> >
> > On Fri, Feb 2, 2024 at 4:45, slfan1989 wrote:
> >
> > > Thank you for helping to review Hadoop-Thirdparty-1.2.0-RC0 and
> providing
> > > feedback!
> > >
> > > I followed the "how to release" documentation and tried to package it
> > using
> > > create-release and Dockerfile, but I couldn't successfully package it
> > > directly. Some modifications are required before compilation. I had to
> > > submit a pull request to fix this issue before the
> > > Hadoop-Thirdparty-1.2.0-RC0 build.
> > >
> > > This is an area that needs improvement. We should ensure that the code
> > > of src is consistent with the tag.
> > >
> > > On Fri, Feb 2, 2024 at 2:25 AM Ayush Saxena 
> wrote:
> > >
> > > >
> > > > There is some diff b/w the git tag & the src tar, the Dockerfile &
> the
> > > > create-release are different, Why?
> > > >
> > > > Files hadoop-thirdparty/dev-support/bin/create-release and
> > > > hadoop-thirdparty-1.2.0-src/dev-support/bin/create-release differ
> > > >
> > > > Files hadoop-thirdparty/dev-support/docker/Dockerfile and
> > > > hadoop-thirdparty-1.2.0-src/dev-support/docker/Dockerfile differ
> > > >
> > > >
> > > > ayushsaxena@ayushsaxena hadoop-thirdparty-1.2.0-RC0 % diff
> > > > hadoop-thirdparty/dev-support/bin/create-release
> > > > hadoop-thirdparty-1.2.0-src/dev-support/bin/create-release
> > > >
> > > > 444,446c444,446
> > > >
> > > > < echo "RUN groupadd --non-unique -g ${group_id} ${user_name}"
> > > >
> > > > < echo "RUN useradd -g ${group_id} -u ${user_id} -m ${user_name}"
> > > >
> > > > < echo "RUN chown -R ${user_name} /home/${user_name}"
> > > >
> > > > ---
> > > >
> > > > > echo "RUN groupadd --non-unique -g ${group_id} ${user_name};
> exit
> > > > 0;"
> > > >
> > > > > echo "RUN useradd -g ${group_id} -u ${user_id} -m ${user_name};
> > > > exit 0;"
> > > >
> > > > > echo "RUN chown -R ${user_name} /home/${user_name}; exit 0;"
> > > >
> > > > ayushsaxena@ayushsaxena hadoop-thirdparty-1.2.0-RC0 % diff
> > > > hadoop-thirdparty/dev-support/docker/Dockerfile
> > > > hadoop-thirdparty-1.2.0-src/dev-support/docker/Dockerfile
> > > >
> > > > 103a104,105
> > > >
> > > > > RUN rm -f /etc/maven/settings.xml && ln -s
> > /home/root/.m2/settings.xml
> > > > /etc/maven/settings.xml
> > > >
> > > > >
> > > >
> > > > 126a129,130
> > > >
> > > > > RUN pip2 install setuptools-scm==5.0.2
> > > >
> > > > > RUN pip2 install lazy-object-proxy==1.5.0
> > > >
> > > > 159d162
> > > >
> > > > <
> > > >
> > > >
> > > >
> > > >
> > > > Other things look Ok,
> > > > * Built from source
> > > > * Verified Checksums
> > > > * Verified Signatures
> > > > * Validated files have ASF header
> > > >
> > > > Not sure if having diff b/w the git tag & src tar is ok, this doesn't
> > > look
> > > > like core code change though, can anybody check & confirm?
> > > >
> > > > -Ayush
> > > >
> > > >
> > > > On Thu, 1 Feb 2024 at 13:39, Xiaoqiao He 
> > wrote:
> > > >
> > > >> Gentle ping. @Ayush Saxena  @Steve Loughran
> > > >>  @inigo...@apache.org 
> > > >> @Masatake
> > > >> Iwasaki  and some other folks.
> > > >>
> > > >> On Wed, Jan 31, 2024 at 10:17 AM slfan1989 
> > > wrote:
> > > >>
> > > >> > Thank you for the review and vote! Looking forward to other folks
> > > >> > helping with voting and verification.
> > > >> >
> > > >> > Best Regards,
> > > >> > Shilun Fan.
> > > >> >
> > > >> > On Tue, Jan 30, 2024 at 6:20 PM Xiaoqiao He <
> hexiaoq...@apache.org>
> > > >> wrote:
> > > >> >
> > > >> > > Thanks Shilun for driving it and making it happen.
> > > >> > >
> > > >> > > +1(binding).
> > > >> > >
> > > >> > > [x] Checksums and PGP signatures are valid.
> > > >> > > [x] LICENSE files exist.
> > > >> > > [x] NOTICE is included.
> > > >> > > [x] Rat check is ok. `mvn clean apache-rat:check`
> > > >> > > [x] Built from source works well: `mvn clean install`
> > > >> > > [x] Built Hadoop trunk with updated thirdparty successfully
> > (include
> > > >> > update
> > > >> > > protobuf shaded path).
> > > >> > >
> > > >> > > BTW, hadoop-thirdparty-1.2.0 will be included in release-3.4.0,
> > hope
> > > >> we
> > > >> > > could finish this vote before 2024/02/06(UTC) if there are no
> > > >> concerns.
> > > >> > > Thanks all.