[
https://issues.apache.org/jira/browse/HADOOP-19305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17888478#comment-17888478
]
ASF GitHub Bot commented on HADOOP-19305:
-----------------------------------------
slfan1989 commented on PR #7106:
URL: https://github.com/apache/hadoop/pull/7106#issuecomment-2406276542
@zhangbutao Sure!
Thank you for your continued attention to this issue! The change seems
feasible, but we may need to find a workaround.
1. Hadoop is currently in the process of releasing version 3.4.1 (which is
very demanding for the release manager, with Steve in particular investing a
lot of time). RC3 for 3.4.1 has already been published, and if the vote
passes, Steve will release hadoop-3.4.1. Therefore, if we want to merge this
PR, it may have to wait until the 3.4.2 release.
2. If adjustments are possible on the Hive side, could we first modify the
affected unit tests in Hive to address this issue? That way, the Hive branch
could move to hadoop-3.4.0 without waiting for a new Hadoop release.
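One possible direction for a Hive-side test adjustment (a hedged sketch; the class and method names here are illustrative, not the actual Hive test code): if the tests currently mutate the JVM-wide environment via reflection — a common test pattern that can replace java.lang.ProcessEnvironment's internal typed entries with plain Strings and later surface as the ClassCastException below — the variables could instead be scoped to the child process through ProcessBuilder.environment(), which is a safe, String-keyed, per-process view:

```java
import java.io.IOException;
import java.util.Map;

public class PerProcessEnv {
    // Illustrative helper: set an environment variable only for the process
    // being launched, rather than hacking the JVM-wide environment via
    // reflection on java.lang.ProcessEnvironment internals.
    public static Process runWithEnv(String var, String value, String... cmd)
            throws IOException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        Map<String, String> env = pb.environment(); // mutable String-keyed copy
        env.put(var, value);                        // scoped to this process only
        return pb.start();
    }
}
```

Usage (assuming a POSIX shell is available): `PerProcessEnv.runWithEnv("DEMO_VAR", "hello", "sh", "-c", "echo $DEMO_VAR")` launches a child that sees `DEMO_VAR=hello` while the parent JVM's environment stays untouched.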
> Fix ProcessEnvironment ClassCastException in Shell.java
> -------------------------------------------------------
>
> Key: HADOOP-19305
> URL: https://issues.apache.org/jira/browse/HADOOP-19305
> Project: Hadoop Common
> Issue Type: Improvement
> Affects Versions: 3.4.0
> Reporter: Butao Zhang
> Assignee: Butao Zhang
> Priority: Major
> Labels: pull-request-available
>
> We tried to upgrade the Hadoop version from 3.3.6 to 3.4.0 in Apache Hive
> (HIVE-28191), but found this exception:
> {code:java}
> Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.ProcessEnvironment$Variable
> at java.lang.ProcessEnvironment$StringEnvironment.toEnvironmentBlock(ProcessEnvironment.java:273) ~[?:1.8.0_221]
> at java.lang.ProcessEnvironment.toEnvironmentBlock(ProcessEnvironment.java:298) ~[?:1.8.0_221]
> at java.lang.ProcessImpl.start(ProcessImpl.java:86) ~[?:1.8.0_221]
> at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) ~[?:1.8.0_221]
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:998) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.util.Shell.run(Shell.java:959) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1282) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:1377) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:1359) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1535) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfoByNonNativeIO(RawLocalFileSystem.java:1000) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:991) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:952) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hadoop.hive.ql.exec.Utilities.ensurePathIsWritable(Utilities.java:4954) ~[classes/:?]
> at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:843) ~[classes/:?]
> at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:784) ~[classes/:?]
> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:708) ~[classes/:?]
> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:669) ~[classes/:?]
> at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:182) ~[classes/:?]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_221]
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_221]
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_221]
> at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_221]
> at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[classes/:?]
> at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[classes/:?]
> at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[classes/:?]
> at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_221]
> at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_221]
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1953) ~[hadoop-common-3.4.0.jar:?]
> at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[classes/:?]
> at com.sun.proxy.$Proxy58.open(Unknown Source) ~[?:?]
> at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:485) ~[classes/:?]
> ... 40 more {code}
>
> After some debugging, I found that the failing Hive tests
> {{TestRemoteHiveMetastoreWithHttpJwt#testValidJWT}} and
> {{TestHttpJwtAuthentication#testAuthorizedUser}} are related to the
> Hadoop 3.4.0 change [HADOOP-17009: Embrace Immutability of Java Collections
> hadoop#1974|https://github.com/apache/hadoop/pull/1974], specifically
> [https://github.com/apache/hadoop/pull/1974/files#diff-372a0d25bcccd88b409a8149949628abd7d3472a1798bebe813e7617b0ef73c7L918-L920]
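For illustration, the failure mode in the trace can be reproduced in miniature. This is a simplified stand-in, not the real java.lang.ProcessEnvironment code: on Unix JDKs the environment is internally keyed by a private Variable type, so if a plain String key slips in through a raw reference (for example, via a reflective environment hack in test code), the cast performed while building the environment block throws the same ClassCastException:

```java
import java.util.HashMap;
import java.util.Map;

public class TypedEnvDemo {
    // Stand-in for the private ProcessEnvironment$Variable key type.
    static final class Variable {
        final String name;
        Variable(String name) { this.name = name; }
    }

    // Simplified analogue of toEnvironmentBlock(): iterates a map whose keys
    // are expected to be Variable instances and casts each key accordingly.
    @SuppressWarnings("rawtypes")
    public static String toEnvironmentBlock(Map typedEnv) {
        StringBuilder sb = new StringBuilder();
        for (Object key : typedEnv.keySet()) {
            Variable v = (Variable) key; // ClassCastException if key is a String
            sb.append(v.name).append('=').append(typedEnv.get(key)).append('\0');
        }
        return sb.toString();
    }

    @SuppressWarnings({"rawtypes", "unchecked"})
    public static void main(String[] args) {
        Map typedEnv = new HashMap();
        typedEnv.put("PATH", "/usr/bin"); // String key injected via raw map
        try {
            toEnvironmentBlock(typedEnv);
        } catch (ClassCastException e) {
            System.out.println("reproduced: " + e.getClass().getName());
        }
    }
}
```

This is only an analogy for the mechanism; the actual interaction between the HADOOP-17009 Shell.java change and the Hive tests is what the linked diff and PR #7106 address.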
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]