Ashok Kumar created HDFS-17779:
----------------------------------
Summary: Container fails to launch with error: /bin/bash:
ADD_OPENS: No such file or directory
Key: HDFS-17779
URL: https://issues.apache.org/jira/browse/HDFS-17779
Project: Hadoop HDFS
Issue Type: Task
Reporter: Ashok Kumar
When a container is launched using the distributedshell client, the following
issue is seen during a rolling upgrade of the cluster:
{code:java}
Exception message: Launch container failed
Shell error output: Nonzero exit code=1, error message='Invalid argument number'
Shell output: main : command provided 1
main : run as user is ambari-qa
main : requested yarn user is ambari-qa
Getting exit code file...
Creating script paths...
Writing pid file...
Writing to tmp file /u01/hadoop/yarn/local/nmPrivate/application_1740561181252_0011/container_e06_1740561181252_0011_02_000001/container_e06_1740561181252_0011_02_000001.pid.tmp
Writing to cgroup task files...
Creating local dirs...
Launching container...
[2025-02-26 09:51:47.245]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/bin/bash: ADD_OPENS: No such file or directory
[2025-02-26 09:51:47.245]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/bin/bash: ADD_OPENS: No such file or directory
For more detailed output, check the application tracking page: http://oash06mn0.sub10040615300.e2evcn.oraclevcn.com:8088/cluster/app/application_1740561181252_0011
Then click on links to logs of each attempt.
Failing the application. {code}
It is observed that the distributedshell client jar is from Hadoop 3.4.1,
while the node on which the AM container is launched runs Hadoop 3.3.3. In
this case "ADD_OPENS" is added to the AM launch command by the client
([code|https://github.com/apache/hadoop/blob/626b227094027ed08883af97a0734d2db7863864/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-distributedshell/src/main/java/org/apache/hadoop/yarn/applications/distributedshell/Client.java#L956])
but is not replaced by Hadoop 3.3.3. In addition, the node runs JDK 8, so it
is not able to interpret "ADD_OPENS".
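For illustration only, here is a minimal sketch of the shell symptom (not the actual YARN code path): if the unreplaced placeholder reaches /bin/bash in a position where bash expects a readable file, bash prints a message like the one in prelaunch.err and exits non-zero. The placeholder spelling and the command string below are assumptions made for the example, not the real launch command.
{code:java}
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class AddOpensSymptomRepro {
  public static void main(String[] args) throws Exception {
    // Hypothetical command string: the literal, unexpanded placeholder is
    // handed to bash, which parses "<ADD_OPENS" as an input redirection from
    // a file named ADD_OPENS that does not exist.
    Process p = new ProcessBuilder(
        "/bin/bash", "-c", "echo launching <ADD_OPENS> /dev/null")
        .redirectErrorStream(true)
        .start();
    try (BufferedReader r =
             new BufferedReader(new InputStreamReader(p.getInputStream()))) {
      // Prints something like: /bin/bash: ADD_OPENS: No such file or directory
      r.lines().forEach(System.out::println);
    }
    System.out.println("exit=" + p.waitFor()); // non-zero, as in the container log
  }
}
{code}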
Use case: this happens during the Ambari service check that runs while Hadoop
is being rolling-upgraded.
Can we add this parameter conditionally, as is already done for MapReduce
jobs? Refer to
[link|https://github.com/apache/hadoop/blob/626b227094027ed08883af97a0734d2db7863864/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/JobConf.java#L2213].
A sketch of the idea follows below.
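As a rough illustration only, and not the actual Hadoop code, the sketch below appends the --add-opens option only when the JVM is Java 9 or newer; the class name, the flag value, and the {{JAVA_HOME}} stand-in are assumptions made for the example.
{code:java}
import java.util.ArrayList;
import java.util.List;

public class AddOpensSketch {
  // Option that only JDK 9+ understands; harmless to omit on JDK 8.
  // The exact module/package list here is illustrative.
  private static final String ADD_OPENS_OPTION =
      "--add-opens=java.base/java.lang=ALL-UNNAMED";

  /** True when the running JVM is Java 9 or newer. */
  static boolean isJava9OrLater() {
    // "1.8" on JDK 8; "9", "11", "17", ... on later releases.
    String spec = System.getProperty("java.specification.version", "1.8");
    return !spec.startsWith("1.");
  }

  /** Builds a java command prefix, adding --add-opens only when supported. */
  static List<String> buildJavaCommand(int amMemoryMb) {
    List<String> vargs = new ArrayList<>();
    // Stand-in for the JAVA_HOME expansion the client normally emits.
    vargs.add("{{JAVA_HOME}}/bin/java");
    vargs.add("-Xmx" + amMemoryMb + "m");
    if (isJava9OrLater()) {
      vargs.add(ADD_OPENS_OPTION);
    }
    return vargs;
  }

  public static void main(String[] args) {
    System.out.println(String.join(" ", buildJavaCommand(512)));
  }
}
{code}
Note that this sketch checks the JVM of the process that assembles the command; in a mixed-version, mixed-JDK cluster the JVM that actually runs the AM lives on the NodeManager host, so a configuration-driven guard may be preferable.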