[ 
https://issues.apache.org/jira/browse/HDFS-17570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

wuchang updated HDFS-17570:
---------------------------
    Description: 
h1. Description of PR

The {{-Dhadoop.root.logger}} JVM property takes its value from {{HADOOP_ROOT_LOGGER}}.
In most cases a Hadoop administrator sets {{HADOOP_ROOT_LOGGER}} directly: it is effective in non-daemon mode, and many administrators are not even aware that {{HADOOP_DAEMON_ROOT_LOGGER}} exists.
So when Hadoop is started in daemon mode, no matter how the administrator sets {{HADOOP_ROOT_LOGGER}}, the {{-Dhadoop.root.logger}} value does not change, because in daemon mode {{HADOOP_ROOT_LOGGER}} is overwritten with {{HADOOP_DAEMON_ROOT_LOGGER}}, which is in fact just a default value. The code is as below:

 
{code:bash}
if [[ "${HADOOP_DAEMON_MODE}" != "default" ]]; then
  HADOOP_ROOT_LOGGER="${HADOOP_DAEMON_ROOT_LOGGER}"
  if [[ "${HADOOP_SUBCMD_SECURESERVICE}" = true ]]; then
    HADOOP_LOGFILE="hadoop-${HADOOP_SECURE_USER}-${HADOOP_IDENT_STRING}-${HADOOP_SUBCMD}-${HOSTNAME}.log"
  else
    HADOOP_LOGFILE="hadoop-${HADOOP_IDENT_STRING}-${HADOOP_SUBCMD}-${HOSTNAME}.log"
  fi
fi
{code}
This makes log setup and troubleshooting very time-consuming: without reading the code, it is hard to figure out why the setting is ignored and what the best practice is.
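The clobbering is easy to reproduce outside of Hadoop. The following standalone sketch borrows the variable names from {{hadoop-functions.sh}}, but the values are illustrative only:

```shell
#!/usr/bin/env bash
# Standalone sketch of the current daemon-mode behavior; variable names
# mirror hadoop-functions.sh, but the values here are hypothetical.
HADOOP_DAEMON_ROOT_LOGGER="INFO,RFA"   # the shipped default
HADOOP_ROOT_LOGGER="DEBUG,console"     # the administrator's customization
HADOOP_DAEMON_MODE="start"             # any non-"default" value means daemon mode

if [[ "${HADOOP_DAEMON_MODE}" != "default" ]]; then
  # The user's HADOOP_ROOT_LOGGER is unconditionally overwritten:
  HADOOP_ROOT_LOGGER="${HADOOP_DAEMON_ROOT_LOGGER}"
fi

# Prints INFO,RFA: the DEBUG,console customization is silently lost.
echo "hadoop.root.logger=${HADOOP_ROOT_LOGGER}"
```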
h1. My change is:

In daemon mode:
 * If the user has customized {{HADOOP_ROOT_LOGGER}} but not {{HADOOP_DAEMON_ROOT_LOGGER}}, respect {{HADOOP_ROOT_LOGGER}} as the final {{-Dhadoop.root.logger}} value instead of overwriting it with the default {{HADOOP_DAEMON_ROOT_LOGGER}}, and print a warning.
 * In all other cases, use {{HADOOP_DAEMON_ROOT_LOGGER}} (whether default or customized) to set {{HADOOP_ROOT_LOGGER}}.
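The two rules above can be sketched as follows. The {{DEFAULT_*}} values are hypothetical stand-ins for how the real {{hadoop-functions.sh}} would detect "not customized"; this is an illustration of the intended precedence, not the actual patch:

```shell
#!/usr/bin/env bash
# Sketch of the proposed precedence in daemon mode. The DEFAULT_* values are
# hypothetical; the real detection logic lives in hadoop-functions.sh.
DEFAULT_DAEMON_LOGGER="INFO,RFA"
DEFAULT_ROOT_LOGGER="INFO,console"

HADOOP_ROOT_LOGGER="DEBUG,console"   # the administrator customized this ...
unset HADOOP_DAEMON_ROOT_LOGGER      # ... but not this
HADOOP_DAEMON_ROOT_LOGGER="${HADOOP_DAEMON_ROOT_LOGGER:-${DEFAULT_DAEMON_LOGGER}}"
HADOOP_DAEMON_MODE="start"

if [[ "${HADOOP_DAEMON_MODE}" != "default" ]]; then
  if [[ "${HADOOP_DAEMON_ROOT_LOGGER}" == "${DEFAULT_DAEMON_LOGGER}" &&
        "${HADOOP_ROOT_LOGGER}" != "${DEFAULT_ROOT_LOGGER}" ]]; then
    # HADOOP_ROOT_LOGGER was customized while HADOOP_DAEMON_ROOT_LOGGER kept
    # its default: respect the user's HADOOP_ROOT_LOGGER and warn.
    echo "WARN: Using customized HADOOP_ROOT_LOGGER as final hadoop.root.logger." >&2
  else
    # In every other case, HADOOP_DAEMON_ROOT_LOGGER (default or customized) wins.
    HADOOP_ROOT_LOGGER="${HADOOP_DAEMON_ROOT_LOGGER}"
  fi
fi

echo "hadoop.root.logger=${HADOOP_ROOT_LOGGER}"   # DEBUG,console is preserved
```

If both variables were customized, the {{else}} branch would take the customized {{HADOOP_DAEMON_ROOT_LOGGER}}, matching the testing table.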

h3. How was this patch tested?
||Testing Case||Expected Result||
|Customized {{HADOOP_LOGLEVEL}}|Respect {{HADOOP_LOGLEVEL}}|
|Without customizing anything|Respect default {{HADOOP_DAEMON_ROOT_LOGGER}} in daemon mode|
|Customize {{HADOOP_DAEMON_ROOT_LOGGER}} without customizing {{HADOOP_ROOT_LOGGER}}|Respect customized {{HADOOP_DAEMON_ROOT_LOGGER}}|
|Customize {{HADOOP_ROOT_LOGGER}} but not {{HADOOP_DAEMON_ROOT_LOGGER}}|Respect customized {{HADOOP_ROOT_LOGGER}}, with a warning|
|Customize both {{HADOOP_ROOT_LOGGER}} and {{HADOOP_DAEMON_ROOT_LOGGER}}|Respect customized {{HADOOP_DAEMON_ROOT_LOGGER}}|
 # Customize {{HADOOP_LOGLEVEL}}; should respect {{HADOOP_LOGLEVEL}} together with {{HADOOP_DAEMON_ROOT_LOGGER}}:
{code:bash}
export HADOOP_LOGLEVEL=DEBUG
{code}
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# ps -ef|grep NameNode
hadoop 2852464 1 45 04:04 pts/1 00:00:09 
/usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode 
-Djava.net.preferIPv4Stack=true -Dhdfs.audit.logger=INFO,NullAppender 
-Dlog4j.configuration=file:/home/hadoop/current-hadoop/etc/hadoop/log4j-namenode.properties
 -XX:+UseG1GC -Xmx4096m -Dcom.sun.management.jmxremote=true 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=1029 -Dhadoop.security.logger=ERROR,RFAS 
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 
-Dyarn.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dyarn.log.file=test-host-name.dev.corp.com.log 
-Dyarn.home.dir=/home/hadoop/hadoop-3.2.4 -Dyarn.root.logger=INFO,console 
-Djava.library.path=/home/hadoop/hadoop-3.2.4/lib/native 
-Dhadoop.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dhadoop.log.file=test-host-name.dev.corp.com.log 
-Dhadoop.home.dir=/home/hadoop/hadoop-3.2.4 -Dhadoop.id.str=hadoop 
-Dhadoop.root.logger=DEBUG,RFA -Dhadoop.policy.file=hadoop-policy.xml 
org.apache.hadoop.hdfs.server.namenode.NameNode{code}
 
 # Without customizing anything; should respect the default {{HADOOP_DAEMON_ROOT_LOGGER}}:
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# ps -ef|grep NameNode
hadoop   2855220       1 43 04:05 pts/1    00:00:07 
/usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode 
-Djava.net.preferIPv4Stack=true -Dhdfs.audit.logger=INFO,NullAppender 
-Dlog4j.configuration=file:/home/hadoop/current-hadoop/etc/hadoop/log4j-namenode.properties
 -XX:+UseG1GC -Xmx4096m -Dcom.sun.management.jmxremote=true 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=1029 -Dhadoop.security.logger=ERROR,RFAS 
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 
-Dyarn.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dyarn.log.file=test-host-name.dev.corp.com.log 
-Dyarn.home.dir=/home/hadoop/hadoop-3.2.4 -Dyarn.root.logger=INFO,console 
-Djava.library.path=/home/hadoop/hadoop-3.2.4/lib/native 
-Dhadoop.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dhadoop.log.file=test-host-name.dev.corp.com.log 
-Dhadoop.home.dir=/home/hadoop/hadoop-3.2.4 -Dhadoop.id.str=hadoop 
-Dhadoop.root.logger=INFO,RFA -Dhadoop.policy.file=hadoop-policy.xml 
org.apache.hadoop.hdfs.server.namenode.NameNode
{code}
 
 # Customize {{HADOOP_DAEMON_ROOT_LOGGER}} without customizing {{HADOOP_ROOT_LOGGER}}; should respect the customized {{HADOOP_DAEMON_ROOT_LOGGER}}:
{code:bash}
export HADOOP_DAEMON_ROOT_LOGGER=TRACE,RFA
{code}
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# ps -ef|grep NameNode
hadoop   2857093       1 99 04:06 pts/1    00:00:07 
/usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode 
-Djava.net.preferIPv4Stack=true -Dhdfs.audit.logger=INFO,NullAppender 
-Dlog4j.configuration=file:/home/hadoop/current-hadoop/etc/hadoop/log4j-namenode.properties
 -XX:+UseG1GC -Xmx4096m -Dcom.sun.management.jmxremote=true 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=1029 -Dhadoop.security.logger=ERROR,RFAS 
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 
-Dyarn.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dyarn.log.file=test-host-name.dev.corp.com.log 
-Dyarn.home.dir=/home/hadoop/hadoop-3.2.4 -Dyarn.root.logger=INFO,console 
-Djava.library.path=/home/hadoop/hadoop-3.2.4/lib/native 
-Dhadoop.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dhadoop.log.file=test-host-name.dev.corp.com.log 
-Dhadoop.home.dir=/home/hadoop/hadoop-3.2.4 -Dhadoop.id.str=hadoop 
-Dhadoop.root.logger=TRACE,RFA -Dhadoop.policy.file=hadoop-policy.xml 
org.apache.hadoop.hdfs.server.namenode.NameNode
root     2857504 2745542  0 04:07 pts/1    00:00:00 grep --color=auto NameNode
{code}
 
 # Customize {{HADOOP_ROOT_LOGGER}} but not {{HADOOP_DAEMON_ROOT_LOGGER}}; should respect the customized {{HADOOP_ROOT_LOGGER}} and print a warning:
{code:bash}
export HADOOP_ROOT_LOGGER=TRACE,console
{code}
 
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# sudo -u hadoop bin/hdfs --daemon stop namenode
Using customized HADOOP_ROOT_LOGGER as final hadoop.root.logger. Recommending using HADOOP_LOGLEVEL or customize HADOOP_DAEMON_ROOT_LOGGER for logging in daemon mode.
{code}
 
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# ps -ef|grep NameNode
hadoop   2874729       1 99 04:16 pts/1    00:00:07 
/usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode 
-Djava.net.preferIPv4Stack=true -Dhdfs.audit.logger=INFO,NullAppender 
-Dlog4j.configuration=file:/home/hadoop/current-hadoop/etc/hadoop/log4j-namenode.properties
 -XX:+UseG1GC -Xmx4096m -Dcom.sun.management.jmxremote=true 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=1029 -Dhadoop.security.logger=ERROR,RFAS 
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 
-Dyarn.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dyarn.log.file=test-host-name.dev.corp.com.log 
-Dyarn.home.dir=/home/hadoop/hadoop-3.2.4 -Dyarn.root.logger=INFO,console 
-Djava.library.path=/home/hadoop/hadoop-3.2.4/lib/native 
-Dhadoop.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dhadoop.log.file=test-host-name.dev.corp.com.log 
-Dhadoop.home.dir=/home/hadoop/hadoop-3.2.4 -Dhadoop.id.str=hadoop 
-Dhadoop.root.logger=TRACE,console -Dhadoop.policy.file=hadoop-policy.xml 
org.apache.hadoop.hdfs.server.namenode.NameNode
{code}
 
 # Customize both {{HADOOP_ROOT_LOGGER}} and {{HADOOP_DAEMON_ROOT_LOGGER}}; should respect the customized {{HADOOP_DAEMON_ROOT_LOGGER}}:
{code:bash}
export HADOOP_DAEMON_ROOT_LOGGER=TRACE,RFA
export HADOOP_ROOT_LOGGER=TRACE,console
{code}
 
{code:java}
root@rccd101-6a:/home/hadoop/current-hadoop# ps -ef|grep NameNode
hadoop   2877521       1 61 04:18 pts/1    00:00:07 
/usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode 
-Djava.net.preferIPv4Stack=true -Dhdfs.audit.logger=INFO,NullAppender 
-Dlog4j.configuration=file:/home/hadoop/current-hadoop/etc/hadoop/log4j-namenode.properties
 -XX:+UseG1GC -Xmx4096m -Dcom.sun.management.jmxremote=true 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=1029 -Dhadoop.security.logger=ERROR,RFAS 
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 
-Dyarn.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dyarn.log.file=test-host-name.dev.corp.com.log 
-Dyarn.home.dir=/home/hadoop/hadoop-3.2.4 -Dyarn.root.logger=INFO,console 
-Djava.library.path=/home/hadoop/hadoop-3.2.4/lib/native 
-Dhadoop.log.dir=/home/hadoop/hadoop-3.2.4/../logs 
-Dhadoop.log.file=test-host-name.dev.corp.com.log 
-Dhadoop.home.dir=/home/hadoop/hadoop-3.2.4 -Dhadoop.id.str=hadoop 
-Dhadoop.root.logger=TRACE,RFA -Dhadoop.policy.file=hadoop-policy.xml 
org.apache.hadoop.hdfs.server.namenode.NameNode
{code}
 

h3. For code changes:
 * Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
 * Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
 * If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0|http://www.apache.org/legal/resolved.html#category-a]?
 * If applicable, have you updated the {{LICENSE}}, {{LICENSE-binary}}, {{NOTICE-binary}} files?

> Respect Non-Default HADOOP_ROOT_LOGGER when HADOOP_DAEMON_ROOT_LOGGER is not 
> specified in Daemon mode
> -----------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-17570
>                 URL: https://issues.apache.org/jira/browse/HDFS-17570
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>            Reporter: wuchang
>            Priority: Major
>              Labels: pull-request-available
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
