Hi!
I have a question about Ambari.
In my Ambari dashboard I see:
    JournalNodes  2/3 JournalNodes Live
    NFSGateways   0/1 Started
but the services actually work fine, and if I try to start the supposedly
stopped component, the log looks like this:
2016-08-29 09:22:48,840 - File['/var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid'] {'action': ['delete'], 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid'}
2016-08-29 09:22:48,911 - Skipping File['/var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid'] due to not_if
2016-08-29 09:22:48,912 - Execute['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start journalnode''] {'environment': {'HADOOP_LIBEXEC_DIR': '/usr/hdp/current/hadoop-client/libexec'}, 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid'}
2016-08-29 09:22:48,982 - Skipping Execute['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start journalnode''] due to not_if

Exactly as the log says, the JournalNode pid file exists and the process is
running fine right now. So what is wrong with Ambari?
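For what it's worth, the skip comes from the not_if guard shown in the log:
Ambari only runs the start command when that check fails. A minimal sketch
of the same check, using the pid-file path from the log and plain
test/pgrep in place of the ambari-sudo.sh wrapper:

```shell
#!/bin/sh
# Re-run the not_if guard from the log by hand. The pid-file path is taken
# from the log above; ambari-sudo.sh is replaced with plain test/pgrep for
# brevity (run as root or the hdfs user so the pid file is readable).
PIDFILE=/var/run/hadoop/hdfs/hadoop-hdfs-journalnode.pid

if test -f "$PIDFILE" && pgrep -F "$PIDFILE" >/dev/null; then
    # Guard succeeds: a live process matches the pid file, so Ambari
    # skips the start ("Skipping Execute[...] due to not_if").
    echo "JournalNode already running (pid $(cat "$PIDFILE"))"
else
    echo "JournalNode not running; Ambari would execute the start command"
fi
```

Since the log shows both halves of the guard passing on this host, the
daemon really is up, which matches what I observe.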
Thanks!