Hi,


I'm using Apache Ambari 2.7.5 to set up and maintain a Hadoop cluster on CentOS 
7.9. I have already managed to add a NameNode and a DataNode on the local host 
(the ambari-master), but adding another DataNode fails with "warning 
encountered", and in /var/lib/ambari-agent/data/output-466.txt the last lines 
are:

2023-05-14 08:20:01,741 - FS Type: HDFS

2023-05-14 08:20:01,741 - XmlConfig['core-site.xml'] {'group': 'hadoop', 
'conf_dir': '/etc/hadoop/conf', 'xml_include_file': None, 
'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 
'hdfs', 'only_if': 'ls /etc/hadoop/conf', 'configurations': ...}

2023-05-14 08:20:01,755 - Generating config: /etc/hadoop/conf/core-site.xml

2023-05-14 08:20:01,755 - File['/etc/hadoop/conf/core-site.xml'] {'owner': 
'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 
'encoding': 'UTF-8'}

2023-05-14 08:20:01,769 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] 
{'create_parents': True, 'mode': 0755, 'cd_access': 'a'}

2023-05-14 08:20:01,769 - No logsearch configuration exists at 
/var/lib/ambari-agent/cache/stacks/BGTP/1.0/services/HDFS/package/templates/input.config-hdfs.json.j2

2023-05-14 08:20:01,769 - Could not load 'version' from 
/var/lib/ambari-agent/data/structured-out-466.json



Since all necessary packages were already installed by the ambari-agent and all 
configs had been generated, I tried to start the DataNode manually with "hdfs 
--daemon start datanode" after creating an hdfs directory in /home and chowning 
it to the hdfs user, and it started successfully and registered with the 
NameNode.
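For reference, the manual workaround was roughly the following (the /home/hdfs 
path and the hadoop group here are assumptions on my part; the data directory 
must match whatever dfs.datanode.data.dir is set to in hdfs-site.xml):

```shell
# Create a local data directory for the DataNode
# (path is an assumption; it must match dfs.datanode.data.dir in hdfs-site.xml)
mkdir -p /home/hdfs/data

# Give ownership to the hdfs service user (group name assumed to be hadoop)
chown -R hdfs:hadoop /home/hdfs

# Start the DataNode as the hdfs user; it then registers with the NameNode
sudo -u hdfs hdfs --daemon start datanode
```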



How can I find out why Ambari does not finish the task of adding the DataNode? 
How can I find out which steps Ambari executes to install the DataNode, and at 
which step it gets stuck?





Thanks in advance
