Hi all,

Here is a problem (probably THE problem) I have found.

stderr:
2026-01-22 11:37:56,687 - The 'hadoop-hdfs-client' component did not advertise 
a version. This may indicate a problem with the component packaging. However, 
the stack-select tool was able to report a single version installed (3.3.0). 
This is the version that will be reported.
Command aborted. Reason: 'Server considered task failed and automatically 
aborted it'
stdout:
2026-01-22 11:37:53,649 - Stack Feature Version Info: Cluster Stack=3.3.0, 
Command Stack=None, Command Version=None -> 3.3.0
2026-01-22 11:37:53,650 - Using hadoop conf dir: /etc/hadoop/conf
2026-01-22 11:37:53,651 - Skipping param: datanode_max_locked_memory, due to 
Configuration parameter 'dfs.datanode.max.locked.memory' was not found in 
configurations dictionary!
2026-01-22 11:37:53,651 - Skipping param: dfs_ha_namenode_ids, due to 
Configuration parameter 'dfs.ha.namenodes' was not found in configurations 
dictionary!
2026-01-22 11:37:53,651 - Skipping param: falcon_user, due to Configuration 
parameter 'falcon-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: gmetad_user, due to Configuration 
parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: gmond_user, due to Configuration 
parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: hbase_user, due to Configuration 
parameter 'hbase-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: nfsgateway_heapsize, due to 
Configuration parameter 'nfsgateway_heapsize' was not found in configurations 
dictionary!
2026-01-22 11:37:53,652 - Skipping param: oozie_user, due to Configuration 
parameter 'oozie-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: repo_info, due to Configuration 
parameter 'repoInfo' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: zeppelin_group, due to Configuration 
parameter 'zeppelin-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Skipping param: zeppelin_user, due to Configuration 
parameter 'zeppelin-env' was not found in configurations dictionary!
2026-01-22 11:37:53,652 - Group['kms'] {}
2026-01-22 11:37:53,657 - Group['ranger'] {}
2026-01-22 11:37:53,658 - Group['hdfs'] {}
2026-01-22 11:37:53,660 - Group['hadoop'] {}
2026-01-22 11:37:53,669 - User['hive'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,674 - User['yarn-ats'] {'uid': None, 'gid': 'hadoop', 
'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,678 - User['infra-solr'] {'uid': None, 'gid': 'hadoop', 
'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,683 - User['zookeeper'] {'uid': None, 'gid': 'hadoop', 
'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,687 - User['ams'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,693 - User['ranger'] {'uid': None, 'gid': 'hadoop', 
'groups': ['ranger', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,698 - User['tez'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,701 - User['kms'] {'uid': None, 'gid': 'hadoop', 'groups': 
['kms', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,704 - User['ambari-qa'] {'uid': None, 'gid': 'hadoop', 
'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,707 - User['solr'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,710 - User['hdfs'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hdfs', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,714 - User['yarn'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,718 - User['hcat'] {'uid': None, 'gid': 'hadoop', 'groups': 
['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,722 - User['mapred'] {'uid': None, 'gid': 'hadoop', 
'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,725 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] 
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
2026-01-22 11:37:53,727 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh 
ambari-qa 
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2026-01-22 11:37:53,741 - Skipping 
Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa 
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
 0'] due to not_if
2026-01-22 11:37:53,742 - Group['hdfs'] {}
2026-01-22 11:37:53,747 - User['hdfs'] {'groups': ['hdfs', 'hadoop', 'hdfs'], 
'fetch_nonlocal_groups': True}
2026-01-22 11:37:53,750 - FS Type: HDFS
2026-01-22 11:37:53,750 - Directory['/etc/hadoop'] {'mode': 0o755}
2026-01-22 11:37:53,751 - 
Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 
'group': 'hadoop', 'mode': 0o1777}
2026-01-22 11:37:53,829 - Skipping param: gmetad_user, due to Configuration 
parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-22 11:37:53,830 - Skipping param: gmond_user, due to Configuration 
parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-22 11:37:53,830 - Skipping param: hbase_user, due to Configuration 
parameter 'hbase-env' was not found in configurations dictionary!
2026-01-22 11:37:53,830 - Skipping param: repo_info, due to Configuration 
parameter 'repoInfo' was not found in configurations dictionary!
2026-01-22 11:37:53,831 - Repository['BIGTOP-3.3.0-repo-2'] {'action': 
['prepare'], 'base_url': 'http://ba-amb-control01.hq.eset.com:8888/', 
'mirror_list': None, 'repo_file_name': 'ambari-bigtop-2', 'repo_template': 
'[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list 
%}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif 
%}\n\npath=/\nenabled=1\ngpgcheck=0', 'components': ['bigtop', 'main']}
2026-01-22 11:37:53,838 - Repository[None] {'action': ['create']}
2026-01-22 11:37:53,839 - File['/tmp/tmpglktiznv'] {'content': 
b'[BIGTOP-3.3.0-repo-2]\nname=BIGTOP-3.3.0-repo-2\nbaseurl=http://ba-amb-control01.hq.eset.com:8888/\n\npath=/\nenabled=1\ngpgcheck=0',
 'owner': 'root'}
2026-01-22 11:37:53,842 - Writing File['/tmp/tmpglktiznv'] because contents 
don't match
2026-01-22 11:37:53,845 - Moving /tmp/tmp1769078273.8421648_982 to 
/tmp/tmpglktiznv
2026-01-22 11:37:53,865 - Rewriting /etc/yum.repos.d/ambari-bigtop-2.repo since 
it has changed.
2026-01-22 11:37:53,865 - File['/etc/yum.repos.d/ambari-bigtop-2.repo'] 
{'content': StaticFile('/tmp/tmpglktiznv')}
2026-01-22 11:37:53,866 - Writing File['/etc/yum.repos.d/ambari-bigtop-2.repo'] 
because it doesn't exist
2026-01-22 11:37:53,866 - Moving /tmp/tmp1769078273.8661828_445 to 
/etc/yum.repos.d/ambari-bigtop-2.repo
2026-01-22 11:37:53,881 - Package['unzip'] {'retry_on_repo_unavailability': 
False, 'retry_count': 5}
2026-01-22 11:37:54,788 - Skipping installation of existing package unzip
2026-01-22 11:37:54,788 - Package['curl'] {'retry_on_repo_unavailability': 
False, 'retry_count': 5}
2026-01-22 11:37:55,492 - Skipping installation of existing package curl
2026-01-22 11:37:55,492 - Package['bigtop-select'] 
{'retry_on_repo_unavailability': False, 'retry_count': 5}
2026-01-22 11:37:56,456 - Skipping installation of existing package 
bigtop-select
2026-01-22 11:37:56,616 - call[('ambari-python-wrap', 
'/usr/lib/bigtop-select/distro-select', 'versions')] {}
2026-01-22 11:37:56,687 - call returned (0, '3.3.0')
2026-01-22 11:37:56,687 - The 'hadoop-hdfs-client' component did not advertise 
a version. This may indicate a problem with the component packaging. However, 
the stack-select tool was able to report a single version installed (3.3.0). 
This is the version that will be reported.
2026-01-22 11:37:57,295 - Using hadoop conf dir: /etc/hadoop/conf
2026-01-22 11:37:57,298 - Stack Feature Version Info: Cluster Stack=3.3.0, 
Command Stack=None, Command Version=None -> 3.3.0
2026-01-22 11:37:57,305 - Using hadoop conf dir: /etc/hadoop/conf
The dfs.nameservices property is not set or not a string
2026-01-22 11:37:57,311 - Skipping param: dfs_ha_namenode_ids, due to 
Configuration parameter 'dfs.ha.namenodes' was not found in configurations 
dictionary!
2026-01-22 11:37:57,311 - Skipping param: falcon_user, due to Configuration 
parameter 'falcon-env' was not found in configurations dictionary!
2026-01-22 11:37:57,311 - Skipping param: ha_zookeeper_quorum, due to 
Configuration parameter 'ha.zookeeper.quorum' was not found in configurations 
dictionary!
2026-01-22 11:37:57,312 - Skipping param: hbase_user, due to Configuration 
parameter 'hbase-env' was not found in configurations dictionary!
2026-01-22 11:37:57,312 - Skipping param: hdfs_user_keytab, due to 
Configuration parameter 'hdfs_user_keytab' was not found in configurations 
dictionary!
2026-01-22 11:37:57,312 - Skipping param: nfs_file_dump_dir, due to 
Configuration parameter 'nfs.file.dump.dir' was not found in configurations 
dictionary!
2026-01-22 11:37:57,312 - Skipping param: nfsgateway_heapsize, due to 
Configuration parameter 'nfsgateway_heapsize' was not found in configurations 
dictionary!
2026-01-22 11:37:57,313 - Skipping param: oozie_user, due to Configuration 
parameter 'oozie-env' was not found in configurations dictionary!
2026-01-22 11:37:57,313 - Skipping param: smokeuser_principal, due to 
Configuration parameter 'smokeuser_principal_name' was not found in 
configurations dictionary!
2026-01-22 11:37:57,325 - Command repositories: BIGTOP-3.3.0-repo-2
2026-01-22 11:37:57,325 - Applicable repositories: BIGTOP-3.3.0-repo-2
2026-01-22 11:37:57,325 - Looking for matching packages in the following 
repositories: BIGTOP-3.3.0-repo-2


Command aborted. Reason: 'Server considered task failed and automatically 
aborted it'

Command failed after 1 tries

DNF.LOG has the following error:

DEBUG Unknown configuration option: path = / in 
/etc/yum.repos.d/ambari-bigtop-1.repo

I thought I had located where path=/ was coming from, but this is still occurring.
Also, if I edit the target ambari-bigtop-*.repo file to remove this line, whatever
script generates it puts it back in again, and we still get installation failures.
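
For reference, this is roughly the temporary cleanup I have been doing (just a
sketch, assuming the generated files live under /etc/yum.repos.d and the offending
key is literally 'path'); it does not stick, because the agent rewrites the .repo
file on the next command:

# Strip the unsupported 'path' key from the generated Bigtop repo files.
# Temporary only -- Ambari regenerates these files on the next install/restart run.
for f in /etc/yum.repos.d/ambari-bigtop-*.repo; do
    sudo sed -i '/^path=/d' "$f"
done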

2026-01-22 11:37:53,831 - Repository['BIGTOP-3.3.0-repo-2'] {'action': 
['prepare'], 'base_url': 'http://ba-amb-control01.hq.eset.com:8888/', 
'mirror_list': None, 'repo_file_name': 'ambari-bigtop-2', 'repo_template': 
'[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list 
%}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif 
%}\n\npath=/\nenabled=1\ngpgcheck=0', 'components': ['bigtop', 'main']}

2026-01-22 11:37:53,838 - Repository[None] {'action': ['create']}

2026-01-22 11:37:53,839 - File['/tmp/tmpglktiznv'] {'content': 
b'[BIGTOP-3.3.0-repo-2]\nname=BIGTOP-3.3.0-repo-2\nbaseurl=http://ba-amb-control01.hq.eset.com:8888/\n\npath=/\nenabled=1\ngpgcheck=0',
 'owner': 'root'}


Where can I PERMANENTLY disable this? Furthermore, WHY is it included at all?
There is NO documentation that describes a 'path' declaration in a yum repository
configuration file. The Ambari documentation you provide suggests that we create
/var/www/html/ambari-repo and then run the following script to create the yum
config used to install the Ambari components. Why not just check for this file and
use it if it exists, instead of generating another one?

# For Rocky Linux 9:
sudo tee /etc/yum.repos.d/ambari.repo << EOF
[ambari]
name=Ambari Repository
baseurl=http://your-server-ip/ambari-repo
gpgcheck=0
enabled=1
EOF
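
In the meantime, to check whether the unknown 'path' option is actually fatal for
dnf or merely cosmetic, something like this should tell us (only a sketch; the repo
id is taken from the log above):

# Read only the generated Bigtop repo and refresh its metadata; a real repo-level
# failure should surface here rather than only as a DEBUG line in dnf.log.
sudo dnf -v --disablerepo='*' --enablerepo='BIGTOP-3.3.0-repo-2' repolist
sudo dnf --disablerepo='*' --enablerepo='BIGTOP-3.3.0-repo-2' makecache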

Marc
