minixar commented on PR #3718:
URL: https://github.com/apache/ambari/pull/3718#issuecomment-2725020407

   Hi @smallyao 
   I tried to deploy Bigtop 3.3.0 on Rocky Linux 8 + Java 1.8 + Python 3 and got the following errors:
   
![image](https://github.com/user-attachments/assets/fff35296-a6e4-48f3-8d48-28bfee7d80ea)
   On every node I get similar errors:
   stderr
   2025-03-14 11:09:46,574 - Reporting component version failed
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 428, in execute
       self.save_component_version_to_structured_out(self.command_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 263, in save_component_version_to_structured_out
       stack_select_package_name = stack_select.get_package_name()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 126, in get_package_name
       package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 252, in get_packages
       supported_packages = get_supported_packages()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 164, in get_supported_packages
       raise Fail(f"Unable to query for supported packages using {stack_selector_path}")
   resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/lib/bigtop-select/distro-select
   NoneType: None
   
   The above exception was the cause of the following exception:
   
   2025-03-14 11:09:50,237 - Reporting component version failed
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 413, in execute
       method(env)
     File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HDFS/package/scripts/datanode.py", line 59, in install
       self.install_packages(env)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 994, in install_packages
       name = self.format_package_name(package["name"])
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 653, in format_package_name
       return self.get_package_from_available(name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 615, in get_package_from_available
       raise Fail(f"No package found for {name}(expected name: {name_with_version})")
   resource_management.core.exceptions.Fail: No package found for hadoop_${stack_version}(expected name: hadoop_3_3_0)
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 428, in execute
       self.save_component_version_to_structured_out(self.command_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 263, in save_component_version_to_structured_out
       stack_select_package_name = stack_select.get_package_name()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 126, in get_package_name
       package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 252, in get_packages
       supported_packages = get_supported_packages()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 164, in get_supported_packages
       raise Fail(f"Unable to query for supported packages using {stack_selector_path}")
   resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/lib/bigtop-select/distro-select
   Traceback (most recent call last):
     File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HDFS/package/scripts/datanode.py", line 196, in <module>
       DataNode().execute()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 413, in execute
       method(env)
     File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HDFS/package/scripts/datanode.py", line 59, in install
       self.install_packages(env)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 994, in install_packages
       name = self.format_package_name(package["name"])
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 653, in format_package_name
       return self.get_package_from_available(name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 615, in get_package_from_available
       raise Fail(f"No package found for {name}(expected name: {name_with_version})")
   resource_management.core.exceptions.Fail: No package found for hadoop_${stack_version}(expected name: hadoop_3_3_0)
   
   
   stdout
   2025-03-14 11:09:41,981 - Stack Feature Version Info: Cluster Stack=3.3.0, Command Stack=None, Command Version=None -> 3.3.0
   2025-03-14 11:09:41,983 - Using hadoop conf dir: /etc/hadoop/conf
   2025-03-14 11:09:41,984 - Skipping param: datanode_max_locked_memory, due to Configuration parameter 'dfs.datanode.max.locked.memory' was not found in configurations dictionary!
   2025-03-14 11:09:41,984 - Skipping param: dfs_ha_namenode_ids, due to Configuration parameter 'dfs.ha.namenodes' was not found in configurations dictionary!
   2025-03-14 11:09:41,984 - Skipping param: falcon_user, due to Configuration parameter 'falcon-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: nfsgateway_heapsize, due to Configuration parameter 'nfsgateway_heapsize' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: oozie_user, due to Configuration parameter 'oozie-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: zeppelin_group, due to Configuration parameter 'zeppelin-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,985 - Skipping param: zeppelin_user, due to Configuration parameter 'zeppelin-env' was not found in configurations dictionary!
   2025-03-14 11:09:41,986 - Group['kms'] {}
   2025-03-14 11:09:41,987 - Group['flink'] {}
   2025-03-14 11:09:41,987 - Group['spark'] {}
   2025-03-14 11:09:41,987 - Group['ranger'] {}
   2025-03-14 11:09:41,987 - Group['hdfs'] {}
   2025-03-14 11:09:41,987 - Group['hadoop'] {}
   2025-03-14 11:09:41,988 - User['yarn-ats'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:41,997 - User['hive'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:41,999 - User['infra-solr'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,002 - User['zookeeper'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,005 - User['ams'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,014 - User['ranger'] {'uid': None, 'gid': 'hadoop', 'groups': ['ranger', 'hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,018 - User['tez'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,021 - User['kms'] {'uid': None, 'gid': 'hadoop', 'groups': ['kms', 'hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,025 - User['flink'] {'uid': None, 'gid': 'hadoop', 'groups': ['flink', 'hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,027 - User['spark'] {'uid': None, 'gid': 'hadoop', 'groups': ['spark', 'hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,029 - User['ambari-qa'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,032 - User['hdfs'] {'uid': None, 'gid': 'hadoop', 'groups': ['hdfs', 'hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,034 - User['yarn'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,038 - User['mapred'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,040 - User['hbase'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,042 - User['hcat'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,044 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
   2025-03-14 11:09:42,046 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
   2025-03-14 11:09:42,052 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
   2025-03-14 11:09:42,053 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'mode': 0o775, 'create_parents': True, 'cd_access': 'a'}
   2025-03-14 11:09:42,054 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
   2025-03-14 11:09:42,056 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
   2025-03-14 11:09:42,056 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
   2025-03-14 11:09:42,067 - call returned (0, '1014')
   2025-03-14 11:09:42,068 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
   2025-03-14 11:09:42,077 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
   2025-03-14 11:09:42,078 - Group['hdfs'] {}
   2025-03-14 11:09:42,078 - User['hdfs'] {'groups': ['hdfs', 'hadoop', 'hdfs'], 'fetch_nonlocal_groups': True}
   2025-03-14 11:09:42,084 - FS Type: HDFS
   2025-03-14 11:09:42,084 - Directory['/etc/hadoop'] {'mode': 0o755}
   2025-03-14 11:09:42,084 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0o1777}
   2025-03-14 11:09:42,140 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
   2025-03-14 11:09:42,140 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
   2025-03-14 11:09:42,141 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
   2025-03-14 11:09:42,141 - Repository['BIGTOP-3.3.0-repo-51'] {'action': ['prepare'], 'base_url': 'http://repos.bigtop.apache.org/releases/3.3.0/rockylinux/8/x86_64', 'mirror_list': None, 'repo_file_name': 'ambari-bigtop-51', 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'components': ['bigtop', 'main']}
   2025-03-14 11:09:42,150 - Repository[None] {'action': ['create']}
   2025-03-14 11:09:42,152 - File['/tmp/tmpj9_s6pue'] {'content': b'[BIGTOP-3.3.0-repo-51]\nname=BIGTOP-3.3.0-repo-51\nbaseurl=http://repos.bigtop.apache.org/releases/3.3.0/rockylinux/8/x86_64\n\npath=/\nenabled=1\ngpgcheck=0', 'owner': 'root'}
   2025-03-14 11:09:42,153 - Writing File['/tmp/tmpj9_s6pue'] because contents don't match
   2025-03-14 11:09:42,153 - Moving /tmp/tmp1741964982.153123_667 to /tmp/tmpj9_s6pue
   2025-03-14 11:09:42,159 - File['/tmp/tmp1g8_ht5d'] {'content': StaticFile('/etc/yum.repos.d/ambari-bigtop-51.repo'), 'owner': 'root'}
   2025-03-14 11:09:42,159 - Writing File['/tmp/tmp1g8_ht5d'] because contents don't match
   2025-03-14 11:09:42,159 - Moving /tmp/tmp1741964982.1596704_820 to /tmp/tmp1g8_ht5d
   2025-03-14 11:09:42,165 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
   2025-03-14 11:09:44,470 - Skipping installation of existing package unzip
   2025-03-14 11:09:44,471 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
   2025-03-14 11:09:46,230 - Skipping installation of existing package curl
   2025-03-14 11:09:46,231 - Package['bigtop-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
   2025-03-14 11:09:46,527 - Skipping installation of existing package bigtop-select
   2025-03-14 11:09:46,574 - Reporting component version failed
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 428, in execute
       self.save_component_version_to_structured_out(self.command_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 263, in save_component_version_to_structured_out
       stack_select_package_name = stack_select.get_package_name()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 126, in get_package_name
       package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 252, in get_packages
       supported_packages = get_supported_packages()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 164, in get_supported_packages
       raise Fail(f"Unable to query for supported packages using {stack_selector_path}")
   resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/lib/bigtop-select/distro-select
   2025-03-14 11:09:47,431 - Using hadoop conf dir: /etc/hadoop/conf
   2025-03-14 11:09:47,433 - Stack Feature Version Info: Cluster Stack=3.3.0, Command Stack=None, Command Version=None -> 3.3.0
   2025-03-14 11:09:47,444 - Using hadoop conf dir: /etc/hadoop/conf
   The dfs.nameservices property is not set or not a string
   2025-03-14 11:09:47,450 - Skipping param: dfs_ha_namenode_ids, due to Configuration parameter 'dfs.ha.namenodes' was not found in configurations dictionary!
   2025-03-14 11:09:47,451 - Skipping param: falcon_user, due to Configuration parameter 'falcon-env' was not found in configurations dictionary!
   2025-03-14 11:09:47,451 - Skipping param: ha_zookeeper_quorum, due to Configuration parameter 'ha.zookeeper.quorum' was not found in configurations dictionary!
   2025-03-14 11:09:47,451 - Skipping param: hdfs_user_keytab, due to Configuration parameter 'hdfs_user_keytab' was not found in configurations dictionary!
   2025-03-14 11:09:47,452 - Skipping param: nfs_file_dump_dir, due to Configuration parameter 'nfs.file.dump.dir' was not found in configurations dictionary!
   2025-03-14 11:09:47,452 - Skipping param: nfsgateway_heapsize, due to Configuration parameter 'nfsgateway_heapsize' was not found in configurations dictionary!
   2025-03-14 11:09:47,452 - Skipping param: oozie_user, due to Configuration parameter 'oozie-env' was not found in configurations dictionary!
   2025-03-14 11:09:47,452 - Skipping param: smokeuser_principal, due to Configuration parameter 'smokeuser_principal_name' was not found in configurations dictionary!
   2025-03-14 11:09:47,467 - Command repositories: BIGTOP-3.3.0-repo-51
   2025-03-14 11:09:47,467 - Applicable repositories: BIGTOP-3.3.0-repo-51
   2025-03-14 11:09:47,468 - Looking for matching packages in the following repositories: BIGTOP-3.3.0-repo-51
   2025-03-14 11:09:50,237 - Reporting component version failed
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 413, in execute
       method(env)
     File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HDFS/package/scripts/datanode.py", line 59, in install
       self.install_packages(env)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 994, in install_packages
       name = self.format_package_name(package["name"])
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 653, in format_package_name
       return self.get_package_from_available(name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 615, in get_package_from_available
       raise Fail(f"No package found for {name}(expected name: {name_with_version})")
   resource_management.core.exceptions.Fail: No package found for hadoop_${stack_version}(expected name: hadoop_3_3_0)
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 428, in execute
       self.save_component_version_to_structured_out(self.command_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 263, in save_component_version_to_structured_out
       stack_select_package_name = stack_select.get_package_name()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 126, in get_package_name
       package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 252, in get_packages
       supported_packages = get_supported_packages()
     File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 164, in get_supported_packages
       raise Fail(f"Unable to query for supported packages using {stack_selector_path}")
   resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/lib/bigtop-select/distro-select
   
   Command failed after 1 tries
   
   
   I checked that the bigtop-select package is installed on all the hosts, but when I run its distro-select script I get the following output:
   [root@rocky8-big-data-cluster-n07 bigtop-select]# ./distro-select 
   alluxio - None
   alluxio-master - None
   alluxio-worker - None
   flink-client - None
   flink-historyserver - None
   hadoop-client - None
   hadoop-hdfs-client - None
   hadoop-hdfs-datanode - None
   hadoop-hdfs-journalnode - None
   hadoop-hdfs-namenode - None
   hadoop-hdfs-secondarynamenode - None
   hadoop-hdfs-zkfc - None
   hadoop-mapreduce-client - None
   hadoop-mapreduce-historyserver - None
   hadoop-yarn-client - None
   hadoop-yarn-nodemanager - None
   hadoop-yarn-resourcemanager - None
   hbase-client - None
   hbase-master - None
   hbase-regionserver - None
   hive-client - None
   hive-metastore - None
   hive-server2 - None
   hive-webhcat - None
   kafka-broker - None
   phoenix-client - None
   phoenix-server - None
   ranger-admin - None
   ranger-tagsync - None
   ranger-usersync - None
   solr-server - None
   spark-client - None
   spark-historyserver - None
   spark-thriftserver - None
   tez-client - None
   zeppelin-server - None
   zookeeper-client - None
   zookeeper-server - None
   
   I think there is a problem with the repo and with the package version selected for each service. Do you think it is possible that Bigtop's Rocky Linux 8 support is the problem?
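   
   If I read the install flow correctly, the agent expands hadoop_${stack_version} to hadoop_3_3_0 and then looks for a package with exactly that name in the configured repositories, so the failure would be consistent with the Rocky Linux 8 repo simply not shipping the underscore-versioned packages. One way to test that directly (repo id taken from the log above):
   
   # query only the Bigtop repo for the exact package name the agent expects
   dnf --disablerepo='*' --enablerepo='BIGTOP-3.3.0-repo-51' list available 'hadoop_3_3_0*'
   # or list everything the repo offers and look for versioned hadoop packages
   dnf --disablerepo='*' --enablerepo='BIGTOP-3.3.0-repo-51' list available | grep -i hadoop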
   
   

