[
https://issues.apache.org/jira/browse/AMBARI-25885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
LiJie2023 updated AMBARI-25885:
-------------------------------
Description:
When I use Ambari to install services, some warning messages appear with the
prompt "Command aborted. Reason: 'Server considered task failed and
automatically aborted it'". However, when I click "Retry", the installation
succeeds. What is the reason? My error differs from the one described in
+https://issues.apache.org/jira/browse/AMBARI-25069+
!image-2023-03-08-11-12-27-370.png|width=1525,height=348!
Detailed log:
{code:java}
stderr:
Command aborted. Reason: 'Server considered task failed and automatically
aborted it'
{code}
{code:java}
stdout:
2023-03-06 17:23:16,264 - Stack Feature Version Info: Cluster Stack=1.0,
Command Stack=None, Command Version=None -> 1.0
2023-03-06 17:23:16,267 - Group['flink'] {}
2023-03-06 17:23:16,275 - Adding group Group['flink']
2023-03-06 17:23:16,346 - Group['elasticsearch'] {}
2023-03-06 17:23:16,347 - Adding group Group['elasticsearch']
2023-03-06 17:23:16,362 - Group['spark'] {}
2023-03-06 17:23:16,363 - Adding group Group['spark']
2023-03-06 17:23:16,376 - Group['hdfs'] {}
2023-03-06 17:23:16,376 - Adding group Group['hdfs']
2023-03-06 17:23:16,389 - Group['hadoop'] {}
2023-03-06 17:23:16,390 - Adding group Group['hadoop']
2023-03-06 17:23:16,403 - Group['kibana'] {}
2023-03-06 17:23:16,404 - Adding group Group['kibana']
2023-03-06 17:23:16,418 - User['hive'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:16,418 - Adding user User['hive']
2023-03-06 17:23:17,167 - User['zookeeper'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,168 - Adding user User['zookeeper']
2023-03-06 17:23:17,192 - User['efak'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,193 - Adding user User['efak']
2023-03-06 17:23:17,217 - User['ams'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,217 - Adding user User['ams']
2023-03-06 17:23:17,241 - User['hubble'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,241 - Adding user User['hubble']
2023-03-06 17:23:17,266 - User['flink'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['flink', 'hadoop'], 'uid': None}
2023-03-06 17:23:17,266 - Adding user User['flink']
2023-03-06 17:23:17,290 - User['hugegraph'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,290 - Adding user User['hugegraph']
2023-03-06 17:23:17,316 - User['elasticsearch'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,316 - Adding user User['elasticsearch']
2023-03-06 17:23:17,389 - User['spark'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2023-03-06 17:23:17,389 - Adding user User['spark']
2023-03-06 17:23:17,452 - User['ambari-qa'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,452 - Adding user User['ambari-qa']
2023-03-06 17:23:17,595 - User['kafka'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,595 - Adding user User['kafka']
2023-03-06 17:23:17,622 - User['hdfs'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2023-03-06 17:23:17,622 - Adding user User['hdfs']
2023-03-06 17:23:17,645 - User['yarn'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,645 - Adding user User['yarn']
2023-03-06 17:23:17,669 - User['kibana'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'kibana'], 'uid': None}
2023-03-06 17:23:17,669 - Adding user User['kibana']
2023-03-06 17:23:17,692 - User['mapred'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,693 - Adding user User['mapred']
2023-03-06 17:23:17,881 - User['hbase'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2023-03-06 17:23:17,882 - Adding user User['hbase']
2023-03-06 17:23:18,436 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-03-06 17:23:18,440 - Writing
File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2023-03-06 17:23:18,440 - Changing permission for
/var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2023-03-06 17:23:18,441 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2023-03-06 17:23:18,446 - Skipping
Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
0'] due to not_if
2023-03-06 17:23:18,446 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase',
'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2023-03-06 17:23:18,446 - Creating directory Directory['/tmp/hbase-hbase']
since it doesn't exist.
2023-03-06 17:23:18,447 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2023-03-06 17:23:18,447 - Changing permission for /tmp/hbase-hbase from 755 to
775
2023-03-06 17:23:18,447 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-03-06 17:23:18,448 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-03-06 17:23:18,449 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase']
{}
2023-03-06 17:23:18,457 - call returned (0, '1015')
2023-03-06 17:23:18,458 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase
/home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015']
{'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2023-03-06 17:23:18,463 - Skipping
Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase
/home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015']
due to not_if
2023-03-06 17:23:18,463 - Group['hdfs'] {}
2023-03-06 17:23:18,464 - User['hdfs'] {'fetch_nonlocal_groups': True,
'groups': ['hdfs', 'hadoop', u'hdfs']}
2023-03-06 17:23:18,465 - FS Type: HDFS
2023-03-06 17:23:18,465 - Directory['/etc/hadoop'] {'mode': 0755}
2023-03-06 17:23:18,465 - Creating directory Directory['/etc/hadoop'] since it
doesn't exist.
2023-03-06 17:23:18,465 -
Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs',
'group': 'hadoop', 'mode': 01777}
2023-03-06 17:23:18,466 - Creating directory
Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't
exist.
2023-03-06 17:23:18,466 - Changing owner for
/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2023-03-06 17:23:18,466 - Changing group for
/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2023-03-06 17:23:18,466 - Changing permission for
/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
2023-03-06 17:23:18,478 - Repository['BGTP-1.0-repo-1'] {'base_url':
'http://master.bigdata.repo:5376/bigtop', 'action': ['prepare'], 'components':
[u'BGTP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if
mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif
%}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-bgtp-1',
'mirror_list': None}
2023-03-06 17:23:18,486 - Repository[None] {'action': ['create']}
2023-03-06 17:23:18,487 - File['/tmp/tmpFcKB36'] {'content':
'[BGTP-1.0-repo-1]\nname=BGTP-1.0-repo-1\nbaseurl=http://master.bigdata.repo:5376/bigtop\n\npath=/\nenabled=1\ngpgcheck=0'}
2023-03-06 17:23:18,488 - Writing File['/tmp/tmpFcKB36'] because contents don't
match
2023-03-06 17:23:18,488 - Rewriting /etc/yum.repos.d/ambari-bgtp-1.repo since
it has changed.
2023-03-06 17:23:18,488 - File['/etc/yum.repos.d/ambari-bgtp-1.repo']
{'content': StaticFile('/tmp/tmpFcKB36')}
2023-03-06 17:23:18,489 - Writing File['/etc/yum.repos.d/ambari-bgtp-1.repo']
because it doesn't exist
2023-03-06 17:23:18,489 - Package['unzip'] {'retry_on_repo_unavailability':
False, 'retry_count': 5}
Command aborted. Reason: 'Server considered task failed and automatically
aborted it'
Command failed after 1 tries
{code}
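The stdout above stops just as the agent begins installing Package['unzip'], and
the stderr shows that it was the Ambari server, not the agent, that marked the
task as failed and aborted it; the retry then completes normally. One plausible
explanation is that the server-side task timeout expired while a slow step (such
as the first yum install against the local repository) was still in progress. A
possible workaround, assuming the agent.task.timeout and
agent.package.install.task.timeout properties (and their units in seconds) apply
to this Ambari version, is to raise them in
/etc/ambari-server/conf/ambari.properties; the values below are illustrative
examples, not defaults:
{code}
# /etc/ambari-server/conf/ambari.properties
# Server-side timeout for a single agent task (seconds); example raised value
agent.task.timeout=1800
# Server-side timeout for package-install tasks (seconds); example raised value
agent.package.install.task.timeout=3600
{code}
After editing the file, run "ambari-server restart" so the new timeouts take
effect.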
> “Server considered task failed and automatically aborted it”
> ------------------------------------------------------------
>
> Key: AMBARI-25885
> URL: https://issues.apache.org/jira/browse/AMBARI-25885
> Project: Ambari
> Issue Type: Bug
> Components: ambari-server
> Affects Versions: 2.7.5
> Reporter: LiJie2023
> Priority: Blocker
> Attachments: image-2023-03-08-11-12-27-370.png
>
>
> When I use Ambari to install services, some warning messages appear with the
> prompt "Command aborted. Reason: 'Server considered task failed and
> automatically aborted it'". However, when I click "Retry", the installation
> succeeds. What is the reason? My error differs from the one described in
> +https://issues.apache.org/jira/browse/AMBARI-25069+
>
> !image-2023-03-08-11-12-27-370.png|width=1525,height=348!
> Detailed log:
>
> {code:java}
> stderr:
> Command aborted. Reason: 'Server considered task failed and automatically
> aborted it'
> {code}
> {code:java}
> stdout:
> 2023-03-06 17:23:16,264 - Stack Feature Version Info: Cluster Stack=1.0,
> Command Stack=None, Command Version=None -> 1.0
> 2023-03-06 17:23:16,267 - Group['flink'] {}
> 2023-03-06 17:23:16,275 - Adding group Group['flink']
> 2023-03-06 17:23:16,346 - Group['elasticsearch'] {}
> 2023-03-06 17:23:16,347 - Adding group Group['elasticsearch']
> 2023-03-06 17:23:16,362 - Group['spark'] {}
> 2023-03-06 17:23:16,363 - Adding group Group['spark']
> 2023-03-06 17:23:16,376 - Group['hdfs'] {}
> 2023-03-06 17:23:16,376 - Adding group Group['hdfs']
> 2023-03-06 17:23:16,389 - Group['hadoop'] {}
> 2023-03-06 17:23:16,390 - Adding group Group['hadoop']
> 2023-03-06 17:23:16,403 - Group['kibana'] {}
> 2023-03-06 17:23:16,404 - Adding group Group['kibana']
> 2023-03-06 17:23:16,418 - User['hive'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:16,418 - Adding user User['hive']
> 2023-03-06 17:23:17,167 - User['zookeeper'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,168 - Adding user User['zookeeper']
> 2023-03-06 17:23:17,192 - User['efak'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,193 - Adding user User['efak']
> 2023-03-06 17:23:17,217 - User['ams'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,217 - Adding user User['ams']
> 2023-03-06 17:23:17,241 - User['hubble'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,241 - Adding user User['hubble']
> 2023-03-06 17:23:17,266 - User['flink'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['flink', 'hadoop'], 'uid': None}
> 2023-03-06 17:23:17,266 - Adding user User['flink']
> 2023-03-06 17:23:17,290 - User['hugegraph'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,290 - Adding user User['hugegraph']
> 2023-03-06 17:23:17,316 - User['elasticsearch'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,316 - Adding user User['elasticsearch']
> 2023-03-06 17:23:17,389 - User['spark'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
> 2023-03-06 17:23:17,389 - Adding user User['spark']
> 2023-03-06 17:23:17,452 - User['ambari-qa'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,452 - Adding user User['ambari-qa']
> 2023-03-06 17:23:17,595 - User['kafka'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,595 - Adding user User['kafka']
> 2023-03-06 17:23:17,622 - User['hdfs'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
> 2023-03-06 17:23:17,622 - Adding user User['hdfs']
> 2023-03-06 17:23:17,645 - User['yarn'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,645 - Adding user User['yarn']
> 2023-03-06 17:23:17,669 - User['kibana'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'kibana'], 'uid': None}
> 2023-03-06 17:23:17,669 - Adding user User['kibana']
> 2023-03-06 17:23:17,692 - User['mapred'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,693 - Adding user User['mapred']
> 2023-03-06 17:23:17,881 - User['hbase'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
> 2023-03-06 17:23:17,882 - Adding user User['hbase']
> 2023-03-06 17:23:18,436 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2023-03-06 17:23:18,440 - Writing
> File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
> 2023-03-06 17:23:18,440 - Changing permission for
> /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
> 2023-03-06 17:23:18,441 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
> ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
> 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2023-03-06 17:23:18,446 - Skipping
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
> 0'] due to not_if
> 2023-03-06 17:23:18,446 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase',
> 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2023-03-06 17:23:18,446 - Creating directory Directory['/tmp/hbase-hbase']
> since it doesn't exist.
> 2023-03-06 17:23:18,447 - Changing owner for /tmp/hbase-hbase from 0 to hbase
> 2023-03-06 17:23:18,447 - Changing permission for /tmp/hbase-hbase from 755
> to 775
> 2023-03-06 17:23:18,447 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2023-03-06 17:23:18,448 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2023-03-06 17:23:18,449 - call['/var/lib/ambari-agent/tmp/changeUid.sh
> hbase'] {}
> 2023-03-06 17:23:18,457 - call returned (0, '1015')
> 2023-03-06 17:23:18,458 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
> hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase
> 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2023-03-06 17:23:18,463 - Skipping
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015']
> due to not_if
> 2023-03-06 17:23:18,463 - Group['hdfs'] {}
> 2023-03-06 17:23:18,464 - User['hdfs'] {'fetch_nonlocal_groups': True,
> 'groups': ['hdfs', 'hadoop', u'hdfs']}
> 2023-03-06 17:23:18,465 - FS Type: HDFS
> 2023-03-06 17:23:18,465 - Directory['/etc/hadoop'] {'mode': 0755}
> 2023-03-06 17:23:18,465 - Creating directory Directory['/etc/hadoop'] since
> it doesn't exist.
> 2023-03-06 17:23:18,465 -
> Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner':
> 'hdfs', 'group': 'hadoop', 'mode': 01777}
> 2023-03-06 17:23:18,466 - Creating directory
> Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't
> exist.
> 2023-03-06 17:23:18,466 - Changing owner for
> /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
> 2023-03-06 17:23:18,466 - Changing group for
> /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
> 2023-03-06 17:23:18,466 - Changing permission for
> /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
> 2023-03-06 17:23:18,478 - Repository['BGTP-1.0-repo-1'] {'base_url':
> 'http://master.bigdata.repo:5376/bigtop', 'action': ['prepare'],
> 'components': [u'BGTP', 'main'], 'repo_template':
> '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{%
> else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0',
> 'repo_file_name': 'ambari-bgtp-1', 'mirror_list': None}
> 2023-03-06 17:23:18,486 - Repository[None] {'action': ['create']}
> 2023-03-06 17:23:18,487 - File['/tmp/tmpFcKB36'] {'content':
> '[BGTP-1.0-repo-1]\nname=BGTP-1.0-repo-1\nbaseurl=http://master.bigdata.repo:5376/bigtop\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2023-03-06 17:23:18,488 - Writing File['/tmp/tmpFcKB36'] because contents
> don't match
> 2023-03-06 17:23:18,488 - Rewriting /etc/yum.repos.d/ambari-bgtp-1.repo since
> it has changed.
> 2023-03-06 17:23:18,488 - File['/etc/yum.repos.d/ambari-bgtp-1.repo']
> {'content': StaticFile('/tmp/tmpFcKB36')}
> 2023-03-06 17:23:18,489 - Writing File['/etc/yum.repos.d/ambari-bgtp-1.repo']
> because it doesn't exist
> 2023-03-06 17:23:18,489 - Package['unzip'] {'retry_on_repo_unavailability':
> False, 'retry_count': 5}
> Command aborted. Reason: 'Server considered task failed and automatically
> aborted it'
> Command failed after 1 tries
> {code}
>
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]