I'm also not quite sure whether the procedure is correct. I tried it on my
cluster in a similar way, but it failed:
1. Put the cluster into global maintenance.
2. Upgraded each host to 4.5.7 by a) moving all VMs running on the host to
other hosts, b) removing the host from the cluster, c) reinstalling via the
CentOS Stream 9 ISO and `dnf install -y centos-release-ovirt45`, d) adding the
host to the cluster again.
3. Stopped ovirt-engine on the engine VM.
4. Backed up the engine and copied the backup to an existing host.
5. From that existing host, deployed the engine again while restoring the
backup (a rough command sketch follows the list).
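For reference, the CLI side of steps 1 and 3-5 looked roughly like this
(step 2 was done through the Administration Portal plus the ISO reinstall;
hostnames and file paths below are placeholders):
```
# Step 1: on one of the hosts, enable global maintenance
hosted-engine --set-maintenance --mode=global

# Step 3: on the engine VM, stop the engine service
systemctl stop ovirt-engine

# Step 4: on the engine VM, take a full backup and copy it to a host
engine-backup --mode=backup --scope=all \
    --file=engine-backup.tar.gz --log=engine-backup.log
scp engine-backup.tar.gz root@host1.example.com:/root/

# Step 5: on that host, redeploy the hosted engine from the backup
hosted-engine --deploy --restore-from-file=/root/engine-backup.tar.gz
```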
But the engine deployment failed. I tried it multiple times, to no avail.
Even with debug mode enabled, I couldn't figure out the underlying problem:
```
2026-01-22 19:10:05,948+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Restore
engine from file]
2026-01-22 19:10:06,650+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 skipping: [localhost]
2026-01-22 19:10:07,452+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Run
engine-setup with answerfile]
2026-01-22 19:10:52,229+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 changed: [localhost -> 192.168.1.96]
2026-01-22 19:10:53,133+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Make sure
`ovirt-engine` service is running]
2026-01-22 19:10:55,137+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 ok: [localhost -> 192.168.1.96]
2026-01-22 19:10:56,039+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Run
engine-config]
2026-01-22 19:10:57,643+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Restart
engine after engine-config]
2026-01-22 19:10:58,345+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 skipping: [localhost]
2026-01-22 19:10:59,147+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Check if
Engine health page is up]
2026-01-22 19:16:28,793+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109
{'redirected': False, 'url': 'http://localhost/ovirt-engine/services/health',
'status': 500, 'date': 'Thu, 22 Jan 2026 18:16:27 GMT',
'server': 'Apache/2.4.62 (CentOS Stream) OpenSSL/3.5.1', 'content_encoding':
'identity', 'content_type': 'text/html; charset=UTF-8', 'connection': 'close',
'transfer_encoding': 'chunked', 'elapsed': 0, 'changed': False, 'msg': 'Status
code was 500 and not [200]: HTTP Error 500: Internal Server Error',
'invocation':
{'module_args': {'url': 'http://localhost/ovirt-engine/services/health',
'status_code': [200], 'force': False, 'http_agent': 'ansible-httpget',
'use_proxy': True, 'validate_certs': True,
'force_basic_auth': False, 'use_gssapi': False, 'body_format': 'raw', 'method':
'GET', 'return_content': False, 'follow_redirects': 'safe', 'timeout': 30,
'headers': {}, 'remote_src': False, 'unredirected_headers': [], 'decompress':
True, 'use_netrc': True,
'unsafe_writes': False, 'url_username': None, 'url_password': None,
'client_cert': None, 'client_key': None, 'dest': None, 'body': None, 'src':
None, 'creates': None, 'removes': None, 'unix_socket': None, 'ca_path': None,
'ciphers': None,
'mode': None, 'owner': None, 'group': None, 'seuser': None, 'serole': None,
'selevel': None, 'setype': None, 'attributes': None}}, '_ansible_no_log':
False, 'attempts': 30, '_ansible_delegated_vars': {'ansible_host':
'192.168.1.96', 'ansible_port': None,
'ansible_user': 'root', 'ansible_connection': 'smart'}}
2026-01-22 19:16:28,894+0100 ERROR
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:113
fatal: [localhost -> 192.168.1.96]: FAILED! => {"attempts": 30, "changed":
false, "connection": "close", "content_encoding": "identity",
"content_type": "text/html; charset=UTF-8", "date": "Thu, 22 Jan 2026
18:16:27 GMT", "elapsed": 0, "msg": "Status code was 500 and not [200]: HTTP
Error 500: Internal Server Error", "redirected": false, "server":
"Apache/2.4.62 (CentOS Stream) OpenSSL/3.5.1", "status": 500,
"transfer_encoding": "chunked", "url":
"http://localhost/ovirt-engine/services/health"}
```
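The failing task just polls http://localhost/ovirt-engine/services/health on
the bootstrap VM (192.168.1.96) and expects HTTP 200, so the 500 must come
from the engine itself. Assuming root SSH to the bootstrap VM still works
while the deploy sits in its retry loop, my plan for the next attempt is to
check the engine side directly:
```
# SSH into the bootstrap engine VM used by the deploy playbook
ssh [email protected]

# Reproduce the failing health check locally
curl -v http://localhost/ovirt-engine/services/health

# Look for the exception behind the 500
journalctl -u ovirt-engine --no-pager | tail -n 100
tail -n 200 /var/log/ovirt-engine/engine.log
tail -n 200 /var/log/ovirt-engine/server.log
```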
As the setup also failed to copy the engine logs, I couldn't look into those
either:
```
2026-01-22 19:16:57,258+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Copy
engine logs]
2026-01-22 19:16:58,261+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109
{'results': [{'changed': True, 'stdout': '', 'stderr': 'libguestfs: trace:
set_verbose true\nlibguestfs: trace: set_verbose = 0\n
libguestfs: trace: set_backend "direct"\nlibguestfs: trace: set_backend =
0\nlibguestfs: create: flags = 0, handle = 0x555b13f148d0, program =
virt-copy-out\nlibguestfs: trace: add_drive
"/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"
"readonly:true"\nlibguestfs: creating COW overlay to protect original drive
content\nlibguestfs: trace: get_tmpdir\nlibguestfs: trace: get_tmpdir =
"/tmp"\nlibguestfs: trace: disk_create "/tmp/libguestfs7wkyEe/overlay1.qcow2"
"qcow2" -1
"backingfile:/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"\nlibguestfs:
trace: disk_format
"/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"\nlibguestfs:
command: run: qemu-img --help | grep -sqE -- \'\\binfo\\b.*-U\\b\'\nlibguestfs: command: run: qemu-img\nlibguestfs: command:
run: \\ info\nlibguestfs: command: run: \\ --output json\nlibguestfs: command:
run: \\
/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2\nqemu-img:
Could not open
\'/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2\':
Failed to get shared "write" lock\nIs another process using the image
[/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2]?\n
libguestfs: error: qemu-img info:
/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2:
qemu-img info exited with error status 1, see debug messages above\nlibguestfs:
trace: disk_format = NULL (error)\nlibguestfs: trace: disk_create = -1 (error)\n
libguestfs: trace: add_drive = -1 (error)\nlibguestfs: trace:
close\nlibguestfs: closing guestfs handle 0x555b13f148d0 (state 0)\nlibguestfs:
command: run: rm\nlibguestfs: command: run: \\ -rf /tmp/libguestfs7wkyEe',
'rc': 1, 'cmd':
['virt-copy-out', '-a',
'/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2',
'/var/log',
'/var/log/ovirt-hosted-engine-setup/engine-logs-2026-01-22T18:02:03Z/'],
'start': '2026-01-22 19:16:57.999751', 'end': '2026-01-22 19:16:58.038786',
'delta': '0:00:00.039035', 'failed': True, 'msg': 'non-zero return code',
'invocation': {'module_args': {'_raw_params': 'virt-copy-out -a
/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2
/var/log /var/log/ovirt-hosted-engine-setup/engine-logs-2026-01-22T18:02:03Z/',
'_uses_shell': False, 'stdin_add_newline': True, 'strip_empty_ends': True,
'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes':
None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines':
['libguestfs: trace: set_verbose true', 'libguestfs: trace: set_verbose = 0',
'libguestfs: trace: set_backend "direct"', 'libguestfs: trace: set_backend =
0', 'libguestfs: create: flags = 0, handle = 0x555b13f148d0, program =
virt-copy-out',
'libguestfs: trace: add_drive
"/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"
"readonly:true"', 'libguestfs: creating COW overlay to protect original drive
content', 'libguestfs: trace: get_tmpdir', 'libguestfs: trace:
get_tmpdir = "/tmp"', 'libguestfs: trace: disk_create
"/tmp/libguestfs7wkyEe/overlay1.qcow2" "qcow2" -1
"backingfile:/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"',
'libguestfs: trace: disk_format
"/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2"',
"libguestfs: command: run:
qemu-img --help | grep -sqE -- '\\binfo\\b.*-U\\b'", 'libguestfs: command: run:
qemu-img', 'libguestfs: command: run: \\ info', 'libguestfs: command: run: \\
--output json',
'libguestfs: command: run: \\
/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2',
'qemu-img: Could not open
\'/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2\':
Failed to get shared "write" lock',
'Is another process using the image
[/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2]?',
'libguestfs: error: qemu-img info:
/var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2:
qemu-img info exited with
error status 1, see debug messages above', 'libguestfs: trace: disk_format =
NULL (error)', 'libguestfs: trace: disk_create = -1 (error)', 'libguestfs:
trace: add_drive = -1 (error)', 'libguestfs: trace: close', 'libguestfs:
closing guestfs handle 0x555b13f148d0 (state 0)', 'libguestfs: command: run: rm', 'libguestfs: command:
run: \\ -rf /tmp/libguestfs7wkyEe'], '_ansible_no_log': False, 'item':
'/var/log', 'ansible_loop_var': 'item', '_ansible_item_label': '/var/log'}],
'changed': True, 'msg': 'One or more items failed'}
```
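The virt-copy-out failure itself looks secondary: qemu-img cannot get a shared
"write" lock on the appliance image, which suggests the bootstrap local VM was
still holding it when the log-copy task ran. Assuming that is the cause,
something like this should recover the logs manually once nothing holds the
image anymore:
```
# Check whether the bootstrap local VM is still running
virsh -c qemu:///system list --all

# See which process holds the appliance image open
lsof /var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2

# Once the local VM is down, retry the copy the setup attempted
mkdir -p /tmp/engine-logs
virt-copy-out -a \
    /var/tmp/localvmllob_pd7/ovirt-engine-appliance-centos9-4.5.7-1.el9.qcow2 \
    /var/log /tmp/engine-logs/
```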