Hi all,

every day at 01:00 AM we perform a daily backup of many VMs hosted on multiple 
hosts (all running oVirt 4.3) using a custom Python 3 script (based on the SDK), 
and everything works "almost" fine.

There is a single VM (Windows Server 2016) with a 600 GB disk (actual disk 
usage is about 150 GB), hosted on a single node, that shows a strange 
behaviour.

1) The export starts a few seconds after the script is executed; we use the 
"vm_service.export_to_path_on_host" method to export the VM as an OVA file (a 
minimal sketch of the call is shown after the log excerpt below);
2) After a few minutes I see in the engine the command "START, 
DumpXmlsVDSCommand", which completes instantly on both the host side and the 
engine side, and that's fine:

2020-07-02 01:05:45,428+0200 INFO  (jsonrpc/7) [api.host] START 
dumpxmls(vmList=[u'10e88cab-ec4f-4491-b51f-94e3d2e81a0a']) 
from=::ffff:$ENGINE_IP,39308 (api:48)
2020-07-02 01:05:45,428+0200 INFO  (jsonrpc/7) [api.host] FINISH dumpxmls 
return={... LONG XML ...}
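
For reference, here is roughly how the script triggers the export. This is a
minimal sketch only: the connection details, VM name, host name and target path
are placeholder examples, not our production values.

    import ovirtsdk4 as sdk
    import ovirtsdk4.types as types

    # Connect to the engine (placeholder URL and credentials).
    connection = sdk.Connection(
        url='https://engine.example.com/ovirt-engine/api',
        username='admin@internal',
        password='***',
        ca_file='ca.pem',
    )

    vms_service = connection.system_service().vms_service()
    vm = vms_service.list(search='name=myvm')[0]
    vm_service = vms_service.vm_service(vm.id)

    # Export the VM as an OVA file into a directory on the chosen host.
    vm_service.export_to_path_on_host(
        host=types.Host(name='myhost'),
        directory='/backup',
        filename='myvm_daily_2020Jul02.ova',
    )

    connection.close()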


3) About 3 hours later I see the following logs about the export task:
        
        
2020-07-02 04:11:39,201+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetVolumeInfoVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-91) 
[895d6936-f45c-4766-afd4-408a4e4e9a41] START, GetVolumeInfoVDSCommand(HostName 
= $OVIRT_HOST, 
GetVolumeInfoVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 storagePoolId='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
storageDomainId='6775c41c-7d67-451b-8beb-4fd086eade2e', 
imageGroupId='a084fa36-0f93-45c2-a323-ea9ca2d16677', 
imageId='55b3eac5-05b2-4bae-be50-37cde7050697'}), log id: 3cbf2c7c
2020-07-02 04:11:39,299+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetQemuImageInfoVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-91) 
[895d6936-f45c-4766-afd4-408a4e4e9a41] START, 
GetQemuImageInfoVDSCommand(HostName = $OVIRT_HOST, 
GetVolumeInfoVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 storagePoolId='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
storageDomainId='6775c41c-7d67-451b-8beb-4fd086eade2e', 
imageGroupId='a084fa36-0f93-45c2-a323-ea9ca2d16677', 
imageId='55b3eac5-05b2-4bae-be50-37cde7050697'}), log id: 47d5122e
2020-07-02 04:43:21,339+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.PrepareImageVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-19) [312261bb] START, 
PrepareImageVDSCommand(HostName = $OVIRT_HOST, 
PrepareImageVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 28c880d1
2020-07-02 04:43:21,650+02 INFO  
[org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor] 
(EE-ManagedThreadFactory-engineScheduled-Thread-19) [312261bb] Executing 
Ansible command: ANSIBLE_STDOUT_CALLBACK=imagemeasureplugin 
/usr/bin/ansible-playbook --ssh-common-args=-F 
/var/lib/ovirt-engine/.ssh/config 
--private-key=/etc/pki/ovirt-engine/keys/engine_id_rsa 
--inventory=/tmp/ansible-inventory1121551177697847734 
--extra-vars=image_path="/rhev/data-center/mnt/_data/6775c41c-7d67-451b-8beb-4fd086eade2e/images/a084fa36-0f93-45c2-a323-ea9ca2d16677/5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04"
 /usr/share/ovirt-engine/playbooks/ovirt-image-measure.yml [Logfile: 
/var/log/ovirt-engine/ova/ovirt-image-measure-ansible-20200702044321-$OVIRT_HOST-312261bb.log]
2020-07-02 05:05:20,428+02 INFO  
[org.ovirt.engine.core.vdsbroker.irsbroker.UploadStreamVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-8) [64c95c4d] START, 
UploadStreamVDSCommand(HostName = $OVIRT_HOST, 
UploadStreamVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 666809cb
2020-07-02 05:05:22,104+02 INFO  
[org.ovirt.engine.core.vdsbroker.irsbroker.UploadStreamVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-8) [64c95c4d] START, 
UploadStreamVDSCommand(HostName = $OVIRT_HOST, 
UploadStreamVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 33ff46bf
2020-07-02 05:05:29,602+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(EE-ManagedThreadFactory-engine-Thread-287) [64c95c4d] START, 
HSMClearTaskVDSCommand(HostName = $OVIRT_HOST, 
HSMTaskGuidBaseVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 taskId='e40a2740-37f6-455d-bed7-554efef761ff'}), log id: f01ff26
2020-07-02 05:05:29,619+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(EE-ManagedThreadFactory-engine-Thread-269) [64c95c4d] START, 
HSMClearTaskVDSCommand(HostName = $OVIRT_HOST, 
HSMTaskGuidBaseVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 taskId='1c94df38-88c1-4bf8-89a2-8c322513b21b'}), log id: 6150e13b  
        
4) And then, after several hours, the export fails with the following logs:

Engine logs:

2020-07-02 12:43:21,653+02 ERROR [org.ovirt.engine.core.bll.CreateOvaCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-19) [312261bb] Failed to 
measure image: null. Please check logs for more details: 
/var/log/ovirt-engine/ova/ovirt-image-measure-ansible-20200702044321-$OVIRT_HOST-312261bb.log
2020-07-02 12:43:23,741+02 ERROR 
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] 
(EE-ManagedThreadFactory-engineScheduled-Thread-28) [] EVENT_ID: 
IMPORTEXPORT_EXPORT_VM_TO_OVA_FAILED(1,225), Failed to export Vm 
$OVIRT_VM_TO_EXPORT as a Virtual Appliance to path 
/backup/$OVIRT_VM_TO_EXPORT_daily_2020Jul02.ova on Host $OVIRT_HOST
2020-07-02 12:43:26,162+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.MergeVDSCommand] 
(EE-ManagedThreadFactory-commandCoordinator-Thread-9) [54b569e8] START, 
MergeVDSCommand(HostName = $OVIRT_HOST, 
MergeVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5', 
vmId='10e88cab-ec4f-4491-b51f-94e3d2e81a0a', 
storagePoolId='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
storageDomainId='6775c41c-7d67-451b-8beb-4fd086eade2e', 
imageGroupId='a084fa36-0f93-45c2-a323-ea9ca2d16677', 
imageId='55b3eac5-05b2-4bae-be50-37cde7050697', 
baseImageId='5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 
topImageId='55b3eac5-05b2-4bae-be50-37cde7050697', bandwidth='0'}), log id: 
a0b417d
2020-07-02 12:43:26,328+02 ERROR 
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] 
(EE-ManagedThreadFactory-commandCoordinator-Thread-9) [54b569e8] EVENT_ID: 
VDS_BROKER_COMMAND_FAILURE(10,802), VDSM $OVIRT_HOST command MergeVDS failed: 
Merge failed
2020-07-02 12:43:26,328+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.MergeVDSCommand] 
(EE-ManagedThreadFactory-commandCoordinator-Thread-9) [54b569e8] HostName = 
$OVIRT_HOST
2020-07-02 12:43:26,328+02 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.MergeVDSCommand] 
(EE-ManagedThreadFactory-commandCoordinator-Thread-9) [54b569e8] Command 
'MergeVDSCommand(HostName = $OVIRT_HOST, 
MergeVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5', 
vmId='10e88cab-ec4f-4491-b51f-94e3d2e81a0a', 
storagePoolId='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
storageDomainId='6775c41c-7d67-451b-8beb-4fd086eade2e', 
imageGroupId='a084fa36-0f93-45c2-a323-ea9ca2d16677', 
imageId='55b3eac5-05b2-4bae-be50-37cde7050697', 
baseImageId='5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 
topImageId='55b3eac5-05b2-4bae-be50-37cde7050697', bandwidth='0'})' execution 
failed: VDSGenericException: VDSErrorException: Failed to MergeVDS, error = 
Merge failed, code = 52
2020-07-02 12:43:28,996+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.DumpXmlsVDSCommand] 
(EE-ManagedThreadFactory-commandCoordinator-Thread-2) [54b569e8] START, 
DumpXmlsVDSCommand(HostName = $OVIRT_HOST, 
Params:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5', 
vmIds='[10e88cab-ec4f-4491-b51f-94e3d2e81a0a]'}), log id: 65709a96
2020-07-02 13:05:38,206+02 INFO  
[org.ovirt.engine.core.vdsbroker.irsbroker.UploadStreamVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-45) [5179fcb] START, 
UploadStreamVDSCommand(HostName = $OVIRT_HOST, 
UploadStreamVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 44e9ec23
2020-07-02 13:05:39,851+02 INFO  
[org.ovirt.engine.core.vdsbroker.irsbroker.UploadStreamVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-45) [5179fcb] START, 
UploadStreamVDSCommand(HostName = $OVIRT_HOST, 
UploadStreamVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 68231c70
2020-07-02 13:05:40,356+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(EE-ManagedThreadFactory-engine-Thread-240) [5179fcb] START, 
HSMClearTaskVDSCommand(HostName = $OVIRT_HOST, 
HSMTaskGuidBaseVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 taskId='5d61682b-cf47-4637-9014-c647eb5265ee'}), log id: 718885f3
2020-07-02 13:05:50,357+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(EE-ManagedThreadFactory-engine-Thread-290) [5179fcb] START, 
HSMClearTaskVDSCommand(HostName = $OVIRT_HOST, 
HSMTaskGuidBaseVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5',
 taskId='5709eb6c-a656-4a4d-b67f-72dd0bebf3e4'}), log id: 40c18d57
        


And here is the vdsm.log of the host:

2020-07-02 12:43:26,169+0200 INFO  (jsonrpc/6) [api.virt] START 
merge(drive={u'imageID': u'a084fa36-0f93-45c2-a323-ea9ca2d16677', u'volumeID': 
u'55b3eac5-05b2-4bae-be50-37cde7050697', u'domainID': 
u'6775c41c-7d67-451b-8beb-4fd086eade2e', u'poolID': 
u'79d774b7-ca5b-49c2-baf8-9275ba3f1a84'}, 
baseVolUUID=u'5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 
topVolUUID=u'55b3eac5-05b2-4bae-be50-37cde7050697', bandwidth=u'0', 
jobUUID=u'b49fa67b-4a5a-4f73-823f-9e33a8e20cd2') from=::ffff:$ENGINE_IP,39308, 
flow_id=54b569e8, vmId=10e88cab-ec4f-4491-b51f-94e3d2e81a0a (api:48)
2020-07-02 12:43:26,196+0200 INFO  (jsonrpc/6) [vdsm.api] START 
getVolumeInfo(sdUUID=u'6775c41c-7d67-451b-8beb-4fd086eade2e', 
spUUID='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
imgUUID=u'a084fa36-0f93-45c2-a323-ea9ca2d16677', 
volUUID=u'5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', options=None) 
from=::ffff:$ENGINE_IP,39308, flow_id=54b569e8, 
task_id=bb1d1fd0-9746-4949-9c6c-b069947f5875 (api:48)
2020-07-02 12:43:26,199+0200 INFO  (jsonrpc/6) [storage.VolumeManifest] Info 
request: sdUUID=6775c41c-7d67-451b-8beb-4fd086eade2e 
imgUUID=a084fa36-0f93-45c2-a323-ea9ca2d16677 volUUID = 
5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04  (volume:240)
2020-07-02 12:43:26,205+0200 INFO  (jsonrpc/6) [storage.VolumeManifest] 
6775c41c-7d67-451b-8beb-4fd086eade2e/a084fa36-0f93-45c2-a323-ea9ca2d16677/5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04
 info is {'status': 'OK', 'domain': '6775c41c-7d67-451b-8beb-4fd086eade2e', 
'voltype': 'INTERNAL', 'description': '', 'parent': 
'7e802d0d-d6cd-4979-8f8d-5f4c5d5c3013', 'format': 'COW', 'generation': 0, 
'image': 'a084fa36-0f93-45c2-a323-ea9ca2d16677', 'disktype': 'DATA', 
'legality': 'LEGAL', 'mtime': '0', 'apparentsize': '6635716608', 'children': 
[], 'pool': '', 'ctime': '1593558283', 'capacity': '644245094400', 'uuid': 
u'5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 'truesize': '6637277184', 'type': 
'SPARSE'} (volume:279)
2020-07-02 12:43:26,205+0200 INFO  (jsonrpc/6) [vdsm.api] FINISH getVolumeInfo 
return={'info': {'status': 'OK', 'domain': 
'6775c41c-7d67-451b-8beb-4fd086eade2e', 'voltype': 'INTERNAL', 'description': 
'', 'parent': '7e802d0d-d6cd-4979-8f8d-5f4c5d5c3013', 'format': 'COW', 
'generation': 0, 'image': 'a084fa36-0f93-45c2-a323-ea9ca2d16677', 'disktype': 
'DATA', 'legality': 'LEGAL', 'mtime': '0', 'apparentsize': 
'6635716608','children': [], 'pool': '', 'ctime': '1593558283', 'capacity': 
'644245094400', 'uuid': u'5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 'truesize': 
'6637277184', 'type': 'SPARSE'}} from=::ffff:$ENGINE_IP,39308, 
flow_id=54b569e8, task_id=bb1d1fd0-9746-4949-9c6c-b069947f5875 (api:54)
2020-07-02 12:43:26,206+0200 INFO  (jsonrpc/6) [vdsm.api] START 
getVolumeInfo(sdUUID=u'6775c41c-7d67-451b-8beb-4fd086eade2e', 
spUUID='79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
imgUUID=u'a084fa36-0f93-45c2-a323-ea9ca2d16677', 
volUUID=u'55b3eac5-05b2-4bae-be50-37cde7050697', options=None) 
from=::ffff:$ENGINE_IP,39308, flow_id=54b569e8, 
task_id=32ce9b21-b9f3-4319-a229-b32df1fb9283 (api:48)
2020-07-02 12:43:26,208+0200 INFO  (jsonrpc/6) [storage.VolumeManifest] Info 
request: sdUUID=6775c41c-7d67-451b-8beb-4fd086eade2e 
imgUUID=a084fa36-0f93-45c2-a323-ea9ca2d16677 volUUID = 
55b3eac5-05b2-4bae-be50-37cde7050697  (volume:240)
2020-07-02 12:43:26,219+0200 INFO  (jsonrpc/6) [storage.VolumeManifest] 
6775c41c-7d67-451b-8beb-4fd086eade2e/a084fa36-0f93-45c2-a323-ea9ca2d16677/55b3eac5-05b2-4bae-be50-37cde7050697
 info is {'status': 'OK', 'domain': '6775c41c-7d67-451b-8beb-4fd086eade2e', 
'voltype': 'LEAF', 'description': '', 'parent': 
'5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 'format': 'COW', 'generation': 0, 
'image': 'a084fa36-0f93-45c2-a323-ea9ca2d16677', 'disktype': 'DATA', 
'legality': 'LEGAL', 'mtime': '0', 'apparentsize': '4077387776', 'children': 
[], 'pool': '', 'ctime': '1593644682', 'capacity': '644245094400', 'uuid': 
u'55b3eac5-05b2-4bae-be50-37cde7050697', 'truesize': '4077940736', 'type': 
'SPARSE'} (volume:279)
2020-07-02 12:43:26,219+0200 INFO  (jsonrpc/6) [vdsm.api] FINISH getVolumeInfo 
return={'info': {'status': 'OK', 'domain': 
'6775c41c-7d67-451b-8beb-4fd086eade2e', 'voltype': 'LEAF', 'description': '', 
'parent': '5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04', 'format': 'COW', 
'generation': 0, 'image': 'a084fa36-0f93-45c2-a323-ea9ca2d16677', 'disktype': 
'DATA', 'legality': 'LEGAL', 'mtime': '0', 'apparentsize': '4077387776', 
'children': [], 'pool': '', 'ctime': '1593644682', 'capacity': '644245094400', 
'uuid': u'55b3eac5-05b2-4bae-be50-37cde7050697', 'truesize': '4077940736', 
'type': 'SPARSE'}} from=::ffff:$ENGINE_IP,39308, flow_id=54b569e8, 
task_id=32ce9b21-b9f3-4319-a229-b32df1fb9283 (api:54)
2020-07-02 12:43:26,235+0200 INFO  (jsonrpc/6) [virt.vm] 
(vmId='10e88cab-ec4f-4491-b51f-94e3d2e81a0a') Starting merge with 
jobUUID=u'b49fa67b-4a5a-4f73-823f-9e33a8e20cd2', original 
chain=e2bad960-6ad4-44b6-8ff0-69dcca6bf722 < 
7e802d0d-d6cd-4979-8f8d-5f4c5d5c3013 < 5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04 < 
55b3eac5-05b2-4bae-be50-37cde7050697 (top), disk='sda', base='sda[1]', 
top=None, bandwidth=0, flags=12 (vm:5951)
2020-07-02 12:43:26,280+0200 ERROR (jsonrpc/6) [virt.vm] 
(vmId='10e88cab-ec4f-4491-b51f-94e3d2e81a0a') Live merge failed (job: 
b49fa67b-4a5a-4f73-823f-9e33a8e20cd2) (vm:5957)
2020-07-02 12:43:26,296+0200 INFO  (jsonrpc/6) [api.virt] FINISH merge 
return={'status': {'message': 'Merge failed', 'code': 52}} 
from=::ffff:$ENGINE_IP,39308, flow_id=54b569e8, 
vmId=10e88cab-ec4f-4491-b51f-94e3d2e81a0a (api:54)
2020-07-02 12:43:26,296+0200 INFO  (jsonrpc/6) [jsonrpc.JsonRpcServer] RPC call 
VM.merge failed (error 52) in 0.13 seconds (__init__:312)
2020-07-02 12:43:26,627+0200 INFO  (jsonrpc/1) [vdsm.api] START 
getSpmStatus(spUUID=u'79d774b7-ca5b-49c2-baf8-9275ba3f1a84', options=None) 
from=::ffff:$ENGINE_IP,39308, flow_id=4968c8cb, 
task_id=ee82b2d6-39f5-43ad-83a3-803f1af8e234 (api:48)
2020-07-02 12:43:26,628+0200 INFO  (jsonrpc/1) [vdsm.api] FINISH getSpmStatus 
return={'spm_st': {'spmId': 1, 'spmStatus': 'SPM', 'spmLver': -1}} 
from=::ffff:$ENGINE_IP,39308, flow_id=4968c8cb, 
task_id=ee82b2d6-39f5-43ad-83a3-803f1af8e234 (api:54)
2020-07-02 12:43:26,628+0200 INFO  (jsonrpc/1) [jsonrpc.JsonRpcServer] RPC call 
StoragePool.getSpmStatus succeeded in 0.00 seconds (__init__:312)
2020-07-02 12:43:26,645+0200 INFO  (jsonrpc/3) [vdsm.api] START 
getStoragePoolInfo(spUUID=u'79d774b7-ca5b-49c2-baf8-9275ba3f1a84', 
options=None) from=::ffff:$ENGINE_IP,39304, flow_id=4968c8cb, 
task_id=ab1124bc-651d-4ade-aae4-ff9fea23f6a0 (api:48)
2020-07-02 12:43:26,646+0200 INFO  (jsonrpc/3) [vdsm.api] FINISH 
getStoragePoolInfo return={'info': {'name': 'No Description', 'isoprefix': '', 
'pool_status': 'connected', 'lver': -1, 'domains': 
u'6775c41c-7d67-451b-8beb-4fd086eade2e:Active', 'master_uuid': 
u'6775c41c-7d67-451b-8beb-4fd086eade2e', 'version': '5', 'spm_id': 1, 'type': 
'LOCALFS', 'master_ver': 1}, 'dominfo': 
{u'6775c41c-7d67-451b-8beb-4fd086eade2e': {'status': u'Active', 'diskfree': 
'3563234983936', 'isoprefix': '', 'alerts': [], 'disktotal': '3923683311616', 
'version': 5}}} from=::ffff:$ENGINE_IP,39304, flow_id=4968c8cb, 
task_id=ab1124bc-651d-4ade-aae4-ff9fea23f6a0 (api:54)
2020-07-02 12:43:26,647+0200 INFO  (jsonrpc/3) [jsonrpc.JsonRpcServer] RPC call 
StoragePool.getInfo succeeded in 0.01 seconds (__init__:312)


5) This task seems to block all other export tasks, leaving them pending. There 
were about 5 export tasks and 1 template import task, and as soon as we killed 
the process (shown further below), the other 4 export tasks and the import 
completed within a few minutes.


2020-07-02 04:43:21,339+02 INFO  
[org.ovirt.engine.core.vdsbroker.vdsbroker.PrepareImageVDSCommand] 
(EE-ManagedThreadFactory-engineScheduled-Thread-19) [312261bb] START, 
PrepareImageVDSCommand(HostName = $OVIRT_HOST, 
PrepareImageVDSCommandParameters:{hostId='e8c07142-fcd8-4f78-9158-ffb2caa06dc5'}),
 log id: 28c880d1
2020-07-02 04:43:21,650+02 INFO  
[org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor] 
(EE-ManagedThreadFactory-engineScheduled-Thread-19) [312261bb] Executing 
Ansible command: ANSIBLE_STDOUT_CALLBACK=imagemeasureplugin 
/usr/bin/ansible-playbook --ssh-common-args=-F 
/var/lib/ovirt-engine/.ssh/config 
--private-key=/etc/pki/ovirt-engine/keys/engine_id_rsa 
--inventory=/tmp/ansible-inventory1121551177697847734 
--extra-vars=image_path="/rhev/data-center/mnt/_data/6775c41c-7d67-451b-8beb-4fd086eade2e/images/a084fa36-0f93-45c2-a323-ea9ca2d16677/5b76cc6c-1a6b-4e02-8ce4-dc80faf9ed04"
 /usr/share/ovirt-engine/playbooks/ovirt-image-measure.yml [Logfile: 
/var/log/ovirt-engine/ova/ovirt-image-measure-ansible-20200702044321-$OVIRT_HOST-312261bb.log]

The process that we killed was:

vdsm     25338 25332 99 04:14 pts/0    07:40:09 qemu-img measure -O qcow2 
/rhev/data-center/mnt/_data/6775c41c-7d67-451b-8beb-4fd086eade2e/images/a084fa36-0f93-45c2-a323-ea9ca2d16677/55b3eac5-05b2-4bae-be50-37cde705069

Running strace -p on PID 25338 showed the following lines (so the process 
appears to still be walking the image looking for data/hole extents rather 
than being completely stuck):

lseek(11, 3056795648, SEEK_DATA)        = 3056795648
lseek(11, 3056795648, SEEK_HOLE)        = 13407092736
lseek(14, 128637468672, SEEK_DATA)      = 128637468672
lseek(14, 128637468672, SEEK_HOLE)      = 317708828672
lseek(14, 128646250496, SEEK_DATA)      = 128646250496
lseek(14, 128646250496, SEEK_HOLE)      = 317708828672
lseek(14, 128637730816, SEEK_DATA)      = 128637730816
lseek(14, 128637730816, SEEK_HOLE)      = 317708828672
lseek(14, 128646774784, SEEK_DATA)      = 128646774784
lseek(14, 128646774784, SEEK_HOLE)      = 317708828672
lseek(14, 128646709248, SEEK_DATA)      = 128646709248
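
For what it's worth, a rough way to re-run the same measurement by hand
(outside the export flow, with a timeout) would be something like the sketch
below, to check whether it ever finishes. qemu-img measure is the same command
the ovirt-image-measure playbook runs, and the image path is taken from the
logs above purely as an example.

    import subprocess

    # Leaf volume of the problematic disk, as seen in the engine/vdsm logs above.
    image = ('/rhev/data-center/mnt/_data/6775c41c-7d67-451b-8beb-4fd086eade2e/'
             'images/a084fa36-0f93-45c2-a323-ea9ca2d16677/'
             '55b3eac5-05b2-4bae-be50-37cde7050697')

    try:
        # Same command the playbook runs; give up after 30 minutes (Python 3.7+).
        result = subprocess.run(
            ['qemu-img', 'measure', '-O', 'qcow2', image],
            capture_output=True, text=True, timeout=1800, check=True,
        )
        print(result.stdout)
    except subprocess.TimeoutExpired:
        print('qemu-img measure did not finish within 30 minutes')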


Can you help us figure out what is going on?

If any other information is needed, I'll provide it ASAP.

Thank you for your patience and your time,
Regards 

Francesco
