Update: I found a workaround:

1. Stop the VM.
2. Migrate it to the destination storage.
3. Start the VM on the destination storage; at this point I get this error: File []/vmfs/volumes/5ef5902c-83b77eaa-5143-b4a9fc1bb3ae/i-2-1497-VM/ROOT-1497.vmdk was not found
4. If I check the destination I see the VM folder and the vmx, vmsd and hlog files, but no .vmdk file.
5. I manually copy the file ROOT-1497.vmdk from the initial source to /vmfs/volumes/5ef5902c-83b77eaa-5143-b4a9fc1bb3ae/i-2-1497-VM/ (a scripted sketch of this step is below).
6. Start the VM; everything works.
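In case it helps anyone scripting step 5, here is a minimal pyVmomi sketch of the manual vmdk copy, assuming direct vCenter access; the vCenter address, credentials, datacenter and datastore names below are placeholders, not values from my setup. A plain cp of the descriptor and -flat files on the ESXi host, or vmkfstools -i, should work just as well.

    # Minimal sketch of step 5: copy the orphaned ROOT vmdk into the destination
    # datastore folder via the vSphere API (pyVmomi). All names below are
    # placeholders/assumptions, not values taken from this environment.
    import ssl
    import time
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="vcenter.example.com",             # placeholder vCenter
                      user="administrator@vsphere.local",     # placeholder user
                      pwd="secret",                           # placeholder password
                      sslContext=ctx)
    content = si.RetrieveContent()

    # Datacenter that owns both datastores (placeholder name).
    dc = next(e for e in content.rootFolder.childEntity
              if isinstance(e, vim.Datacenter) and e.name == "DC-01")

    # CopyVirtualDisk_Task copies the descriptor and its flat extent in one call.
    task = content.virtualDiskManager.CopyVirtualDisk_Task(
        sourceName="[source-datastore] i-2-1497-VM/ROOT-1497.vmdk",  # placeholder datastore name
        sourceDatacenter=dc,
        destName="[dest-datastore] i-2-1497-VM/ROOT-1497.vmdk",      # placeholder datastore name
        destDatacenter=dc,
        force=False)

    # Poll until the copy task finishes.
    while task.info.state not in (vim.TaskInfo.State.success, vim.TaskInfo.State.error):
        time.sleep(2)
    print("copy finished with state:", task.info.state)

    Disconnect(si)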
Here we can see the error which I get when I start the VM I get before migrating manually the vmdk file : https://pastebin.com/VcEtV7R3 Regards, Cristian On 2020/06/28 06:31:50, <cristian.c@istream.today> wrote: > Hello, > > > > I just wanted to migrate a VM to a different host, I did this multiple > times in previous version (4.11) with success, IN this version after I turn > off and execute migrate, select destination I get in few seconds that is was > migrated, on VMware side I only see the following status : > > > > Relocate virtual machine i-2-3483-VM Completed > > Consolidate virtual machine disk files i-2-3483-VM Completed > > > > Nothing else, I do not see any volume export, no errors, only if I try to > start the VM, fails ofc. > > > > Relocate virtual machine > > i-2-3483-VM > > Unable to access the virtual > > machine configuration: Unable > > to access file [datastore1 (8)] i > > -2-3483-VM/i-2-3483-VM.vmx > > > > Please see the screenshots : https://imgur.com/a/t50qL0z > > > > > > Management logs : > > > > 020-06-28 02:15:23,133 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (qtp464887938-24:ctx-d702500a ctx-ff7ed9a8) (logid:ca91d1eb) submit async > job-50375, details: AsyncJobVO {id:50375, userId: 2, accountId: 2, > instanceType: None, instanceId: null, cmd: > org.apache.cloudstack.api.command.admin.vm.MigrateVMCmd, cmdInfo: > {"virtualmachineid":"774154f9-b2fa-4c0a-b71c-68e977a74e75","response":"json" > ,"ctxUserId":"2","httpmethod":"GET","ctxStartEventId":"110070","ctxDetails": > "{\"interface > com.cloud.storage.StoragePool\":\"d6f36aa9-eadd-4019-b556-5c454c653810\",\"i > nterface > com.cloud.vm.VirtualMachine\":\"774154f9-b2fa-4c0a-b71c-68e977a74e75\"}","ct > xAccountId":"2","cmdEventType":"VM.MIGRATE","storageid":"d6f36aa9-eadd-4019- > b556-5c454c653810","_":"1593324216076"}, cmdVersion: 0, status: IN_PROGRESS, > processStatus: 0, resultCode: 0, result: null, initMsid: 345048618557, > completeMsid: null, lastUpdated: null, lastPolled: null, created: null, > removed: null} > > 2020-06-28 02:15:23,134 DEBUG [c.c.a.ApiServlet] > (qtp464887938-24:ctx-d702500a ctx-ff7ed9a8) (logid:ca91d1eb) ===END=== > 79.114.109.121 -- GET > command=migrateVirtualMachine&storageid=d6f36aa9-eadd-4019-b556-5c454c653810 > &virtualmachineid=774154f9-b2fa-4c0a-b71c-68e977a74e75&response=json&_=15933 > 24216076 > > 2020-06-28 02:15:23,134 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375) (logid:869d4155) Executing > AsyncJobVO {id:50375, userId: 2, accountId: 2, instanceType: None, > instanceId: null, cmd: > org.apache.cloudstack.api.command.admin.vm.MigrateVMCmd, cmdInfo: > {"virtualmachineid":"774154f9-b2fa-4c0a-b71c-68e977a74e75","response":"json" > ,"ctxUserId":"2","httpmethod":"GET","ctxStartEventId":"110070","ctxDetails": > "{\"interface > com.cloud.storage.StoragePool\":\"d6f36aa9-eadd-4019-b556-5c454c653810\",\"i > nterface > com.cloud.vm.VirtualMachine\":\"774154f9-b2fa-4c0a-b71c-68e977a74e75\"}","ct > xAccountId":"2","cmdEventType":"VM.MIGRATE","storageid":"d6f36aa9-eadd-4019- > b556-5c454c653810","_":"1593324216076"}, cmdVersion: 0, status: IN_PROGRESS, > processStatus: 0, resultCode: 0, result: null, initMsid: 345048618557, > completeMsid: null, lastUpdated: null, lastPolled: null, created: null, > removed: null} > > 2020-06-28 02:15:23,147 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) (logid:869d4155) > Sync job-50376 execution on object VmWorkJobQueue.3483 > > 2020-06-28 02:15:23,565 DEBUG 
[o.a.c.f.j.i.AsyncJobManagerImpl] > (AsyncJobMgr-Heartbeat-1:ctx-b96f4e30) (logid:d7dd57ef) Execute sync-queue > item: SyncQueueItemVO {id:60, queueId: 29428, contentType: AsyncJob, > contentId: 50376, lastProcessMsid: 345048618557, lastprocessNumber: 6, > lastProcessTime: Sun Jun 28 02:15:23 EDT 2020, created: Sun Jun 28 02:15:23 > EDT 2020} > > 2020-06-28 02:15:23,566 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (AsyncJobMgr-Heartbeat-1:ctx-b96f4e30) (logid:d7dd57ef) Schedule queued > job-50376 > > 2020-06-28 02:15:23,568 INFO [o.a.c.f.j.i.AsyncJobMonitor] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:1bb04314) Add > job-50376 into job monitoring > > 2020-06-28 02:15:23,571 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:869d4155) > Executing AsyncJobVO {id:50376, userId: 2, accountId: 2, instanceType: null, > instanceId: null, cmd: com.cloud.vm.VmWorkStorageMigration, cmdInfo: > rO0ABXNyACNjb20uY2xvdWQudm0uVm1Xb3JrU3RvcmFnZU1pZ3JhdGlvboeRn3LBtueeAgABTAAK > ZGVzdFBvb2xJZHQAEExqYXZhL2xhbmcvTG9uZzt4cgATY29tLmNsb3VkLnZtLlZtV29ya5-Ztlbw > JWdrAgAESgAJYWNjb3VudElkSgAGdXNlcklkSgAEdm1JZEwAC2hhbmRsZXJOYW1ldAASTGphdmEv > bGFuZy9TdHJpbmc7eHAAAAAAAAAAAgAAAAAAAAACAAAAAAAADZt0ABlWaXJ0dWFsTWFjaGluZU1h > bmFnZXJJbXBsc3IADmphdmEubGFuZy5Mb25nO4vkkMyPI98CAAFKAAV2YWx1ZXhyABBqYXZhLmxh > bmcuTnVtYmVyhqyVHQuU4IsCAAB4cAAAAAAAAAAL, cmdVersion: 0, status: > IN_PROGRESS, processStatus: 0, resultCode: 0, result: null, initMsid: > 345048618557, completeMsid: null, lastUpdated: null, lastPolled: null, > created: Sun Jun 28 02:15:23 EDT 2020, removed: null} > > 2020-06-28 02:15:23,572 DEBUG [c.c.v.VmWorkJobDispatcher] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:869d4155) Run > VM work job: com.cloud.vm.VmWorkStorageMigration for VM 3483, job origin: > 50375 > > 2020-06-28 02:15:23,573 DEBUG [c.c.v.VmWorkJobHandlerProxy] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Execute VM work job: > com.cloud.vm.VmWorkStorageMigration{"destPoolId":11,"userId":2,"accountId":2 > ,"vmId":3483,"handlerName":"VirtualMachineManagerImpl"} > > 2020-06-28 02:15:23,582 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) VM state transitted from :Stopped to Migrating with event: > StorageMigrationRequestedvm's original host id: 12 new host id: null host id > before state transition: null > > 2020-06-28 02:15:23,583 DEBUG [c.c.v.VirtualMachineManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Offline migration of VMware vm i-2-3483-VM with volumes > > 2020-06-28 02:15:23,586 DEBUG [c.c.v.VirtualMachineManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) host id is null, using last host id 12 > > 2020-06-28 02:15:23,588 DEBUG [c.c.a.t.Request] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Seq 12-4809281452078284077: Sending { Cmd , MgmtId: > 345048618557, via: 12(ns515431.ip-xxx-xxx-174.net), Ver: v1, Flags: 100111, > [{"com.cloud.agent.api.MigrateVmToPoolCommand":{"volumes":[{"id":5864,"name" > :"ROOT-3483","path":"ROOT-3483","size":10737418240,"type":"ROOT","storagePoo > lType":"VMFS","storagePoolUuid":"d6f36aa9-eadd-4019-b556-5c454c653810","devi > ceId":0,"chainInfo":"{\"diskDeviceBusName\":\"scsi0:0\",\"diskChain\":[\"[da > tastore1 (8)] > 
i-2-3483-VM/ROOT-3483.vmdk\"]}"}],"vmName":"i-2-3483-VM","destinationPool":" > d6f36aa9-eadd-4019-b556-5c454c653810","executeInSequence":true,"wait":0}}] } > > 2020-06-28 02:15:23,588 DEBUG [c.c.a.t.Request] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Seq 12-4809281452078284077: Executing: { Cmd , MgmtId: > 345048618557, via: 12(ns515431.ip-xxx-xxx-174.net), Ver: v1, Flags: 100111, > [{"com.cloud.agent.api.MigrateVmToPoolCommand":{"volumes":[{"id":5864,"name" > :"ROOT-3483","path":"ROOT-3483","size":10737418240,"type":"ROOT","storagePoo > lType":"VMFS","storagePoolUuid":"d6f36aa9-eadd-4019-b556-5c454c653810","devi > ceId":0,"chainInfo":"{\"diskDeviceBusName\":\"scsi0:0\",\"diskChain\":[\"[da > tastore1 (8)] > i-2-3483-VM/ROOT-3483.vmdk\"]}"}],"vmName":"i-2-3483-VM","destinationPool":" > d6f36aa9-eadd-4019-b556-5c454c653810","executeInSequence":true,"wait":0}}] } > > 2020-06-28 02:15:23,588 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-38:ctx-30bb3c08) (logid:41b99820) Seq 12-4809281452078284077: > Executing request > > 2020-06-28 02:15:23,588 INFO [c.c.h.v.r.VmwareResource] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) excuting > MigrateVmToPoolCommand i-2-3483-VM -> d6f36aa9-eadd-4019-b556-5c454c653810 > > 2020-06-28 02:15:23,588 DEBUG [c.c.h.v.r.VmwareResource] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) > MigrateVmToPoolCommand: > {"volumes":[{"id":5864,"name":"ROOT-3483","path":"ROOT-3483","size":10737418 > 240,"type":"ROOT","storagePoolType":"VMFS","storagePoolUuid":"d6f36aa9-eadd- > 4019-b556-5c454c653810","deviceId":0,"chainInfo":"{\"diskDeviceBusName\":\"s > csi0:0\",\"diskChain\":[\"[datastore1 (8)] > i-2-3483-VM/ROOT-3483.vmdk\"]}"}],"vmName":"i-2-3483-VM","destinationPool":" > d6f36aa9-eadd-4019-b556-5c454c653810","executeInSequence":true,"wait":0} > > 2020-06-28 02:15:23,635 DEBUG [c.c.h.v.r.VmwareResource] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) finding > datastore d6f36aa9-eadd-4019-b556-5c454c653810 > > 2020-06-28 02:15:23,651 DEBUG [c.c.h.v.r.VmwareResource] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) locating > disk for volume (5864) using path ROOT-3483 > > 2020-06-28 02:15:23,654 INFO [c.c.h.v.m.VirtualMachineMO] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) Look for > disk device info for volume : ROOT-3483.vmdk with base name: ROOT-3483 > > 2020-06-28 02:15:23,654 INFO [c.c.h.v.m.VirtualMachineMO] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) Test > against disk device, controller key: 1000, unit number: 0 > > 2020-06-28 02:15:23,654 INFO [c.c.h.v.m.VirtualMachineMO] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) Test > against disk backing : [datastore1 (8)] i-2-3483-VM/ROOT-3483.vmdk > > 2020-06-28 02:15:23,654 INFO [c.c.h.v.m.VirtualMachineMO] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) Disk > backing : [datastore1 (8)] i-2-3483-VM/ROOT-3483.vmdk matches 
==> scsi0:0 > > 2020-06-28 02:15:23,657 INFO [c.c.h.v.u.VmwareContext] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) > Connected, conn: > sun.net.www.protocol.https.DelegateHttpsURLConnection:https://vscsp01.xxxx.h > ost/folder/i-2-3483-VM/ROOT-3483.vmdk?dcPath=AMS-DC-01&dsName=datastore1+%28 > 8%29, retry: 0 > > 2020-06-28 02:15:23,732 DEBUG [c.c.a.m.AgentManagerImpl] > (AgentManager-Handler-6:null) (logid:) SeqA 106257-1083271: Processing Seq > 106257-1083271: { Cmd , MgmtId: -1, via: 106257, Ver: v1, Flags: 11, > [{"com.cloud.agent.api.ConsoleProxyLoadReportCommand":{"_proxyVmId":3341,"_l > oadInfo":"{\n \"connections\": []\n}","wait":0}}] } > > 2020-06-28 02:15:23,734 DEBUG [c.c.a.m.AgentManagerImpl] > (AgentManager-Handler-6:null) (logid:) SeqA 106257-1083271: Sending Seq > 106257-1083271: { Ans: , MgmtId: 345048618557, via: 106257, Ver: v1, Flags: > 100010, > [{"com.cloud.agent.api.AgentControlAnswer":{"result":true,"wait":0}}] } > > 2020-06-28 02:15:24,429 DEBUG [c.c.s.StatsCollector] > (StatsCollector-4:ctx-7be2bc32) (logid:cbc7639e) HostStatsCollector is > running... > > 2020-06-28 02:15:24,434 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-309:ctx-50d489ef) (logid:82f6a95d) Seq 4-3803008410337695073: > Executing request > > 2020-06-28 02:15:24,458 DEBUG [c.c.h.v.r.VmwareResource] > (DirectAgent-38:ctx-30bb3c08 ns515431.ip-xxx-xxx-174.net, > job-50375/job-50376, cmd: MigrateVmToPoolCommand) (logid:869d4155) > Successfully consolidated disks of VM i-2-3483-VM. > > 2020-06-28 02:15:24,465 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-38:ctx-30bb3c08) (logid:869d4155) Seq 12-4809281452078284077: > Response Received: > > 2020-06-28 02:15:24,465 DEBUG [c.c.a.t.Request] > (DirectAgent-38:ctx-30bb3c08) (logid:869d4155) Seq 12-4809281452078284077: > Processing: { Ans: , MgmtId: 345048618557, via: > 12(ns515431.ip-xxx-xxx-174.net), Ver: v1, Flags: 110, > [{"com.cloud.agent.api.MigrateVmToPoolAnswer":{"volumeTos":[{"path":"ROOT-34 > 83","accountId":0,"chainInfo":"","id":5864,"directDownload":false}],"result" > :true,"wait":0}}] } > > 2020-06-28 02:15:24,465 DEBUG [c.c.a.m.AgentAttache] > (DirectAgent-38:ctx-30bb3c08) (logid:869d4155) Seq 12-4809281452078284077: > No more commands found > > 2020-06-28 02:15:24,465 DEBUG [c.c.a.t.Request] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Seq 12-4809281452078284077: Received: { Ans: , MgmtId: > 345048618557, via: 12(ns515431.ip-xxx-xxx-174.net), Ver: v1, Flags: 110, { > MigrateVmToPoolAnswer } } > > 2020-06-28 02:15:24,465 DEBUG [c.c.v.VirtualMachineManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) cleaning up after hypervisor pool migration volumes for VM > i-2-3483-VM(774154f9-b2fa-4c0a-b71c-68e977a74e75) to pool > null(d6f36aa9-eadd-4019-b556-5c454c653810) > > 2020-06-28 02:15:24,466 DEBUG [c.c.v.VirtualMachineManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) found 1 volumes for VM > i-2-3483-VM(uuid:774154f9-b2fa-4c0a-b71c-68e977a74e75, id:3483) > > 2020-06-28 02:15:24,466 DEBUG [c.c.v.VirtualMachineManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) updating volume (5864) with path 'ROOT-3483' on pool '11' > > 2020-06-28 02:15:24,467 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-309:ctx-50d489ef) (logid:cbc7639e) Seq 4-3803008410337695073: > Response 
Received: > > 2020-06-28 02:15:24,467 DEBUG [c.c.a.t.Request] > (StatsCollector-4:ctx-7be2bc32) (logid:cbc7639e) Seq 4-3803008410337695073: > Received: { Ans: , MgmtId: 345048618557, via: > 4(ns521637.ip-xxx-xx-119.net), Ver: v1, Flags: 10, { GetHostStatsAnswer } } > > 2020-06-28 02:15:24,470 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-297:ctx-e2f5bd9e) (logid:ee57070f) Seq 6-176484810397596966: > Executing request > > 2020-06-28 02:15:24,471 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) VM state transitted from :Migrating to Stopped with event: > AgentReportStoppedvm's original host id: 12 new host id: null host id before > state transition: null > > 2020-06-28 02:15:24,476 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Hosts's actual total CPU: 29592 and CPU after applying > overprovisioning: 59184 > > 2020-06-28 02:15:24,476 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Hosts's actual total RAM: 34323972096 and RAM after > applying overprovisioning: 68647944192 > > 2020-06-28 02:15:24,476 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) release cpu from host: 12, old used: 42700,reserved: 333, > actual total: 29592, total with overprovisioning: 59184; new used: > 42200,reserved:333; movedfromreserved: false,moveToReserveredfalse > > 2020-06-28 02:15:24,476 DEBUG [c.c.c.CapacityManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) release mem from host: 12, old used: 49392123904,reserved: > 357913952, total: 68647944192; new used: 48855252992,reserved:357913952; > movedfromreserved: false,moveToReserveredfalse > > 2020-06-28 02:15:24,484 DEBUG [c.c.v.VmWorkJobHandlerProxy] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Done executing VM work job: > com.cloud.vm.VmWorkStorageMigration{"destPoolId":11,"userId":2,"accountId":2 > ,"vmId":3483,"handlerName":"VirtualMachineManagerImpl"} > > 2020-06-28 02:15:24,484 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Complete async job-50376, jobStatus: SUCCEEDED, resultCode: > 0, result: null > > 2020-06-28 02:15:24,485 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Publish async job-50376 complete on message bus > > 2020-06-28 02:15:24,485 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Wake up jobs related to job-50376 > > 2020-06-28 02:15:24,485 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Update db status for job-50376 > > 2020-06-28 02:15:24,485 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376 ctx-9e7ed995) > (logid:869d4155) Wake up jobs joined with job-50376 and disjoin all subjobs > created from job- 50376 > > 2020-06-28 02:15:24,489 DEBUG [c.c.v.VmWorkJobDispatcher] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:869d4155) > Done with run of VM work job: com.cloud.vm.VmWorkStorageMigration for VM > 3483, job origin: 50375 > > 2020-06-28 02:15:24,489 
DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:869d4155) > Done executing com.cloud.vm.VmWorkStorageMigration for job-50376 > > 2020-06-28 02:15:24,490 INFO [o.a.c.f.j.i.AsyncJobMonitor] > (Work-Job-Executor-31:ctx-6bb2bd88 job-50375/job-50376) (logid:869d4155) > Remove job-50376 from job monitoring > > 2020-06-28 02:15:24,497 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-297:ctx-e2f5bd9e) (logid:cbc7639e) Seq 6-176484810397596966: > Response Received: > > 2020-06-28 02:15:24,497 DEBUG [c.c.a.t.Request] > (StatsCollector-4:ctx-7be2bc32) (logid:cbc7639e) Seq 6-176484810397596966: > Received: { Ans: , MgmtId: 345048618557, via: > 6(ns515155.ip-xxx-xxx-174.net), Ver: v1, Flags: 10, { GetHostStatsAnswer } } > > 2020-06-28 02:15:24,499 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) (logid:869d4155) > Complete async job-50375, jobStatus: SUCCEEDED, resultCode: 0, result: > org.apache.cloudstack.api.response.UserVmResponse/virtualmachine/{"id":"7741 > 54f9-b2fa-4c0a-b71c-68e977a74e75","name":"testpass3","displayname":"testpass > 3","account":"admin","userid":"1727541a-d492-11e5-86c3-000c298715c8","userna > me":"admin","domainid":"d7e430ac-d491-11e5-86c3-000c298715c8","domain":"ROOT > ","created":"2020-06-19T16:25:22-0400","state":"Stopped","haenable":false,"z > oneid":"43585c50-1d09-4163-89e3-fb736f77a893","zonename":"ZONE001","template > id":"13522fcd-1280-4c87-89c1-2f1cfc98a99a","templatename":"CentOS-7-Minimal" > ,"templatedisplaytext":"CentOS-7-Minimal","passwordenabled":true,"serviceoff > eringid":"84ae80de-3c6b-4d36-931e-638c6ff007f0","serviceofferingname":"Small > Instance","cpunumber":1,"cpuspeed":500,"memory":512,"cpuused":"0.97%","netwo > rkkbsread":69267,"networkkbswrite":0,"diskkbsread":1338,"diskkbswrite":329," > memorykbs":524288,"memoryintfreekbs":35840,"memorytargetkbs":524288,"diskior > ead":4,"diskiowrite":0,"guestosid":"0c044dd6-d492-11e5-86c3-000c298715c8","r > ootdeviceid":0,"rootdevicetype":"ROOT","securitygroup":[],"nic":[{"id":"7d4b > 66b6-88d7-490d-baed-8b08ac66e2e3","networkid":"5a72dcaf-2d95-44c9-a08a-479df > f527bdd","networkname":"defaultGuestNetwork","netmask":"255.255.255.192","ga > teway":"192.xx.xx.126","ipaddress":"192.xx.xxx.84","broadcasturi":"vlan://un > tagged","traffictype":"Guest","type":"Shared","isdefault":true,"macaddress": > "1e:00:71:00:00:7c","secondaryip":[],"extradhcpoption":[]}],"hypervisor":"VM > ware","instancename":"i-2-3483-VM","details":{"keyboard":"us","dataDiskContr > oller":"osdefault","memoryOvercommitRatio":"3.0","Message.ReservedCapacityFr > eed.Flag":"false","cpuOvercommitRatio":"3.0","nicAdapter":"E1000","rootDiskC > ontroller":"scsi"},"affinitygroup":[],"displayvm":true,"isdynamicallyscalabl > e":false,"ostypeid":"0c044dd6-d492-11e5-86c3-000c298715c8","tags":[]} > > 2020-06-28 02:15:24,499 DEBUG [c.c.a.m.DirectAgentAttache] > (DirectAgent-331:ctx-9f06f713) (logid:e4f3d58e) Seq 7-758575062235233578: > Executing request > > 2020-06-28 02:15:24,500 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) (logid:869d4155) > Publish async job-50375 complete on message bus > > 2020-06-28 02:15:24,500 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) (logid:869d4155) > Wake up jobs related to job-50375 > > 2020-06-28 02:15:24,500 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) 
(logid:869d4155) > Update db status for job-50375 > > 2020-06-28 02:15:24,500 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375 ctx-cd586069) (logid:869d4155) > Wake up jobs joined with job-50375 and disjoin all subjobs created from job- > 50375 > > 2020-06-28 02:15:24,502 DEBUG [o.a.c.f.j.i.AsyncJobManagerImpl] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375) (logid:869d4155) Done executing > org.apache.cloudstack.api.command.admin.vm.MigrateVMCmd for job-50375 > > 2020-06-28 02:15:24,502 INFO [o.a.c.f.j.i.AsyncJobMonitor] > (API-Job-Executor-32:ctx-b3cda3c2 job-50375) (logid:869d4155) Remove > job-50375 from job monitoring > > > > From my point of view this looks like migration for shared storage.. > > > > Best regards, > > Cristian > >
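For reference, job-50375 in the quoted log was produced by the migrateVirtualMachine API call with virtualmachineid and storageid. Below is a minimal Python sketch of that signed call using the standard CloudStack request-signing recipe; the endpoint, API key and secret are placeholders, only the two UUIDs are the ones from the log.

    # Rough sketch of the API call behind job-50375: migrateVirtualMachine with
    # storageid, which triggers the offline storage migration seen in the log.
    # Endpoint and keys are placeholders, not values from this thread.
    import base64, hashlib, hmac, urllib.parse, urllib.request

    ENDPOINT = "https://cloudstack.example.com/client/api"   # placeholder endpoint
    API_KEY, SECRET = "your-api-key", "your-secret-key"      # placeholder credentials

    def call(command, **params):
        params.update(command=command, apiKey=API_KEY, response="json")
        # CloudStack signing: sort params by name, URL-encode the values,
        # lowercase the whole string, HMAC-SHA1 with the secret key, base64.
        query = "&".join(f"{k}={urllib.parse.quote(str(v), safe='')}"
                         for k, v in sorted(params.items()))
        sig = base64.b64encode(
            hmac.new(SECRET.encode(), query.lower().encode(), hashlib.sha1).digest()
        ).decode()
        url = f"{ENDPOINT}?{query}&signature={urllib.parse.quote(sig, safe='')}"
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode()

    # Offline storage migration of the VM and target pool from the quoted log.
    print(call("migrateVirtualMachine",
               virtualmachineid="774154f9-b2fa-4c0a-b71c-68e977a74e75",
               storageid="d6f36aa9-eadd-4019-b556-5c454c653810"))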