Hi Benjamin, no real experience here with CentOS 7 vs KVM...
What I do remember (though that is VXLAN related): with kernel 4.0 and onwards (we are on Ubuntu), if you try to provision more VXLAN interfaces than igmp_max_memberships allows ( /proc/sys/net/ipv4/igmp_max_memberships ), the interfaces are created but they will not come UP, and a proper CloudStack message is shown in the logs (the message the OS returns when you try to bring the interface UP; we use ACS 4.8) - so it is easy to spot the issue and the failed actions (though the VM itself starts fine :) ). (A quick way to check and work around this is sketched at the very end of this thread.)

Again, you should be able to examine the agent logs and see which script is failing... I expect either the UP action is failing or it is not even attempted (that would perhaps be a bug in ACS, but I don't expect that...)

Cheers

On 24 May 2018 at 16:03, Adam Witwicki <[email protected]> wrote:
> Hi Ben
>
> "Now I moved back to CentOS 7 and have another problem: all VLAN bridges
> that are created by cloudstack-agent (not manually) stay down. If I ifup
> the VLAN bridges manually, everything works fine. Does someone have an idea
> how I can force CentOS to automatically bring up new interface devices?"
>
> I had this issue with 4.9 and ended up writing a dirty script to check for
> down bridges every minute via cron and ifup them:
>
> iface=`/usr/sbin/ip add | grep -E "brbond.*DOWN" | awk -F ':' '{print $2}'`
> echo "$iface" | while read -r a; do /usr/sbin/ip link set dev $a up; done
>
> Thanks
>
> Adam
>
> -----Original Message-----
> From: Benjamin Naber <[email protected]>
> Sent: 24 May 2018 14:37
> To: [email protected]
> Subject: Re: KVM Problem by deploying VPC
>
> Hi Andrija,
>
> Thanks for the reply. I solved that error, but now I have another one. The
> problem was that it failed without a bonding device. I created a bonding
> device (bond0) with a single NIC and now it works without problems on
> Ubuntu.
>
> Example (didn't work):
>
> auto enp8s0f0
> iface enp8s0f0 inet manual
>
> auto cloudbr0
> iface cloudbr0 inet static
> address 10.253.250.230
> gateway 10.253.250.1
> netmask 255.255.255.0
> dns-nameservers 8.8.8.8 8.8.4.4
> bridge_ports enp8s0f0
> bridge_fd 5
> bridge_stp off
> bridge_maxwait 1
>
> Example (works fine):
>
> auto enp8s0f0
> iface enp8s0f0 inet manual
> bond-master bond0
>
> auto bond0
> iface bond0 inet manual
> bond-mode active-backup
> bond-miimon 100
> bond-slaves none
>
> auto cloudbr0
> iface cloudbr0 inet static
> address 10.253.250.230
> gateway 10.253.250.1
> netmask 255.255.255.0
> dns-nameservers 8.8.8.8 8.8.4.4
> bridge_ports bond0
> bridge_fd 5
> bridge_stp off
> bridge_maxwait 1
>
> Now I moved back to CentOS 7 and have another problem: all VLAN bridges
> that are created by cloudstack-agent (not manually) stay down. If I ifup
> the VLAN bridges manually, everything works fine. Does someone have an idea
> how I can force CentOS to automatically bring up new interface devices?
>
> Kind regards
>
> Ben
>
>
> Andrija Panic <[email protected]> wrote on 24 May 2018 at 00:37:
> >
> > Hi Ben,
> >
> > the interesting part seems to be:
> > 2018-05-23 12:59:47,213 DEBUG [kvm.resource.LibvirtComputingResource]
> > (agentRequest-Handler-5:null) (logid:ef8b353e) getting broadcast uri for
> > pif enp8s0f0.233 and bridge brenp8s0f0-233
> > 2018-05-23 12:59:47,213 DEBUG [resource.virtualnetwork.
> VirtualRoutingResource] > > (agentRequest-Handler-5:null) (logid:ef8b353e) Transforming > > com.cloud.agent.api.routing.IpAssocVpcCommand to ConfigItems > > 2018-05-23 12:59:47,581 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > FileConfigItem, > > copying 257 characters to ip_associations.json took 340ms > > 2018-05-23 12:59:47,582 DEBUG [kvm.resource. > LibvirtComputingResource] > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > /usr/share/cloudstack-common/scripts/network/domr/router_proxy.sh > > update_config.py 169.254.2.247 ip_associations.json > > 2018-05-23 12:59:47,766 DEBUG [kvm.resource. > LibvirtComputingResource] > > (agentRequest-Handler-5:null) (logid:ef8b353e) Exit value is 1 > > > > What I believe we see here, is that IP association fails for some > reason > > (exit value is 1) and after that ACS will just stop the VM and do the > > cleanup (all log lines after this one)... > > > > Can you check your broadcast URI in the DB ? ACtually... I see it > vlan 223 > > > > * I did have some issues in past releases, when we wanted to use > untagged > > vlans for Public network, but it seems not to be case here... > > > > Not sure, if it's possible that you also SSH to the VR during this > creation > > process, in order to collect logs from inside the VR - before Qemu > destroys > > the VR? > > > > ssh -p 3922 -i .ssh/id_rsa.cloud root@ 169.254.2.247 and then try > to fetch > > (SCP or something) whole /var/log/ folder to the localhost - from > there, > > there is cloud.log and auth.log where most of the commands outputs > are > > located (success or failure) > > > > or something like rsync -av -e "ssh -p 3992 -i .ssh.id_rsa.cloud" > > root@169....... > > /local/dir/ > > and keep repeating until you fetch whatever data to be able to > analyze it > > > > Perhaps someone else will have better suggestion... > > > > Best > > Andrija > > > > On 23 May 2018 at 13:08, Benjamin Naber < > [email protected]> > > wrote: > > > > > > > > > Hi Andrija, > > > > > > first of all thanks for your reply. I have now testet the > setup on a > > > Ubuntu Xenial. Same issu with Default VPC Offering. Redundant > VPC offering > > > works also without any problems. Same as on CentOS. 
> > > > > > See delow the debug log (censored public Ips): > > > > > > 2018-05-23 12:58:05,161 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-2:null) (logid:0806b407) Proccess agent > ready > > > command, agent id = 15 > > > 2018-05-23 12:58:05,161 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-2:null) (logid:0806b407) Set agent id 15 > > > 2018-05-23 12:58:05,162 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-2:null) (logid:0806b407) Ready command > is processed: > > > agent id = 15 > > > 2018-05-23 12:58:05,162 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-2:null) (logid:0806b407) Seq > 15-5506494969390563335: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 110, > > > [{"com.cloud.agent.api.ReadyAnswer":{"result":true,"wait":0}}] > } > > > 2018-05-23 12:58:05,292 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Request:Seq > > > 15-5506494969390563336: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100111, [{"com.cloud.agent.api.ReadyCo > > > mmand":{"dcId":1,"hostId":15,"wait":0}}] } > > > 2018-05-23 12:58:05,292 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Processing > command: > > > com.cloud.agent.api.ReadyCommand > > > 2018-05-23 12:58:05,292 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Proccess agent > ready > > > command, agent id = 15 > > > 2018-05-23 12:58:05,292 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Set agent id 15 > > > 2018-05-23 12:58:05,293 INFO [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Ready command > is processed: > > > agent id = 15 > > > 2018-05-23 12:58:05,294 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:0806b407) Seq > 15-5506494969390563336: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 110, > > > [{"com.cloud.agent.api.ReadyAnswer":{"result":true,"wait":0}}] > } > > > 2018-05-23 12:58:43,019 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Request:Seq > > > 15-5506494969390563337: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100111, [{"org.apache.cloudstack. > storage.command.CopyCommand":{" > > > srcTO":{"org.apache.cloudstack.storage.to.TemplateObjectTO": > > > {"path":"6129c76d-67e5-4454-9bc9-4e9a5a152610","origUrl":" > > > http://packages.shapeblue.com.s3-eu-west-1.amazonaws.com/sys > > > temvmtemplate/4.10/dnsmasq/systemvm64template-4.10-kvm. > qcow2.bz2 > > > ","uuid":"6129c76d-67e5-4454-9bc9-4e9a5a152610","id": > > > 3,"format":"QCOW2","accountId":1,"checksum":"bc2eac46f16a2ec > > > e6c19d4b89db41de3","hvm":false,"displayText":"SystemVM > Template > > > (KVM)","imageDataStore":{"org.apache.cloudstack.storage.to.P > > > rimaryDataStoreTO":{"uuid":"2258aa76-7813-354d-b274-961fb337 > > > e716","id":14,"poolType":"RBD","host":"ceph-mon","path":" > > > rbd","port":6789,"url":"RBD://ceph-mon/rbd/?ROLE=Primary& > > > STOREUUID=2258aa76-7813-354d-b274-961fb337e716","isManaged" > > > :false}},"name":"routing-3","hypervisorType":"KVM"}},"destTO > > > ":{"org.apache.cloudstack.storage.to.VolumeObjectTO":{" > > > uuid":"fa3dba5d-364b-43ce-9ebf-d15a3324f765"," > > > volumeType":"ROOT","dataStore":{"org.apache.cloudstack. 
> storage.to > > > .PrimaryDataStoreTO":{"uuid":"2258aa76-7813-354d- > > > b274-961fb337e716","id":14,"poolType":"RBD","host":"ceph-mon > > > ","path":"rbd","port":6789,"url":"RBD://ceph-mon/rbd/?ROLE > > > =Primary&STOREUUID=2258aa76-7813-354d-b274-961fb337e716"," > > > isManaged":false}},"name":"ROOT-37","size":349945344," > > > volumeId":37,"vmName":"r-37-VM","accountId":2,"format":" > > > QCOW2","provisioningType":"THIN","id":37,"deviceId":0,"hy > > > pervisorType":"KVM"}},"executeInSequence":true," > options":{},"options2":{},"wait":0}}] > > > } > > > 2018-05-23 12:58:43,020 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Processing > command: > > > org.apache.cloudstack.storage.command.CopyCommand > > > 2018-05-23 12:58:43,021 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:58:43,021 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:58:43,032 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22039007177 Available: 1426988417024 > > > 2018-05-23 12:58:43,032 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:58:43,032 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:58:43,041 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22039007177 Available: 1426988417024 > > > 2018-05-23 12:58:43,223 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Creating volume > > > fa3dba5d-364b-43ce-9ebf-d15a3324f765 from template > > > 6129c76d-67e5-4454-9bc9-4e9a5a152610 in pool > > > 2258aa76-7813-354d-b274-961fb337e716 (RBD) with size 349945344 > > > 2018-05-23 12:58:43,225 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Trying to > perform a RBD > > > clone (layering) since we are operating in the same storage > pool > > > 2018-05-23 12:58:43,270 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Succesfully > connected to > > > Ceph cluster at ceph-mon:6789 > > > 2018-05-23 12:58:43,321 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) The source image > > > rbd/6129c76d-67e5-4454-9bc9-4e9a5a152610 is RBD format 2. We > will perform > > > a RBD clone using snapshot cloudstack-base-snap > > > 2018-05-23 12:58:43,321 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Checking if RBD > snapshot > > > rbd/6129c76d-67e5-4454-9bc9-4e9a5a152610@cloudstack-base-snap > exists > > > prior to attempting a clone operation. > > > 2018-05-23 12:58:43,343 DEBUG [kvm.storage. 
> LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Found 1 > snapshots on RBD > > > image rbd/6129c76d-67e5-4454-9bc9-4e9a5a152610 > > > 2018-05-23 12:58:43,344 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) RBD snapshot > > > rbd/6129c76d-67e5-4454-9bc9-4e9a5a152610@cloudstack-base-snap > already > > > exists. > > > 2018-05-23 12:58:43,468 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Succesfully > cloned > > > 6129c76d-67e5-4454-9bc9-4e9a5a152610@cloudstack-base-snap to > > > fa3dba5d-364b-43ce-9ebf-d15a3324f765 > > > 2018-05-23 12:58:43,474 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Seq > 15-5506494969390563337: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 110, > > > [{"org.apache.cloudstack.storage.command.CopyCmdAnswer":{" > > > newData":{"org.apache.cloudstack.storage.to.VolumeObjectTO": > > > {"size":349945344,"path":"fa3dba5d-364b-43ce-9ebf- > > > d15a3324f765","accountId":0,"format":"RAW","id":0}}," > result":true,"wait":0}}] > > > } > > > 2018-05-23 12:58:43,625 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Request:Seq > > > 15-5506494969390563338: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100111, [{"com.cloud.agent.api.StartCo > > > mmand":{"vm":{"id":37,"name":"r-37-VM","type":"DomainRouter" > > > ,"cpus":1,"minSpeed":500,"maxSpeed":500,"minRam": > > > 268435456,"maxRam":268435456,"arch":"x86_64","os":"Debian > GNU/Linux 5.0 > > > (64-bit)","platformEmulator":"Debian GNU/Linux > 5","bootArgs":" vpccidr= > > > 10.59.0.0/21 domain=test.internal dns1=8.8.8.8 > privategateway=None > > > template=domP name=r-37-VM eth0ip=169.254.2.247 > eth0mask=255.255.0.0 > > > type=vpcrouter disable_rp_filter=true > baremetalnotificationsecurityk > > > ey=Gs_GstLMdaO8CWiW_-wN71Xu0RhIGWdJ27Y_Q_ > lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILqRjLA > > > baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e95U03 > > > mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ > host=172.16.11.56 > > > port=8080","enableHA":true,"limitCpuUse":false,"enableDynami > > > callyScaleVm":false,"vncPassword":"-Wf7MGedC0usYmBc > > > Cscv7A","params":{},"uuid":"87a45b1c-bf35-431e-82a0- > > > c0c961feab05","disks":[{"data":{"org.apache.cloudstack. > storage.to > > > .VolumeObjectTO":{"uuid":"fa3dba5d-364b-43ce-9ebf- > d15a3324f765"," > > > volumeType":"ROOT","dataStore":{"org.apache.cloudstack. 
> storage.to > > > .PrimaryDataStoreTO":{"uuid":"2258aa76-7813-354d- > > > b274-961fb337e716","id":14,"poolType":"RBD","host":"ceph-mon > > > ","path":"rbd","port":6789,"url":"RBD://ceph-mon/rbd/?ROLE > > > =Primary&STOREUUID=2258aa76-7813-354d-b274-961fb337e716"," > > > isManaged":false}},"name":"ROOT-37","size":349945344," > > > path":"fa3dba5d-364b-43ce-9ebf-d15a3324f765","volumeId": > > > 37,"vmName":"r-37-VM","accountId":2,"format":"RAW","p > > > rovisioningType":"THIN","id":37,"deviceId":0,"bytesReadRate" > > > :0,"bytesWriteRate":0,"iopsReadRate":0,"iopsWriteRate":0,"hy > > > pervisorType":"KVM"}},"diskSeq":0,"path":"fa3dba5d-364b- > > > 43ce-9ebf-d15a3324f765","type":"ROOT","_details":{"storageHo > > > st":"ceph-mon","managed":"false","storagePort":"6789"," > > > volumeSize":"349945344"}}],"nics":[{"deviceId":0," > networkRateMbps":-1," > > > defaultNic":false,"pxeDisable":true,"nicUuid":"226f4319- > > > b15c-4e48-8a74-6a28b574a992","uuid":"76d57f99-f4af-42b8- > > > af0a-41f0a2cd0ba5","ip":"169.254.2.247","netmask":"255.255. > > > 0.0","gateway":"169.254.0.1","mac":"0e:00:a9:fe:02:f7","broa > > > dcastType":"LinkLocal","type":"Control","isSecurityGroupEnab > > > led":false}],"guestOsDetails":{}},"hostIp":"10.253.250.230", > > > "executeInSequence":false,"wait":0}},{"com.cloud.agent.api. > > > check.CheckSshCommand":{"ip":"169.254.2.247","port":3922," > > > interval":6,"retries":100,"name":"r-37-VM","wait":0}},{" > > > com.cloud.agent.api.GetDomRVersionCmd":{"accessDetails":{" > > > router.ip":"169.254.2.247","router.name":"r-37-VM"},"wait": > > > 0}},{"com.cloud.agent.api.PlugNicCommand":{"nic":{"deviceId" > > > :1,"networkRateMbps":200,"defaultNic":true,"pxeDisable": > > > true,"nicUuid":"29bd4c33-47dd-4357-aa3d-8c5ae56169b5","uuid" > > > :"787cf778-230f-4e17-a648-f63171bdc81c","ip":"82.119.177. > > > 188","netmask":"255.255.255.240","gateway":"82.119.177. > > > 177","mac":"1e:00:87:00:00:0d","broadcastType":"Vlan","type" > > > :"Public","broadcastUri":"vlan://233","isolationUri":"vl > > > an://233","isSecurityGroupEnabled":false},"instanceName":"r- > > > 37-VM","vmType":"DomainRouter","details":{},"wait":0}},{" > > > com.cloud.agent.api.routing.IpAssocVpcCommand":{"ipAddresses > > > ":[{"accountId":2,"publicIp":"82.X.X.X","sourceNat":true," > > > add":true,"oneToOneNat":false,"firstIP":true,"broadcastUri": > > > "vlan://233","vlanGateway":"82.119.X.X","vlanNetmask":" > 255.255.255.240"," > > > vifMacAddress":"1e:00:87:00:00:0d","networkRate":200," > > > trafficType":"Public","newNic":false}],"accessDetails":{" > > > router.guest.ip":"82.X.X.X","router.ip":"169.254.2.247"," > > > zone.network.type":"Advanced","router.name":"r-37-VM"}," > > > wait":0}},{"com.cloud.agent.api.routing.SetSourceNatComman > > > d":{"ipAddress":{"accountId":2,"publicIp":"82.X.X.X","sourc > > > eNat":true,"add":true,"oneToOneNat":false,"firstIP":true," > > > broadcastUri":"vlan://233","vlanGateway":"82.X.X.X"," > > > vlanNetmask":"255.255.255.240","vifMacAddress":"1e:00:87:00: > > > 00:0d","networkRate":200,"trafficType":"Public","newNic" > > > :false},"add":true,"accessDetails":{"router.ip":"169.254.2. 
> > > 247","zone.network.type":"Advanced","router.name":"r-37- > VM"},"wait":0}},{}] > > > } > > > 2018-05-23 12:58:43,625 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.StartCommand > > > 2018-05-23 12:58:43,647 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:58:43,648 DEBUG [resource.wrapper. > LibvirtStartCommandWrapper] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) remaining > mem36873936896 > > > 2018-05-23 12:58:43,649 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:58:43,649 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:58:43,657 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22039007177 Available: 1426988417024 > > > 2018-05-23 12:58:43,663 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not find > volume > > > fa3dba5d-364b-43ce-9ebf-d15a3324f765: Storage volume not > found: no > > > storage vol with matching name 'fa3dba5d-364b-43ce-9ebf- > d15a3324f765' > > > 2018-05-23 12:58:43,664 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Refreshing > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 > > > 2018-05-23 12:58:43,947 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Found volume > > > fa3dba5d-364b-43ce-9ebf-d15a3324f765 in storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 after refreshing the pool > > > 2018-05-23 12:58:44,204 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:58:44,204 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:58:44,211 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22039007177 Available: 1426979717120 > > > 2018-05-23 12:58:44,211 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) > > > nic=[Nic:Control-169.254.2.247-null] > > > 2018-05-23 12:58:44,212 DEBUG [utils.script.Script] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > /bin/bash -c ip > > > route | grep 169.254.0.0/16 > > > 2018-05-23 <http://169.254.0.0/162018-05-23> 12:58:44,248 > DEBUG > > > [utils.script.Script] (agentRequest-Handler-5:null) > (logid:ef8b353e) > > > Execution is successful. > > > 2018-05-23 12:58:44,251 DEBUG [resource.wrapper. 
> LibvirtStartCommandWrapper] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) starting > r-37-VM: <domain type='kvm'> > > > > > > <name>r-37-VM</name> > > > > > > <uuid>87a45b1c-bf35-431e-82a0-c0c961feab05</uuid> > > > > > > <description>Debian GNU/Linux 5.0 (64-bit)</description> > > > > > > <cpu></cpu><sysinfo type='smbios'> > > > > > > <system> > > > > > > <entry name='manufacturer'>Apache Software Foundation</entry> > > > > > > <entry name='product'>CloudStack KVM Hypervisor</entry> > > > > > > <entry name='uuid'>87a45b1c-bf35- > 431e-82a0-c0c961feab05</entry> > > > </system> > > > </sysinfo> > > > > > > <os> > > > > > > <type arch='x86_64' machine='pc'>hvm</type> > > > > > > <boot dev='cdrom'/> > > > > > > <boot dev='hd'/> > > > > > > <smbios mode='sysinfo'/> > > > </os> > > > > > > <on_reboot>restart</on_reboot> > > > > > > <on_poweroff>destroy</on_poweroff> > > > > > > <on_crash>destroy</on_crash> > > > > > > <memory>262144</memory> > > > > > > <devices> > > > > > > <memballoon model='none'/> > > > </devices> > > > > > > <vcpu>1</vcpu> > > > > > > <features> > > > > > > <pae/> > > > > > > <apic/> > > > > > > <acpi/> > > > </features> > > > > > > <cputune> > > > > > > <shares>500</shares> > > > </cputune> > > > > > > <clock offset='utc'> > > > > > > <timer name='kvmclock' > > > > </timer> > > > </clock> > > > > > > <devices> > > > > > > <emulator>/usr/bin/kvm-spice</emulator> > > > > > > <interface type='bridge'> > > > > > > <source bridge='cloud0'/> > > > > > > <mac address='0e:00:a9:fe:02:f7'/> > > > > > > <model type='virtio'/> > > > > > > <rom bar='off' file=''/></interface> > > > > > > <console type='pty'> > > > > > > <target port='0'/> > > > </console> > > > > > > <disk device='disk' type='network'> > > > > > > <driver name='qemu' type='raw' cache='none' /> > > > > > > <source protocol='rbd' name='rbd/fa3dba5d-364b-43ce- > 9ebf-d15a3324f765'> > > > > > > <host name='ceph-mon' port='6789'/> > > > </source> > > > > > > <auth username='admin'> > > > > > > <secret type='ceph' uuid='2258aa76-7813-354d-b274- > 961fb337e716'/> > > > </auth> > > > > > > <target dev='vda' bus='virtio'/> > > > > > > <serial>fa3dba5d364b43ce9ebf</serial></disk> > > > > > > <disk device='cdrom' type='file'> > > > > > > <driver name='qemu' type='raw' cache='none' /> > > > > > > <source file='/usr/share/cloudstack-common/vms/systemvm.iso'/> > > > > > > <target dev='hdc' bus='ide'/> > > > </disk> > > > > > > <serial type='pty'> > > > > > > <target port='0'/> > > > </serial> > > > > > > <graphics type='vnc' autoport='yes' listen='10.253.250.230' > passwd='-Wf7MGedC0usYmBcCscv7A'/> > > > > > > <channel type='unix'> > > > > > > <source mode='bind' path='/var/lib/libvirt/qemu/r- > 37-VM.agent'/> > > > > > > <address type='virtio-serial'/> > > > > > > <target type='virtio' name='r-37-VM.vport'/> > > > </channel> > > > > > > <channel type='unix'> > > > > > > <source mode='bind' path='/var/lib/libvirt/qemu/r- > 37-VM.org.qemu.guest_agent.0'/> > > > > > > <address type='virtio-serial'/> > > > > > > <target type='virtio' name='org.qemu.guest_agent.0'/> > > > </channel> > > > > > > <input type='tablet' bus='usb'/> > > > </devices> > > > </domain> > > > > > > 2018-05-23 12:58:46,460 DEBUG [kvm.resource. 
> LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/hypervisor/kvm/ > patchviasocket.py > > > -n r-37-VM -p %vpccidr=10.59.0.0/21%domain= > test.internal%dns1=8.8.8.8%priv > > > ategateway=None%template=domP%name=r-37-VM%eth0ip=169.254.2. > > > 247%eth0mask=255.255.0.0%type=vpcrouter%disable_rp_filter= > > > true%baremetalnotificationsecuritykey=Gs_GstLMdaO8CWiW_-wN71 > > > Xu0RhIGWdJ27Y_Q_lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILq > > > RjLA%baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e > > > 95U03mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ% > > > host=172.16.11.56%port=8080 > > > 2018-05-23 12:58:46,496 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:58:46,498 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:58:49,567 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:58:54,568 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:58:57,630 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:02,631 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Unable to logon > to > > > 169.254.2.247 > > > 2018-05-23 12:59:02,632 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/hypervisor/kvm/ > patchviasocket.py > > > -n r-37-VM -p %vpccidr=10.59.0.0/21%domain= > test.internal%dns1=8.8.8.8%priv > > > ategateway=None%template=domP%name=r-37-VM%eth0ip=169.254.2. > > > 247%eth0mask=255.255.0.0%type=vpcrouter%disable_rp_filter= > > > true%baremetalnotificationsecuritykey=Gs_GstLMdaO8CWiW_-wN71 > > > Xu0RhIGWdJ27Y_Q_lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILq > > > RjLA%baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e > > > 95U03mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ% > > > host=172.16.11.56%port=8080 > > > 2018-05-23 12:59:02,667 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:02,669 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:04,665 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (UgentTask-2:null) (logid:) Executing: > /usr/share/cloudstack-common/s > > > cripts/vm/network/security_group.py get_rule_logs_for_vms > > > 2018-05-23 12:59:04,731 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-1:null) (logid:80cd5d21) Processing > command: > > > com.cloud.agent.api.GetHostStatsCommand > > > 2018-05-23 12:59:04,770 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (UgentTask-2:null) (logid:) Execution is successful. 
> > > 2018-05-23 12:59:04,772 DEBUG [kvm.resource.LibvirtConnection] > > > (UgentTask-2:null) (logid:) Looking for libvirtd connection at: > > > qemu:///system > > > 2018-05-23 12:59:04,777 DEBUG [cloud.agent.Agent] > (UgentTask-2:null) > > > (logid:) Sending ping: Seq 15-2: { Cmd , MgmtId: -1, via: 15, > Ver: v1, > > > Flags: 11, [{"com.cloud.agent.api. > PingRoutingWithNwGroupsCommand":{" > > > newGroupStates":{},"_hostVmStateReport":{"r-37-VM":{"state": > > > "PowerOn","host":"kvm-test01-sb"}},"_gatewayAccessible": > > > > > > true,"_vnetAccessible":true,"hostType":"Routing","hostId":15,"wait":0}}] > } > > > 2018-05-23 12:59:04,876 DEBUG [cloud.agent.Agent] > (Agent-Handler-3:null) > > > (logid:) Received response: Seq 15-2: { Ans: , MgmtId: > 109952567336, via: > > > 15, Ver: v1, Flags: 100010, [{"com.cloud.agent.api.PingAns > > > wer":{"_command":{"hostType":"Routing","hostId":15,"wait":0} > ,"result":true,"wait":0}}] > > > } > > > 2018-05-23 12:59:05,726 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:10,727 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:12,767 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:13,532 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-2:null) (logid:275957d4) Processing > command: > > > com.cloud.agent.api.GetStorageStatsCommand > > > 2018-05-23 12:59:13,533 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-2:null) (logid:275957d4) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:59:13,533 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-2:null) (logid:275957d4) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:13,539 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-2:null) (logid:275957d4) Asking libvirt > to refresh > > > storage pool 2258aa76-7813-354d-b274-961fb337e716 > > > 2018-05-23 12:59:13,989 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-2:null) (logid:275957d4) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22173237211 Available: 1426660573184 > > > 2018-05-23 12:59:17,767 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Unable to logon > to > > > 169.254.2.247 > > > 2018-05-23 12:59:17,768 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/hypervisor/kvm/ > patchviasocket.py > > > -n r-37-VM -p %vpccidr=10.59.0.0/21%domain= > test.internal%dns1=8.8.8.8%priv > > > ategateway=None%template=domP%name=r-37-VM%eth0ip=169.254.2. > > > 247%eth0mask=255.255.0.0%type=vpcrouter%disable_rp_filter= > > > true%baremetalnotificationsecuritykey=Gs_GstLMdaO8CWiW_-wN71 > > > Xu0RhIGWdJ27Y_Q_lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILq > > > RjLA%baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e > > > 95U03mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ% > > > host=172.16.11.56%port=8080 > > > 2018-05-23 12:59:17,802 DEBUG [kvm.resource. 
> LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:17,803 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:17,804 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:22,804 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:38,239 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Could not > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:43,239 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Unable to logon > to > > > 169.254.2.247 > > > 2018-05-23 12:59:43,248 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/hypervisor/kvm/ > patchviasocket.py > > > -n r-37-VM -p %vpccidr=10.59.0.0/21%domain= > test.internal%dns1=8.8.8.8%priv > > > ategateway=None%template=domP%name=r-37-VM%eth0ip=169.254.2. > > > 247%eth0mask=255.255.0.0%type=vpcrouter%disable_rp_filter= > > > true%baremetalnotificationsecuritykey=Gs_GstLMdaO8CWiW_-wN71 > > > Xu0RhIGWdJ27Y_Q_lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILq > > > RjLA%baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e > > > 95U03mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ% > > > host=172.16.11.56%port=8080 > > > 2018-05-23 12:59:43,283 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:43,285 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:43,286 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.check.CheckSshCommand > > > 2018-05-23 12:59:43,286 DEBUG [resource.wrapper.LibvirtOvsVp > > > cRoutingPolicyConfigCommandWrapper] > (agentRequest-Handler-5:null) > > > (logid:ef8b353e) Ping command port, 169.254.2.247:3922 > > > 2018-05-23 12:59:43,286 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Trying to > connect to > > > 169.254.2.247 > > > 2018-05-23 12:59:43,287 DEBUG [resource.wrapper.LibvirtOvsVp > > > cRoutingPolicyConfigCommandWrapper] > (agentRequest-Handler-5:null) > > > (logid:ef8b353e) Ping command port succeeded for vm r-37-VM > > > 2018-05-23 12:59:43,287 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.GetDomRVersionCmd > > > 2018-05-23 12:59:43,289 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/network/domr/router_ > proxy.sh > > > get_template_version.sh 169.254.2.247 > > > 2018-05-23 12:59:43,456 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. 
> > > 2018-05-23 12:59:43,459 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.PlugNicCommand > > > 2018-05-23 12:59:43,459 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:43,474 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) > > > nic=[Nic:Public-82.119.177.188-vlan://233] > > > 2018-05-23 12:59:43,474 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/network/vnet/modifyvlan.sh > -v 233 > > > -p enp8s0f0 -b brenp8s0f0-233 -o add > > > 2018-05-23 12:59:43,504 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:47,202 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.routing.IpAssocVpcCommand > > > 2018-05-23 12:59:47,203 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:47,212 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) > matchPifFileInDirectory: > > > file name 'enp8s0f0.233' > > > 2018-05-23 12:59:47,213 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) getting > broadcast uri for > > > pif enp8s0f0.233 and bridge brenp8s0f0-233 > > > 2018-05-23 12:59:47,213 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Transforming > > > com.cloud.agent.api.routing.IpAssocVpcCommand to ConfigItems > > > 2018-05-23 12:59:47,581 DEBUG [resource.virtualnetwork. > VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > FileConfigItem, > > > copying 257 characters to ip_associations.json took 340ms > > > 2018-05-23 12:59:47,582 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/network/domr/router_ > proxy.sh > > > update_config.py 169.254.2.247 ip_associations.json > > > 2018-05-23 12:59:47,766 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Exit value is 1 > > > 2018-05-23 12:59:47,767 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) [INFO] > update_config.py :: > > > Processing incoming file => ip_associations.json[INFO] > Processing JSON file > > > ip_associations.json > > > 2018-05-23 12:59:47,768 DEBUG [resource.virtualnetwork. 
> VirtualRoutingResource] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Processing > ScriptConfigItem, > > > executing update_config.py ip_associations.json took 186ms > > > 2018-05-23 12:59:47,775 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-5:null) (logid:ef8b353e) Seq > 15-5506494969390563338: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 110, > > > [{"com.cloud.agent.api.StartAnswer":{"vm":{"id":37,"name":" > > > r-37-VM","type":"DomainRouter","cpus":1,"minSpeed":500," > > > maxSpeed":500,"minRam":268435456,"maxRam":268435456," > arch":"x86_64","os":"Debian > > > GNU/Linux 5.0 (64-bit)","platformEmulator":"Debian GNU/Linux > > > 5","bootArgs":" vpccidr=10.59.0.0/21 domain=test.internal > dns1=8.8.8.8 > > > privategateway=None template=domP name=r-37-VM > eth0ip=169.254.2.247 > > > eth0mask=255.255.0.0 type=vpcrouter disable_rp_filter=true > > > baremetalnotificationsecuritykey=Gs_GstLMdaO8CWiW_-wN71Xu0Rh > > > IGWdJ27Y_Q_lJ7vnFR44NkHMRknSyuMp4eRm9WXpasA7dmX9UtCRILqRjLA > > > baremetalnotificationapikey=6a0ehX9xAk7wTIm4knUtIY0Qd1e95U03 > > > mXqoNnZD42UMvKAsZJxpOgnVnwRNQC3pC32eIwyYnyWHBX1e8vY5cQ > host=172.16.11.56 > > > port=8080","enableHA":true,"limitCpuUse":false,"enableDynami > > > callyScaleVm":false,"vncPassword":"-Wf7MGedC0usYmBc > > > Cscv7A","vncAddr":"10.253.250.230","params":{},"uuid":" > > > 87a45b1c-bf35-431e-82a0-c0c961feab05","disks":[{"data" > :{"org.apache. > > > cloudstack.storage.to.VolumeObjectTO":{"uuid":" > fa3dba5d-364b-43ce- > > > 9ebf-d15a3324f765","volumeType":"ROOT","dataStore":{"org. > > > apache.cloudstack.storage.to.PrimaryDataStoreTO" > > > :{"uuid":"2258aa76-7813-354d-b274-961fb337e716","id":14,"po > > > olType":"RBD","host":"ceph-mon","path":"rbd","port":6789," > > > url":"RBD://ceph-mon/rbd/?ROLE=Primary&STOREUUID=2258aa76- > > > 7813-354d-b274-961fb337e716","isManaged":false}},"name":" > > > ROOT-37","size":349945344,"path":"fa3dba5d-364b-43ce- > > > 9ebf-d15a3324f765","volumeId":37,"vmName":"r-37-VM"," > > > accountId":2,"format":"RAW","provisioningType":"THIN","id":3 > > > 7,"deviceId":0,"bytesReadRate":0,"bytesWriteRate":0,"iopsRea > > > dRate":0,"iopsWriteRate":0,"hypervisorType":"KVM"}},"diskSeq > > > ":0,"path":"fa3dba5d-364b-43ce-9ebf-d15a3324f765","type" > > > :"ROOT","_details":{"storageHost":"ceph-mon","managed":" > > > false","storagePort":"6789","volumeSize":"349945344"}}]," > > > nics":[{"deviceId":0,"networkRateMbps":-1," > defaultNic":false,"pxeDisable" > > > :true,"nicUuid":"226f4319-b15c-4e48-8a74-6a28b574a992"," > > > uuid":"76d57f99-f4af-42b8-af0a-41f0a2cd0ba5","ip":"169. > > > 254.2.247","netmask":"255.255.0.0","gateway":"169.254.0.1"," > > > mac":"0e:00:a9:fe:02:f7","broadcastType":"LinkLocal","type": > > > "Control","isSecurityGroupEnabled":false}],"guestOsDetails": > > > {}},"result":true,"wait":0}},{"com.cloud.agent.api.check.Che > > > ckSshAnswer":{"result":true,"wait":0}},{"com.cloud.agent. > > > api.GetDomRVersionAnswer":{"templateVersion":"Cloudstack > Release 4.10.0 > > > Mon Oct 9 06:45:25 UTC 2017","scriptsVersion":"748f68 > > > e99144d82158a4c4d3dcbfef04\n","result":true,"details":"Cloudstack > Release > > > 4.10.0 Mon Oct 9 06:45:25 UTC 2017&748f68e99144d82158a4c4d3d > > > cbfef04\n","wait":0}},{"com.cloud.agent.api.PlugNicAnswer" > > > :{"result":true,"details":"success","wait":0}},{"com.cloud. 
> > > agent.api.routing.GroupAnswer":{"results":["null - success: > null","null - > > > failed: [INFO] update_config.py :: Processing incoming file => > > > ip_associations.json[INFO] Processing JSON file > > > ip_associations.json"],"result":false,"wait":0}},{"com. > > > cloud.agent.api.Answer":{"result":false,"details":"Stopped by > previous > > > failure","wait":0}},{"com.cloud.agent.api.Answer":{" > result":false,"details":"Stopped > > > by previous failure","wait":0}}] } > > > 2018-05-23 12:59:47,835 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Request:Seq > > > 15-5506494969390563341: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100011, [{"com.cloud.agent.api. > StopCommand":{"isProxy":false," > > > checkBeforeCleanup":false,"forceStop":false,"vmName":"r-37- > > > VM","executeInSequence":false,"wait":0}}] } > > > 2018-05-23 12:59:47,836 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.StopCommand > > > 2018-05-23 12:59:47,836 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:47,854 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/network/security_ > group.py > > > destroy_network_rules_for_vm --vmname r-37-VM --vif vnet0 > > > 2018-05-23 12:59:48,041 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:48,044 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Cleaning the > metadata of vm > > > snapshots of vm r-37-VM > > > 2018-05-23 12:59:48,045 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Try to stop the > vm at first > > > 2018-05-23 12:59:54,585 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) successfully > shut down vm > > > r-37-VM > > > 2018-05-23 12:59:54,586 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > /bin/bash -c ls > > > /sys/class/net/cloud0 > > > 2018-05-23 12:59:54,591 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:54,593 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > /bin/bash -c ls > > > /sys/class/net/cloud0/brif | tr ' > > > ' ' ' > > > 2018-05-23 12:59:54,598 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:54,600 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) unable to get a > vNet ID from > > > name cloud0 > > > 2018-05-23 12:59:54,600 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > /bin/bash -c ls > > > /sys/class/net/brenp8s0f0-233 > > > 2018-05-23 12:59:54,604 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. 
> > > 2018-05-23 12:59:54,605 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > /bin/bash -c ls > > > /sys/class/net/brenp8s0f0-233/brif | tr ' > > > ' ' ' > > > 2018-05-23 12:59:54,610 DEBUG [utils.script.Script] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:54,612 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/network/vnet/modifyvlan.sh > -o > > > delete -v 233 -p enp8s0f0 -b brenp8s0f0-233 > > > 2018-05-23 12:59:54,763 DEBUG [kvm.resource.BridgeVifDriver] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:54,766 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-3:null) (logid:ef8b353e) Seq > 15-5506494969390563341: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 10, > > > [{"com.cloud.agent.api.StopAnswer":{"result":true,"wait":0}}] > } > > > 2018-05-23 12:59:54,852 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Request:Seq > > > 15-5506494969390563342: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100011, [{"com.cloud.agent.api. > StopCommand":{"isProxy":false," > > > checkBeforeCleanup":false,"forceStop":false,"vmName":"r-37- > > > VM","executeInSequence":false,"wait":0}}] } > > > 2018-05-23 12:59:54,852 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Processing > command: > > > com.cloud.agent.api.StopCommand > > > 2018-05-23 12:59:54,853 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:54,854 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Can not find > KVM connection > > > for Instance: r-37-VM, continuing. > > > 2018-05-23 12:59:54,854 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: lxc:/// > > > 2018-05-23 12:59:54,854 INFO [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) No existing > libvirtd > > > connection found. Opening a new one > > > 2018-05-23 12:59:54,856 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Successfully > connected to > > > libvirt at: lxc:/// > > > 2018-05-23 12:59:54,857 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Can not find > LXC connection > > > for Instance: r-37-VM, continuing. > > > 2018-05-23 12:59:54,857 WARN [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Can not find a > connection > > > for Instance r-37-VM. Assuming the default connection. > > > 2018-05-23 12:59:54,857 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:54,858 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Failed to get > dom xml: > > > org.libvirt.LibvirtException: Domain not found: no domain with > matching > > > name 'r-37-VM' > > > 2018-05-23 12:59:54,859 DEBUG [kvm.resource. 
> LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Failed to get > dom xml: > > > org.libvirt.LibvirtException: Domain not found: no domain with > matching > > > name 'r-37-VM' > > > 2018-05-23 12:59:54,860 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Failed to get > dom xml: > > > org.libvirt.LibvirtException: Domain not found: no domain with > matching > > > name 'r-37-VM' > > > 2018-05-23 12:59:54,860 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Executing: > > > /usr/share/cloudstack-common/scripts/vm/network/security_ > group.py > > > destroy_network_rules_for_vm --vmname r-37-VM > > > 2018-05-23 12:59:55,043 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Execution is > successful. > > > 2018-05-23 12:59:55,046 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Failed to get > vm :Domain not > > > found: no domain with matching name 'r-37-VM' > > > 2018-05-23 12:59:55,046 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Try to stop the > vm at first > > > 2018-05-23 12:59:55,047 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) VM r-37-VM > doesn't exist, no > > > need to stop it > > > 2018-05-23 12:59:55,048 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-4:null) (logid:ef8b353e) Seq > 15-5506494969390563342: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 10, > > > [{"com.cloud.agent.api.StopAnswer":{"result":true,"wait":0}}] > } > > > 2018-05-23 12:59:56,940 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Request:Seq > > > 15-5506494969390563343: { Cmd , MgmtId: 109952567336, via: 15, > Ver: v1, > > > Flags: 100011, [{"org.apache.cloudstack. > storage.command.DeleteCommand":{" > > > data":{"org.apache.cloudstack.storage.to.VolumeObjectTO":{" > > > uuid":"fa3dba5d-364b-43ce-9ebf-d15a3324f765"," > > > volumeType":"ROOT","dataStore":{"org.apache.cloudstack. > storage.to > > > .PrimaryDataStoreTO":{"uuid":"2258aa76-7813-354d- > > > b274-961fb337e716","id":14,"poolType":"RBD","host":"ceph-mon > > > ","path":"rbd","port":6789,"url":"RBD://ceph-mon/rbd/?ROLE > > > =Primary&STOREUUID=2258aa76-7813-354d-b274-961fb337e716"," > > > isManaged":false}},"name":"ROOT-37","size":349945344," > > > path":"fa3dba5d-364b-43ce-9ebf-d15a3324f765","volumeId": > > > 37,"vmName":"r-37-VM","accountId":2,"format":"RAW","p > > > rovisioningType":"THIN","id":37,"deviceId":0," > hypervisorType":"KVM"}},"wait":0}}] > > > } > > > 2018-05-23 12:59:56,940 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Processing > command: > > > org.apache.cloudstack.storage.command.DeleteCommand > > > 2018-05-23 12:59:56,941 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Trying to fetch > storage pool > > > 2258aa76-7813-354d-b274-961fb337e716 from libvirt > > > 2018-05-23 12:59:56,952 DEBUG [kvm.resource.LibvirtConnection] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Looking for > libvirtd > > > connection at: qemu:///system > > > 2018-05-23 12:59:56,960 DEBUG [kvm.storage. 
> LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Succesfully > refreshed pool > > > 2258aa76-7813-354d-b274-961fb337e716 Capacity: 1500336095232 > Used: > > > 22173237211 Available: 1426660573184 > > > 2018-05-23 12:59:57,215 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Attempting to > remove volume > > > fa3dba5d-364b-43ce-9ebf-d15a3324f765 from pool > > > 2258aa76-7813-354d-b274-961fb337e716 > > > 2018-05-23 12:59:57,215 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Unprotecting > and Removing > > > RBD snapshots of image rbd/fa3dba5d-364b-43ce-9ebf-d15a3324f765 > prior to > > > removing the image > > > 2018-05-23 12:59:57,228 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Succesfully > connected to > > > Ceph cluster at ceph-mon:6789 > > > 2018-05-23 12:59:57,290 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Fetching list > of snapshots > > > of RBD image rbd/fa3dba5d-364b-43ce-9ebf-d15a3324f765 > > > 2018-05-23 12:59:57,302 INFO [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Succesfully > unprotected and > > > removed any remaining snapshots (0) of > rbd/fa3dba5d-364b-43ce-9ebf-d15a3324f765 > > > Continuing to remove the RBD image > > > 2018-05-23 12:59:57,302 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Closing image > and destroying > > > context > > > 2018-05-23 12:59:57,307 DEBUG [kvm.storage. > LibvirtStorageAdaptor] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Instructing > libvirt to > > > remove volume fa3dba5d-364b-43ce-9ebf-d15a3324f765 from pool > > > 2258aa76-7813-354d-b274-961fb337e716 > > > 2018-05-23 12:59:57,681 DEBUG [cloud.agent.Agent] > > > (agentRequest-Handler-1:null) (logid:ef8b353e) Seq > 15-5506494969390563343: > > > { Ans: , MgmtId: 109952567336, via: 15, Ver: v1, Flags: 10, > > > [{"com.cloud.agent.api.Answer":{"result":true,"wait":0}}] } > > > 2018-05-23 13:00:04,666 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (UgentTask-3:null) (logid:) Executing: > /usr/share/cloudstack-common/s > > > cripts/vm/network/security_group.py get_rule_logs_for_vms > > > 2018-05-23 13:00:04,770 DEBUG [kvm.resource. > LibvirtComputingResource] > > > (UgentTask-3:null) (logid:) Execution is successful. > > > 2018-05-23 13:00:04,772 DEBUG [kvm.resource.LibvirtConnection] > > > (UgentTask-3:null) (logid:) Looking for libvirtd connection at: > > > qemu:///system > > > 2018-05-23 13:00:04,774 DEBUG [cloud.agent.Agent] > (UgentTask-3:null) > > > (logid:) Sending ping: Seq 15-3: { Cmd , MgmtId: -1, via: 15, > Ver: v1, > > > Flags: 11, [{"com.cloud.agent.api. > PingRoutingWithNwGroupsCommand":{" > > > newGroupStates":{},"_hostVmStateReport":{},"_gatewayAccessib > > > le":true,"_vnetAccessible":true,"hostType":"Routing"," > hostId":15,"wait":0}}] > > > } > > > 2018-05-23 13:00:04,819 DEBUG [cloud.agent.Agent] > (Agent-Handler-3:null) > > > (logid:) Received response: Seq 15-3: { Ans: , MgmtId: > 109952567336, via: > > > 15, Ver: v1, Flags: 100010, [{"com.cloud.agent.api.PingAns > > > wer":{"_command":{"hostType":"Routing","hostId":15,"wait":0} > ,"result":true,"wait":0}}] > > > } > > > > > > any ideas ? > > > > > > king regards > > > > > > Ben > > > > > > > > > > > > > > > > > > > > Andrija Panic <[email protected]> hat am 21. 
Mai > 2018 um > > > > 13:17 geschrieben: > > > > > > > > NetworkManager ? I though it was advised to not run it... > > > > > > > > On 18 May 2018 at 16:11, Simon Weller > <[email protected]> > > > > wrote: > > > > > > > > > > > > > > > > > > > > > > > Ben, > > > > > > > > > > Can you put the KVM agent in debug mode and post > the logs? > > > > > > > > > > sed -i 's/INFO/DEBUG/g' > /etc/cloudstack/agent/log4j-cloud.xml > > > > > > > > > > Then restart the agent. > > > > > > > > > > * Si > > > > > > > > > > ________________________________ > > > > > From: Benjamin Naber < > [email protected]> > > > > > Sent: Friday, May 18, 2018 2:20 AM > > > > > To: Cloudstack Mailinglist > > > > > Subject: KVM Problem by deploying VPC > > > > > > > > > > Hi Together, > > > > > > > > > > im currently testing the Configuration of > KVM Hosts in our Test > > > > > environment. > > > > > When i try to deploy a VPC the Hypervisor > shows me the > > > > > following error: > > > > > > > > > > Hypervisor Log: > > > > > > > > > > May 18 09:12:08 kvm-test01-sb dbus[883]: > [system] Successfully > > > > > activated > > > > > service 'org.freedesktop.nm_dispatcher' > > > > > May 18 09:12:08 kvm-test01-sb systemd: > Started Network Manager > > > > > Script > > > > > Dispatcher Service. > > > > > May 18 09:12:08 kvm-test01-sb nm-dispatcher: > req:1 'up' > > > > > > > > > > > > > > > > > > > > > > [vnet0]: new > > > > > > > > > > > > > > > > > > > > > > request (4 scripts) > > > > > May 18 09:12:08 kvm-test01-sb nm-dispatcher: req:1 > 'up' > > > > > > > > > > > > > > > > > > > > > > [vnet0]: start > > > > > > > > > > > > > > > > > > > > > > running ordered scripts... > > > > > May 18 09:12:08 kvm-test01-sb libvirtd: 2018-05-18 > > > > > 07:12:08.667+0000: > > > > > 6251: warning : qemuDomainObjTaint:5378 : Domain > id=2 > > > > > name='r-31-VM' > > > > > uuid=ff22b439-e0d0-44d1-a3cc-8dd23afb82eb is > tainted: > > > > > high-privileges > > > > > May 18 09:12:08 kvm-test01-sb dbus[883]: [system] > Activating > > > > > via systemd: > > > > > service name='org.freedesktop.machine1' > > > > > unit='dbus-org.freedesktop. > > > > > machine1.service' > > > > > May 18 09:12:08 kvm-test01-sb systemd: Starting > Virtual > > > > > Machine and > > > > > Container Registration Service... > > > > > May 18 09:12:08 kvm-test01-sb dbus[883]: [system] > Successfully > > > > > activated > > > > > service 'org.freedesktop.machine1' > > > > > May 18 09:12:08 kvm-test01-sb systemd: Started > Virtual Machine > > > > > and > > > > > Container Registration Service. > > > > > May 18 09:12:08 kvm-test01-sb systemd-machined: > New machine > > > > > qemu-2-r-31-VM. > > > > > May 18 09:12:08 kvm-test01-sb systemd: Started > Virtual Machine > > > > > qemu-2-r-31-VM. > > > > > May 18 09:12:08 kvm-test01-sb systemd: Starting > Virtual Machine > > > > > qemu-2-r-31-VM. > > > > > May 18 09:12:08 kvm-test01-sb systemd: Unit > iscsi.service > > > > > cannot be > > > > > reloaded because it is inactive. 
> > > > > May 18 09:12:08 kvm-test01-sb kvm: 1 guest now active
> > > > > May 18 09:12:12 kvm-test01-sb kernel: kvm [48292]: vcpu0 disabled perfctr wrmsr: 0xc1 data 0xabcd
> > > > > May 18 09:12:44 kvm-test01-sb kernel: kvm [48292]: vcpu0 disabled perfctr wrmsr: 0xc1 data 0xabcd
> > > > > May 18 09:12:46 kvm-test01-sb sh: INFO [kvm.storage.LibvirtStorageAdaptor] (agentRequest-Handler-5:) (logid:b911ffae) Trying to fetch storage pool 2258aa76-7813-354d-b274-961fb337e716 from libvirt
> > > > > May 18 09:12:46 kvm-test01-sb sh: INFO [kvm.storage.LibvirtStorageAdaptor] (agentRequest-Handler-5:) (logid:b911ffae) Asking libvirt to refresh storage pool 2258aa76-7813-354d-b274-961fb337e716
> > > > > May 18 09:13:05 kvm-test01-sb NetworkManager[925]: <info> [1526627585.6454] manager: (enp8s0f0.233): new VLAN device (/org/freedesktop/NetworkManager/Devices/15)
> > > > > May 18 09:13:05 kvm-test01-sb NetworkManager[925]: <info> [1526627585.6470] device (enp8s0f0.233): carrier: link connected
> > > > > May 18 09:13:05 kvm-test01-sb kernel: brenp8s0f0-233: port 1(enp8s0f0.233) entered blocking state
> > > > > May 18 09:13:05 kvm-test01-sb kernel: brenp8s0f0-233: port 1(enp8s0f0.233) entered disabled state
> > > > > May 18 09:13:05 kvm-test01-sb kernel: device enp8s0f0.233 entered promiscuous mode
> > > > > May 18 09:13:05 kvm-test01-sb kernel: brenp8s0f0-233: port 2(vnet1) entered blocking state
> > > > > May 18 09:13:05 kvm-test01-sb kernel: brenp8s0f0-233: port 2(vnet1) entered disabled state
> > > > > May 18 09:13:05 kvm-test01-sb kernel: device vnet1 entered promiscuous mode
> > > > > May 18 09:13:05 kvm-test01-sb NetworkManager[925]: <info> [1526627585.6648] manager: (vnet1): new Tun device (/org/freedesktop/NetworkManager/Devices/16)
> > > > > May 18 09:13:05 kvm-test01-sb NetworkManager[925]: <info> [1526627585.6662] device (vnet1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
> > > > > May 18 09:13:05 kvm-test01-sb NetworkManager[925]: <info> [1526627585.6674] device (vnet1): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
> > > > > May 18 09:13:10 kvm-test01-sb kernel: cloud0: port 1(vnet0) entered disabled state
> > > > > May 18 09:13:10 kvm-test01-sb kernel: device vnet0 left promiscuous mode
> > > > > May 18 09:13:10 kvm-test01-sb kernel: cloud0: port 1(vnet0) entered disabled state
> > > > > May 18 09:13:10 kvm-test01-sb NetworkManager[925]: <info> [1526627590.5339] device (vnet0): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
> > > > > May 18 09:13:10 kvm-test01-sb NetworkManager[925]: <info> [1526627590.5342] device (cloud0): bridge port vnet0 was detached
> > > > > May 18 09:13:10 kvm-test01-sb NetworkManager[925]: <info> [1526627590.5342] device (vnet0): released from master device cloud0
> > > > > May 18 09:13:10 kvm-test01-sb dbus[883]: [system] Activating via systemd: service name='org.freedesktop.nm_dispatcher' unit='dbus-org.freedesktop.nm-dispatcher.service'
> > > > > May 18 09:13:10 kvm-test01-sb systemd: Starting Network Manager Script Dispatcher Service...
> > > > > May 18 09:13:10 kvm-test01-sb kernel: device vnet1 left promiscuous mode
> > > > > May 18 09:13:10 kvm-test01-sb kernel: brenp8s0f0-233: port 2(vnet1) entered disabled state
> > > > > May 18 09:13:10 kvm-test01-sb dbus[883]: [system] Successfully activated service 'org.freedesktop.nm_dispatcher'
> > > > > May 18 09:13:10 kvm-test01-sb systemd: Started Network Manager Script Dispatcher Service.
> > > > > May 18 09:13:10 kvm-test01-sb nm-dispatcher: req:1 'down' [vnet0]: new request (4 scripts)
> > > > > May 18 09:13:10 kvm-test01-sb nm-dispatcher: req:1 'down' [vnet0]: start running ordered scripts...
> > > > > May 18 09:13:10 kvm-test01-sb NetworkManager[925]: <info> [1526627590.5772] device (vnet1): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
> > > > > May 18 09:13:10 kvm-test01-sb NetworkManager[925]: <info> [1526627590.5772] device (vnet1): released from master device brenp8s0f0-233
> > > > > May 18 09:13:10 kvm-test01-sb libvirtd: 2018-05-18 07:13:10.596+0000: 1519: error : qemuMonitorIO:704 : internal error: End of file from qemu monitor
> > > > > May 18 09:13:10 kvm-test01-sb kvm: 0 guests now active
> > > > > May 18 09:13:10 kvm-test01-sb systemd-machined: Machine qemu-2-r-31-VM terminated.
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'ff22b439-e0d0-44d1-a3cc-8dd23afb82eb' (r-31-VM)
> > > > > May 18 09:13:11 kvm-test01-sb kernel: device enp8s0f0.233 left promiscuous mode
> > > > > May 18 09:13:11 kvm-test01-sb kernel: brenp8s0f0-233: port 1(enp8s0f0.233) entered disabled state
> > > > > May 18 09:13:11 kvm-test01-sb NetworkManager[925]: <info> [1526627591.2137] device (enp8s0f0.233): released from master device brenp8s0f0-233
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: INFO [kvm.resource.LibvirtConnection] (agentRequest-Handler-3:) (logid:94c38c31) No existing libvirtd connection found. Opening a new one
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: LXC Driver error : Domain not found: No domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: WARN [kvm.resource.LibvirtConnection] (agentRequest-Handler-3:) (logid:94c38c31) Can not find a connection for Instance r-31-VM. Assuming the default connection.
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > > May 18 09:13:11 kvm-test01-sb sh: libvirt: QEMU Driver error : Domain not found: no domain with matching name 'r-31-VM'
> > > > >
> > > > > CS Management Exception:
> > > > >
> > > > > 2018-05-18 09:13:11,284 ERROR [c.c.v.VirtualMachineManagerImpl] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Failed to start instance VM[DomainRouter|r-31-VM]
> > > > > com.cloud.utils.exception.ExecutionException: Unable to start VM:ff22b439-e0d0-44d1-a3cc-8dd23afb82eb due to error in finalizeStart, not retrying
> > > > > at com.cloud.vm.VirtualMachineManagerImpl.orchestrateStart(VirtualMachineManagerImpl.java:1099)
> > > > > at com.cloud.vm.VirtualMachineManagerImpl.orchestrateStart(VirtualMachineManagerImpl.java:4691)
> > > > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > > > at com.cloud.vm.VmWorkJobHandlerProxy.handleVmWorkJob(VmWorkJobHandlerProxy.java:107)
> > > > > at com.cloud.vm.VirtualMachineManagerImpl.handleVmWorkJob(VirtualMachineManagerImpl.java:4852)
> > > > > at com.cloud.vm.VmWorkJobDispatcher.runJob(VmWorkJobDispatcher.java:102)
> > > > > at org.apache.cloudstack.framework.jobs.impl.AsyncJobManagerImpl$5.runInContext(AsyncJobManagerImpl.java:558)
> > > > > at org.apache.cloudstack.managed.context.ManagedContextRunnable$1.run(ManagedContextRunnable.java:49)
> > > > > at org.apache.cloudstack.managed.context.impl.DefaultManagedContext$1.call(DefaultManagedContext.java:56)
> > > > > at org.apache.cloudstack.managed.context.impl.DefaultManagedContext.callWithContext(DefaultManagedContext.java:103)
> > > > > at org.apache.cloudstack.managed.context.impl.DefaultManagedContext.runWithContext(DefaultManagedContext.java:53)
> > > > > at org.apache.cloudstack.managed.context.ManagedContextRunnable.run(ManagedContextRunnable.java:46)
> > > > > at org.apache.cloudstack.framework.jobs.impl.AsyncJobManagerImpl$5.run(AsyncJobManagerImpl.java:506)
> > > > > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> > > > > at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > > > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> > > > > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> > > > > at java.lang.Thread.run(Thread.java:748)
> > > > > 2018-05-18 09:13:11,289 DEBUG [c.c.v.VirtualMachineManagerImpl] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Cleaning up resources for the vm VM[DomainRouter|r-31-VM] in Starting state
> > > > > 2018-05-18 09:13:11,291 DEBUG [c.c.a.t.Request] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Seq 10-6761873365520090721: Sending { Cmd , MgmtId: 109952567336, via: 10(kvm-test01-sb), Ver: v1, Flags: 100011, [{"com.cloud.agent.api.StopCommand":{"isProxy":false,"checkBeforeCleanup":false,"forceStop":false,"vmName":"r-31-VM","executeInSequence":false,"wait":0}}] }
> > > > > 2018-05-18 09:13:11,627 DEBUG [c.c.a.t.Request] (AgentManager-Handler-14:null) (logid:) Seq 10-6761873365520090721: Processing: { Ans: , MgmtId: 109952567336, via: 10, Ver: v1, Flags: 10, [{"com.cloud.agent.api.StopAnswer":{"result":true,"wait":0}}] }
> > > > > 2018-05-18 09:13:11,627 DEBUG [c.c.a.t.Request] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Seq 10-6761873365520090721: Received: { Ans: , MgmtId: 109952567336, via: 10(kvm-test01-sb), Ver: v1, Flags: 10, { StopAnswer } }
> > > > > 2018-05-18 09:13:11,639 DEBUG [c.c.n.g.ControlNetworkGuru] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Released nic: NicProfile[65-31-null-null-null
> > > > > 2018-05-18 09:13:11,646 DEBUG [c.c.v.VirtualMachineManagerImpl] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Successfully released network resources for the vm VM[DomainRouter|r-31-VM]
> > > > > 2018-05-18 09:13:11,646 DEBUG [c.c.v.VirtualMachineManagerImpl] (Work-Job-Executor-30:ctx-67604a44 job-169/job-170 ctx-ec9dfbf1) (logid:94c38c31) Successfully cleanued up resources for the vm VM[DomainRouter|r-31-VM] in Starting state
> > > > >
> > > > > Deployment of isolated networks and redundant VPCs works without any problems.
> > > > >
> > > > > CS Version: 4.10
> > > > > KVM Host: CentOS 7.4 minimal (libvirt-3.9.0-14.el7_5.2.x86_64, qemu-kvm-1.5.3-156.el7.x86_64, kernel-3.10.0-693.el7.x86_64)
> > > > >
> > > > > Does someone have an idea?
> > > > >
> > > > > Kind regards
> > > > >
> > > > > Ben
> > > > >
> > > > --
> > > > Andrija Panić
> > >
> > > ___________________________________________________
> > > Benjamin Naber • Holzstraße 7 • D-73650 Winterbach
> > > Mobil: +49 (0) 152.34087809
> > > E-Mail: [email protected]
> > >
> > > ___________________________________________________
> >
> > --
> > Andrija Panić
>
> ___________________________________________________
> Benjamin Naber • Holzstraße 7 • D-73650 Winterbach
> Mobil: +49 (0) 152.34087809
> E-Mail: [email protected]
> ___________________________________________________
>

--
Andrija Panić
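
A practical way to chase this further is the DEBUG trick Simon mentions above, then watching the agent log while the VPC router deploys. A minimal sketch (the log4j path is the one quoted above; the agent log path and service name are the stock CentOS package defaults, so adjust them if your install differs):

# switch the CloudStack KVM agent to DEBUG logging and restart it
sed -i 's/INFO/DEBUG/g' /etc/cloudstack/agent/log4j-cloud.xml
systemctl restart cloudstack-agent
# redeploy the VPC and watch which script the agent runs and whether it reports a non-zero exit
tail -f /var/log/cloudstack/agent/agent.log | grep -Ei 'Executing:|Exit value|fail|error'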
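Since the hypervisor log above shows NetworkManager picking up the agent-created VLAN and bridge devices (enp8s0f0.233, brenp8s0f0-233, vnet1), it may also be worth ruling NetworkManager out on the CentOS hosts. A rough sketch, assuming the classic network service is already configured for your NICs/bridges and that your NetworkManager version accepts wildcards in unmanaged-devices; the interface-name patterns below are just taken from the log, adjust to your own naming:

# Option A: drop NetworkManager entirely and use the legacy network service
systemctl stop NetworkManager
systemctl disable NetworkManager
systemctl enable network
systemctl start network

# Option B: keep NetworkManager but tell it to ignore the CloudStack-managed devices
cat > /etc/NetworkManager/conf.d/99-cloudstack-unmanaged.conf <<'EOF'
[keyfile]
unmanaged-devices=interface-name:cloud*;interface-name:brenp8s0f0*;interface-name:vnet*
EOF
systemctl restart NetworkManager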
