Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Yaniv Kaul
On Dec 2, 2016 2:11 PM, "Anton Marchukov"  wrote:

Hello Martin.

By outdated, do you mean the old libvirt? If so, is that the libvirt
available in CentOS 7.2? There is no 7.3 yet.


Right, this is the issue.
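
If it helps, a quick way to confirm is to ask the lago host's libvirt for its
version. A minimal sketch with libvirt-python (CentOS 7.2 ships a 1.2.x
libvirt; the exact release that first accepts /dev/urandom is an assumption
here, roughly 1.3.4):

# Print the host's libvirt version and warn if it likely predates
# /dev/urandom support as an RNG backend path.
import libvirt

# Assumed threshold, encoded as major*1000000 + minor*1000 + release.
URANDOM_MIN_VERSION = 1003004

conn = libvirt.openReadOnly('qemu:///system')
try:
    version = conn.getLibVersion()
    major, rest = divmod(version, 1000000)
    minor, release = divmod(rest, 1000)
    print('libvirt %d.%d.%d' % (major, minor, release))
    if version < URANDOM_MIN_VERSION:
        print('this libvirt is expected to reject /dev/urandom as an RNG source')
finally:
    conn.close()
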
Y.


Anton.

On Fri, Dec 2, 2016 at 1:07 PM, Martin Polednik 
wrote:

> On 02/12/16 10:55 +0100, Anton Marchukov wrote:
>
>> Hello All.
>>
>> Engine log can be viewed here:
>>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
>> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
>> el7/exported-artifacts/test_logs/basic-suite-master/post-004
>> _basic_sanity.py/lago-basic-suite-master-engine/_var_log_o
>> virt-engine/engine.log
>>
>> I see the following exception there:
>>
>> 2016-12-02 04:29:24,030-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
>> (ResponseWorker) [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
>> "ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
>> 2016-12-02 04:29:24,030-05 ERROR
>> [org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker)
>> [83b6b5d] Not able to update response for
>> "ec254aad-441b-47e7-a644-aebddcc1d62c"
>> 2016-12-02 04:29:24,041-05 DEBUG
>> [org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
>> (DefaultQuartzScheduler3) [47a31d72] Rescheduling
>> DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.ref
>> reshLightWeightData#-9223372036854775775
>> as there is no unfired trigger.
>> 2016-12-02 04:29:24,024-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
>> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
>> VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
>> at org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.g
>> et(FutureVDSCommand.java:73)
>> [vdsbroker.jar:]
>>
>> 
>>
>> 2016-12-02 04:29:24,042-05 ERROR
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for
>> VDSM response: Internal timeout occured
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
>> (default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
>> GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
>> VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
>> hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
>> vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-9
>> 188-685b6c64a2f5]'}),
>> log id: 58f448b8
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
>> destination:jms.topic.vdsm_requests
>> reply-to:jms.topic.vdsm_responses
>> content-length:105
>>
>>
>> Please note that this runs on localhost with a local bridge, so it is not
>> likely to be the network itself.
>>
>
> The main issue I see is that the VM run command has actually failed
> due to libvirt not accepting /dev/urandom as an RNG source[1]. This was
> done as an engine patch and, according to the git log, posted around Mon Nov
> 28. Also adding Jakub - this should either not happen from the engine's
> point of view, or the lago host is outdated.
>
> [1]
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
> el7/exported-artifacts/test_logs/basic-suite-master/post-004
> _basic_sanity.py/lago-basic-suite-master-host0/_var_log_vdsm/vdsm.log
>
>
> Anton.
>>
>> On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
>> wrote:
>>
>> FYI: the experimental flow for master currently fails to run a VM. The test
>>> times out after waiting 180 seconds:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
>>> master/3838/testReport/(root)/004_basic_sanity/vm_run/
>>>
>>> This is reproducible over the 23 runs that happened tonight, which sounds
>>> like a regression to me:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/
>>>
>>> I will update here with additional information once I find it.
>>>
>>> Last successful run was with this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
>>> in a method)
>>>
>>> Known to start failing around this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
>>> formatting)
>>>
>>> Please note that we do not have gating implemented yet, so anything that
>>> was merged between those patches might have caused this (not necessarily
>>> in the vdsm project).
>>>
>>> Anton.
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>>
>>>
>>
>> --
>> Anton Marchukov
>> Senior Software Engineer - RHEV CI - Red Hat
>>
>
> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>
>


-- 
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat



Re: [ovirt-devel] Request for oVirt Ansible modules testing feedback

2016-12-02 Thread Ondra Machacek

On 12/02/2016 03:49 PM, Sven Kieske wrote:

Hi,

which ovirt versions will these ansible modules support?
only 4.x or also 3.6?


Sorry, I forgot to mention: it's only 4.x. We don't plan to support 3.x.



This looks very promising, because I'm already automating template
provisioning for oVirt via Ansible (parts of it), and it would be cool
to also make the actual template creation of the VM part of the Ansible
playbooks, but I can't switch over to oVirt 4 yet.

I will try to get some time to test this out, if it supports the older
version.




___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


[ovirt-devel] oVirt Community Newsletter: November 2016

2016-12-02 Thread Brian Proffitt
oVirt's development is continuing on pace, as the calendar year draws to a
close and we get ready for a new year of development, evangelism, and
making virtual machine management a simple process for everyone.

Here's what happened in November of 2016:

-
Software Releases
-

oVirt 4.0.6 Third Release Candidate is now available
http://bit.ly/2gOxaDm

oVirt 4.1.0 First Beta Release is now available for testing
http://bit.ly/2gOtRfa


In the Community


Testing ovirt-engine changes without a real cluster
http://www.ovirt.org/blog/2016/11/testing-ovirt-changes-without-cluster/

Request for oVirt Ansible modules testing feedback
http://bit.ly/2gOByCg


Deep Dives and Technical Discussions


Important Open Source Cloud Products [German]
http://bit.ly/2gOvWbd

Red Hat IT runs OpenShift Container Platform on Red Hat Virtualization and
Ansible
http://red.ht/2ekHcLV

Keynote: Blurring the Lines: The Continuum Between Containers and VMs
[Video]
http://bit.ly/2gOzutT

Quick Guide: How to Plan Your Red Hat Virtualization 4.0 Deployment
http://red.ht/2emiQkN

A Decade of KVM [Chinese]
http://bit.ly/2gOAuyp

Expansion of iptables Rules for oVirt 4.0 [Russian]
http://bit.ly/2gOBaUf

-- 
Brian Proffitt
Principal Community Analyst
Open Source and Standards
@TheTechScribe
574.383.9BKP
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Request for oVirt Ansible modules testing feedback

2016-12-02 Thread Sven Kieske
Hi,

which ovirt versions will these ansible modules support?
only 4.x or also 3.6?

This looks very promising, because I'm already automating template
provisioning for oVirt via Ansible (parts of it), and it would be cool
to also make the actual template creation of the VM part of the Ansible
playbooks, but I can't switch over to oVirt 4 yet.

I will try to get some time to test this out, if it supports the older
version.


-- 
Mit freundlichen Grüßen / Regards

Sven Kieske

Systemadministrator
Mittwald CM Service GmbH & Co. KG
Königsberger Straße 6
32339 Espelkamp
T: +495772 293100
F: +495772 29
https://www.mittwald.de
Geschäftsführer: Robert Meyer
St.Nr.: 331/5721/1033, USt-IdNr.: DE814773217, HRA 6640, AG Bad Oeynhausen
Komplementärin: Robert Meyer Verwaltungs GmbH, HRB 13260, AG Bad Oeynhausen



signature.asc
Description: OpenPGP digital signature
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] ovirt-guest-agent job not building rpms

2016-12-02 Thread Sandro Bonazzola
On Thu, Dec 1, 2016 at 3:59 PM, Yedidyah Bar David  wrote:

> On Thu, Dec 1, 2016 at 4:32 PM, Yedidyah Bar David 
> wrote:
> > On Thu, Dec 1, 2016 at 1:59 PM, Sandro Bonazzola 
> wrote:
> >> Hi, please have a look at
> >>
> >> http://jenkins.ovirt.org/job/ovirt-guest-agent_master_
> build-artifacts-el7-x86_64/25/
> >>
> >> ovirt-guest-agent job not building rpms
> >
> > IIRC that's as planned. Check build-artifacts.sh. We build only on
> fedora.
>
> And IIRC Vinzenz builds it on fedora copr too. Note that usually
> the guest agent is not really part of oVirt - it's installed in
> guests, which are not supposed to have oVirt repos.
>
>
Ok, so we just need to drop the job :-)



> >
> >> Thanks,
> >> --
> >> Sandro Bonazzola
> >> Better technology. Faster innovation. Powered by community
> collaboration.
> >> See how it works at redhat.com
> >
> >
> >
> > --
> > Didi
>
>
>
> --
> Didi
>



-- 
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Request for oVirt Ansible modules testing feedback

2016-12-02 Thread Ondra Machacek

Hello all,

I would like to kindly ask everyone who is an Ansible or oVirt user to test
the new Ansible oVirt modules. For everyone who is familiar with Ansible and
oVirt, this[1] describes the steps you need to follow to set up the oVirt
modules library and start using them (most of these modules will be available
in Ansible 2.3; some of them are already in 2.2).


If you have any issues setting this up, please contact me and I will do my
best to help you.

If you hit an issue which you think is a bug, please open an issue
here[2]. Please note that Ansible is merging its repositories, so from
next week the code will actually live here[3]. If you are missing
anything, please open an issue as well, or just contact me, and I will
fix it. You are also very welcome to send PRs with fixes.

For those who don't have a testing environment to test against, I've
created a Vagrant project that will deploy an oVirt instance for you
using Ansible playbooks. You can find out how to use it here[4].

The repository also contains a few examples[5], so you don't have to
copy-paste them from the source.
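
As far as I know, the modules talk to the engine through the oVirt Python SDK
v4, so a quick SDK login is a handy sanity check of your environment before
testing the modules themselves. A minimal sketch - the engine URL, username
and password below are placeholders:

# Verify that the oVirt Python SDK v4 can reach and authenticate against
# your engine; the Ansible oVirt modules need the same prerequisites.
import ovirtsdk4 as sdk

connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',  # placeholder engine URL
    username='admin@internal',
    password='secret',
    insecure=True,  # skip TLS verification; acceptable only for throwaway test setups
)
try:
    # Listing a few VMs proves that authentication and API access work.
    vms_service = connection.system_service().vms_service()
    for vm in vms_service.list(max=5):
        print(vm.name, vm.status)
finally:
    connection.close()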

Thanks to everyone for reading this, and for any feedback,
Ondra

[1] https://github.com/machacekondra/ovirt-tests/releases/tag/0.1
[2] https://github.com/ansible/ansible-modules-extras/issues
[3] https://github.com/ansible/ansible
[4] https://github.com/machacekondra/ovirt-tests
[5] https://github.com/machacekondra/ovirt-tests/tree/master/examples
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Anton Marchukov
Hello Martin.

By outdated, do you mean the old libvirt? If so, is that the libvirt
available in CentOS 7.2? There is no 7.3 yet.

Anton.

On Fri, Dec 2, 2016 at 1:07 PM, Martin Polednik 
wrote:

> On 02/12/16 10:55 +0100, Anton Marchukov wrote:
>
>> Hello All.
>>
>> Engine log can be viewed here:
>>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
>> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
>> el7/exported-artifacts/test_logs/basic-suite-master/post-
>> 004_basic_sanity.py/lago-basic-suite-master-engine/_var_log_
>> ovirt-engine/engine.log
>>
>> I see the following exception there:
>>
>> 2016-12-02 04:29:24,030-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
>> (ResponseWorker) [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
>> "ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
>> 2016-12-02 04:29:24,030-05 ERROR
>> [org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker)
>> [83b6b5d] Not able to update response for
>> "ec254aad-441b-47e7-a644-aebddcc1d62c"
>> 2016-12-02 04:29:24,041-05 DEBUG
>> [org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
>> (DefaultQuartzScheduler3) [47a31d72] Rescheduling
>> DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.ref
>> reshLightWeightData#-9223372036854775775
>> as there is no unfired trigger.
>> 2016-12-02 04:29:24,024-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
>> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
>> VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
>> at org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.
>> get(FutureVDSCommand.java:73)
>> [vdsbroker.jar:]
>>
>> 
>>
>> 2016-12-02 04:29:24,042-05 ERROR
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for
>> VDSM response: Internal timeout occured
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
>> (default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
>> GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
>> VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
>> hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
>> vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-
>> 9188-685b6c64a2f5]'}),
>> log id: 58f448b8
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
>> destination:jms.topic.vdsm_requests
>> reply-to:jms.topic.vdsm_responses
>> content-length:105
>>
>>
>> Please note that this runs on localhost with a local bridge, so it is not
>> likely to be the network itself.
>>
>
> The main issue I see is that the VM run command has actually failed
> due to libvirt not accepting /dev/urandom as an RNG source[1]. This was
> done as an engine patch and, according to the git log, posted around Mon Nov
> 28. Also adding Jakub - this should either not happen from the engine's
> point of view, or the lago host is outdated.
>
> [1]
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
> el7/exported-artifacts/test_logs/basic-suite-master/post-
> 004_basic_sanity.py/lago-basic-suite-master-host0/_var_log_vdsm/vdsm.log
>
>
> Anton.
>>
>> On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
>> wrote:
>>
>> FYI: the experimental flow for master currently fails to run a VM. The test
>>> times out after waiting 180 seconds:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
>>> master/3838/testReport/(root)/004_basic_sanity/vm_run/
>>>
>>> This is reproducible over the 23 runs that happened tonight, which sounds
>>> like a regression to me:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/
>>>
>>> I will update here with additional information once I find it.
>>>
>>> Last successful run was with this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
>>> in a method)
>>>
>>> Known to start failing around this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
>>> formatting)
>>>
>>> Please note that we do not have gating implemented yet, so anything that
>>> was merged between those patches might have caused this (not necessarily
>>> in the vdsm project).
>>>
>>> Anton.
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>>
>>>
>>
>> --
>> Anton Marchukov
>> Senior Software Engineer - RHEV CI - Red Hat
>>
>
> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>
>


-- 
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/m

Re: [ovirt-devel] python path conflicts issues while running python scripts from vdsm verbs

2016-12-02 Thread Nir Soffer
On Dec 2, 2016 at 10:04 AM, "Dan Kenigsberg"  wrote:

On Thu, Dec 01, 2016 at 11:32:07PM -0500, Ramesh Nachimuthu wrote:
> Thank you all for helping me with this issue. Finally, the vdsm.gluster
> modules have been moved to Python site-packages.
>

Yay!

There's a bit more work to be done:
https://travis-ci.org/oVirt/vdsm/jobs/180630800 is failing since its
docker images do not yet include python-magic and python-blivet.

Nir, would you take https://gerrit.ovirt.org/#/c/67727/ and rebuild them?



Sure, thanks.
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Eyal Edri
The fix for that was a new Python SDK build for V4, if it's the same issue
where log-collector fails.
Adding Juan.

On Fri, Dec 2, 2016 at 2:07 PM, Martin Polednik 
wrote:

> On 02/12/16 10:55 +0100, Anton Marchukov wrote:
>
>> Hello All.
>>
>> Engine log can be viewed here:
>>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
>> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
>> el7/exported-artifacts/test_logs/basic-suite-master/post-
>> 004_basic_sanity.py/lago-basic-suite-master-engine/_var_log_
>> ovirt-engine/engine.log
>>
>> I see the following exception there:
>>
>> 2016-12-02 04:29:24,030-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
>> (ResponseWorker) [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
>> "ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
>> 2016-12-02 04:29:24,030-05 ERROR
>> [org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker)
>> [83b6b5d] Not able to update response for
>> "ec254aad-441b-47e7-a644-aebddcc1d62c"
>> 2016-12-02 04:29:24,041-05 DEBUG
>> [org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
>> (DefaultQuartzScheduler3) [47a31d72] Rescheduling
>> DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.ref
>> reshLightWeightData#-9223372036854775775
>> as there is no unfired trigger.
>> 2016-12-02 04:29:24,024-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
>> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
>> VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
>> at org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.
>> get(FutureVDSCommand.java:73)
>> [vdsbroker.jar:]
>>
>> 
>>
>> 2016-12-02 04:29:24,042-05 ERROR
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for
>> VDSM response: Internal timeout occured
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
>> (default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
>> GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
>> VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
>> hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
>> vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-
>> 9188-685b6c64a2f5]'}),
>> log id: 58f448b8
>> 2016-12-02 04:29:24,044-05 DEBUG
>> [org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
>> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
>> destination:jms.topic.vdsm_requests
>> reply-to:jms.topic.vdsm_responses
>> content-length:105
>>
>>
>> Please note that this runs on localhost with a local bridge, so it is not
>> likely to be the network itself.
>>
>
> The main issue I see is that the VM run command has actually failed
> due to libvirt not accepting /dev/urandom as an RNG source[1]. This was
> done as an engine patch and, according to the git log, posted around Mon Nov
> 28. Also adding Jakub - this should either not happen from the engine's
> point of view, or the lago host is outdated.
>
> [1]
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_ma
> ster/3838/artifact/exported-artifacts/basic_suite_master.sh-
> el7/exported-artifacts/test_logs/basic-suite-master/post-
> 004_basic_sanity.py/lago-basic-suite-master-host0/_var_log_vdsm/vdsm.log
>
>
> Anton.
>>
>> On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
>> wrote:
>>
>> FYI: the experimental flow for master currently fails to run a VM. The test
>>> times out after waiting 180 seconds:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
>>> master/3838/testReport/(root)/004_basic_sanity/vm_run/
>>>
>>> This is reproducible over the 23 runs that happened tonight, which sounds
>>> like a regression to me:
>>>
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/
>>>
>>> I will update here with additional information once I find it.
>>>
>>> Last successful run was with this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
>>> in a method)
>>>
>>> Known to start failing around this patch:
>>>
>>> https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
>>> formatting)
>>>
>>> Please note that we do not have gating implemented yet, so anything that
>>> was merged between those patches might have caused this (not necessarily
>>> in the vdsm project).
>>>
>>> Anton.
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>>
>>>
>>
>> --
>> Anton Marchukov
>> Senior Software Engineer - RHEV CI - Red Hat
>>
>
> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>
>
>


-- 
Eyal Edri
Associate Manager
RHV DevOps
EMEA ENG Virt

Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Martin Polednik

On 02/12/16 10:55 +0100, Anton Marchukov wrote:

Hello All.

Engine log can be viewed here:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/artifact/exported-artifacts/basic_suite_master.sh-el7/exported-artifacts/test_logs/basic-suite-master/post-004_basic_sanity.py/lago-basic-suite-master-engine/_var_log_ovirt-engine/engine.log

I see the following exception there:

2016-12-02 04:29:24,030-05 DEBUG
[org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
(ResponseWorker) [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
"ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
2016-12-02 04:29:24,030-05 ERROR
[org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker)
[83b6b5d] Not able to update response for
"ec254aad-441b-47e7-a644-aebddcc1d62c"
2016-12-02 04:29:24,041-05 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler3) [47a31d72] Rescheduling
DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.refreshLightWeightData#-9223372036854775775
as there is no unfired trigger.
2016-12-02 04:29:24,024-05 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
at 
org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.get(FutureVDSCommand.java:73)
[vdsbroker.jar:]



2016-12-02 04:29:24,042-05 ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for
VDSM response: Internal timeout occured
2016-12-02 04:29:24,044-05 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
(default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-9188-685b6c64a2f5]'}),
log id: 58f448b8
2016-12-02 04:29:24,044-05 DEBUG
[org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
destination:jms.topic.vdsm_requests
reply-to:jms.topic.vdsm_responses
content-length:105


Please note that this runs on localhost with a local bridge, so it is not
likely to be the network itself.


The main issue I see is that the VM run command has actually failed
due to libvirt not accepting /dev/urandom as an RNG source[1]. This was
done as an engine patch and, according to the git log, posted around Mon Nov
28. Also adding Jakub - this should either not happen from the engine's
point of view, or the lago host is outdated.

[1]
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/artifact/exported-artifacts/basic_suite_master.sh-el7/exported-artifacts/test_logs/basic-suite-master/post-004_basic_sanity.py/lago-basic-suite-master-host0/_var_log_vdsm/vdsm.log
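
Purely as an illustration (not the actual engine logic): if the engine or a
test suite knows the host's libvirt version, the RNG source it hands out could
be gated on that, rather than assuming /dev/urandom support. The version
threshold below is an assumption (roughly the first upstream release that
accepts /dev/urandom):

URANDOM_MIN_LIBVIRT = (1, 3, 4)  # assumed threshold, not a confirmed version

def pick_rng_source(host_libvirt_version):
    """Return an RNG backend path the host's libvirt should accept."""
    if host_libvirt_version >= URANDOM_MIN_LIBVIRT:
        return '/dev/urandom'
    return '/dev/random'  # safe fallback for older hosts, e.g. CentOS 7.2

print(pick_rng_source((1, 2, 17)))  # -> /dev/random
print(pick_rng_source((2, 0, 0)))   # -> /dev/urandom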


Anton.

On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
wrote:


FYI: the experimental flow for master currently fails to run a VM. The test
times out after waiting 180 seconds:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
master/3838/testReport/(root)/004_basic_sanity/vm_run/

This is reproducible over the 23 runs that happened tonight, which sounds like
a regression to me:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/

I will update here with additional information once I find it.

Last successful run was with this patch:

https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
in a method)

Known to start failing around this patch:

https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
formatting)

Please note that we do not have gating implemented yet, so anything that was
merged between those patches might have caused this (not necessarily in the
vdsm project).

Anton.
--
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat





--
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat



___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Piotr Kliczewski
Anton,

I see following event in the log:

2016-12-02 04:31:12,527-05 DEBUG
[org.ovirt.engine.core.vdsbroker.monitoring.EventVmStatsRefresher]
(ForkJoinPool-1-worker-4) [83b6b5d] processing event for host
lago-basic-suite-master-host0 data:
39710f89-9fa2-423e-9fa8-1448ca51f166:
status = Down
timeOffset = 0
exitReason = 1
exitMessage = XML error: file '/dev/urandom' is not a supported random source
exitCode = 1

and here is the vdsm log:

2016-12-02 04:31:10,618 ERROR (vm/39710f89) [virt.vm]
(vmId='39710f89-9fa2-423e-9fa8-1448ca51f166') The vm start process
failed (vm:613)
Traceback (most recent call last):
  File "/usr/share/vdsm/virt/vm.py", line 549, in _startUnderlyingVm
self._run()
  File "/usr/share/vdsm/virt/vm.py", line 1980, in _run
self._connection.createXML(domxml, flags),
  File "/usr/lib/python2.7/site-packages/vdsm/libvirtconnection.py",
line 128, in wrapper
ret = f(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/utils.py", line 936, in wrapper
return func(inst, *args, **kwargs)
  File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML
if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self)
libvirtError: XML error: file '/dev/urandom' is not a supported random source
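
For reference, the failure can be reproduced outside vdsm with a minimal
sketch like the one below (assuming libvirt-python on a host whose libvirt
still whitelists only /dev/random and /dev/hwrng as RNG sources, which seems
to be the case for the libvirt in CentOS 7.2):

# Try to start a throwaway domain whose only device is a virtio RNG backed
# by /dev/urandom; on the old libvirt this is expected to fail at XML parse
# time with the same "not a supported random source" error as in the vdsm
# log above.
import libvirt

DOMAIN_XML = """
<domain type='qemu'>
  <name>rng-urandom-test</name>
  <memory unit='MiB'>64</memory>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <rng model='virtio'>
      <backend model='random'>/dev/urandom</backend>
    </rng>
  </devices>
</domain>
"""

conn = libvirt.open('qemu:///system')
try:
    dom = conn.createXML(DOMAIN_XML, 0)
    print('started:', dom.name())  # only reached on a newer libvirt
    dom.destroy()
except libvirt.libvirtError as e:
    print('libvirt rejected the domain:', e)
finally:
    conn.close()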

@Martin, is this a known issue?

Thanks,
Piotr

On Fri, Dec 2, 2016 at 10:55 AM, Anton Marchukov  wrote:
> Hello All.
>
> Engine log can be viewed here:
>
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/artifact/exported-artifacts/basic_suite_master.sh-el7/exported-artifacts/test_logs/basic-suite-master/post-004_basic_sanity.py/lago-basic-suite-master-engine/_var_log_ovirt-engine/engine.log
>
> I see the following exception there:
>
> 2016-12-02 04:29:24,030-05 DEBUG
> [org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker] (ResponseWorker)
> [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
> "ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
> 2016-12-02 04:29:24,030-05 ERROR
> [org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker) [83b6b5d] Not
> able to update response for "ec254aad-441b-47e7-a644-aebddcc1d62c"
> 2016-12-02 04:29:24,041-05 DEBUG
> [org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
> (DefaultQuartzScheduler3) [47a31d72] Rescheduling
> DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.refreshLightWeightData#-9223372036854775775
> as there is no unfired trigger.
> 2016-12-02 04:29:24,024-05 DEBUG
> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default task-12)
> [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
> VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
> at
> org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.get(FutureVDSCommand.java:73)
> [vdsbroker.jar:]
>

This issue may occur during setupNetworks due to the nature of the
operation. I need to update the message because it is not correct.

> 
>
> 2016-12-02 04:29:24,042-05 ERROR
> [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default task-12)
> [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for VDSM response:
> Internal timeout occured
> 2016-12-02 04:29:24,044-05 DEBUG
> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
> (default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
> GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
> VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
> hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
> vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-9188-685b6c64a2f5]'}),
> log id: 58f448b8
> 2016-12-02 04:29:24,044-05 DEBUG
> [org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
> task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
> destination:jms.topic.vdsm_requests
> reply-to:jms.topic.vdsm_responses
> content-length:105
>
>
> Please note that this runs on localhost with a local bridge, so it is not
> likely to be the network itself.
>
> Anton.
>
> On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
> wrote:
>>
>> FYI: the experimental flow for master currently fails to run a VM. The test
>> times out after waiting 180 seconds:
>>
>>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/testReport/(root)/004_basic_sanity/vm_run/
>>
>> This is reproducible over the 23 runs that happened tonight, which sounds
>> like a regression to me:
>>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/
>>
>> I will update here with additional information once I find it.
>>
>> Last successful run was with this patch:
>>
>> https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
>> in a method)
>>
>> Known to start failing around this patch:
>>
>> https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
>> formatting)
>>
>> Please note that we do not have gating implemented yet, so anything that
>> was merged between those patches might have caused this (not necessarily
>> in the vdsm project).

Re: [ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Anton Marchukov
Hello All.

Engine log can be viewed here:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/artifact/exported-artifacts/basic_suite_master.sh-el7/exported-artifacts/test_logs/basic-suite-master/post-004_basic_sanity.py/lago-basic-suite-master-engine/_var_log_ovirt-engine/engine.log

I see the following exception there:

2016-12-02 04:29:24,030-05 DEBUG
[org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
(ResponseWorker) [83b6b5d] Message received: {"jsonrpc": "2.0", "id":
"ec254aad-441b-47e7-a644-aebddcc1d62c", "result": true}
2016-12-02 04:29:24,030-05 ERROR
[org.ovirt.vdsm.jsonrpc.client.JsonRpcClient] (ResponseWorker)
[83b6b5d] Not able to update response for
"ec254aad-441b-47e7-a644-aebddcc1d62c"
2016-12-02 04:29:24,041-05 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler3) [47a31d72] Rescheduling
DEFAULT.org.ovirt.engine.core.bll.gluster.GlusterSyncJob.refreshLightWeightData#-9223372036854775775
as there is no unfired trigger.
2016-12-02 04:29:24,024-05 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Exception:
org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
VDSGenericException: VDSNetworkException: Timeout during xml-rpc call
at 
org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.get(FutureVDSCommand.java:73)
[vdsbroker.jar:]



2016-12-02 04:29:24,042-05 ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] Timeout waiting for
VDSM response: Internal timeout occured
2016-12-02 04:29:24,044-05 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
(default task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] START,
GetCapabilitiesVDSCommand(HostName = lago-basic-suite-master-host0,
VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
hostId='5eb7019e-28a3-4f93-9188-685b6c64a2f5',
vds='Host[lago-basic-suite-master-host0,5eb7019e-28a3-4f93-9188-685b6c64a2f5]'}),
log id: 58f448b8
2016-12-02 04:29:24,044-05 DEBUG
[org.ovirt.vdsm.jsonrpc.client.reactors.stomp.impl.Message] (default
task-12) [d932871a-af4f-4fc9-9ee5-f7a0126a7b85] SEND
destination:jms.topic.vdsm_requests
reply-to:jms.topic.vdsm_responses
content-length:105


Please note that this runs on localhost with a local bridge, so it is not
likely to be the network itself.

Anton.

On Fri, Dec 2, 2016 at 10:43 AM, Anton Marchukov 
wrote:

> FYI: the experimental flow for master currently fails to run a VM. The test
> times out after waiting 180 seconds:
>
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
> master/3838/testReport/(root)/004_basic_sanity/vm_run/
>
> This is reproducible over the 23 runs that happened tonight, which sounds
> like a regression to me:
>
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/
>
> I will update here with additional information once I find it.
>
> Last successful run was with this patch:
>
> https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup
> in a method)
>
> Known to start failing around this patch:
>
> https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
> formatting)
>
> Please note that we do not have gating implemented yet, so anything that
> was merged between those patches might have caused this (not necessarily
> in the vdsm project).
>
> Anton.
> --
> Anton Marchukov
> Senior Software Engineer - RHEV CI - Red Hat
>
>


-- 
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Experimental Flow for Master Fails to Run a VM

2016-12-02 Thread Anton Marchukov
FYI: the experimental flow for master currently fails to run a VM. The test
times out after waiting 180 seconds:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/3838/testReport/(root)/004_basic_sanity/vm_run/
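
For context, the failing check is essentially a poll-until-up with a 180
second deadline. A rough sketch of that kind of wait (this is not the actual
ovirt-system-tests code; it uses the Python SDK v4, and the engine URL,
credentials and VM name are placeholders):

import time
import ovirtsdk4 as sdk
import ovirtsdk4.types as types

TIMEOUT = 180  # seconds, matching the timeout mentioned above

connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',  # placeholder
    username='admin@internal',
    password='secret',
    insecure=True,
)
try:
    # Find the VM, start it, then poll its status until it is up or we give up.
    vms_service = connection.system_service().vms_service()
    vm = vms_service.list(search='name=vm0')[0]
    vm_service = vms_service.vm_service(vm.id)
    vm_service.start()

    deadline = time.time() + TIMEOUT
    while time.time() < deadline:
        if vm_service.get().status == types.VmStatus.UP:
            print('VM is up')
            break
        time.sleep(5)
    else:
        raise AssertionError('VM did not come up within %s seconds' % TIMEOUT)
finally:
    connection.close()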

This is reproducible over the 23 runs that happened tonight, which sounds
like a regression to me:

http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/

I will update here with additional information once I find it.

Last successful run was with this patch:

https://gerrit.ovirt.org/#/c/66416/ (vdsm: API: move vm parameters fixup in
a method)

Known to start failing around this patch:

https://gerrit.ovirt.org/#/c/67647/ (vdsmapi: fix a typo in string
formatting)

Please note that we do not have gating implemented yet, so anything that was
merged between those patches might have caused this (not necessarily in the
vdsm project).

Anton.
-- 
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] python path conflicts issues while running python scripts from vdsm verbs

2016-12-02 Thread Dan Kenigsberg
On Thu, Dec 01, 2016 at 11:32:07PM -0500, Ramesh Nachimuthu wrote:
> Thank you all for helping me with this issue. Finally, the vdsm.gluster
> modules have been moved to Python site-packages.
> 

Yay!

There's a bit more work to be done:
https://travis-ci.org/oVirt/vdsm/jobs/180630800 is failing since its
docker images do not yet include python-magic and python-blivet.
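
A quick way to verify an image has everything needed is a small import check
(the module names are assumed to be the usual ones for python-magic and
python-blivet):

# Check that the relocated vdsm.gluster package and its new dependencies
# import cleanly inside the CI docker image.
import importlib

for name in ('magic', 'blivet', 'vdsm.gluster'):
    try:
        module = importlib.import_module(name)
        print('%s -> %s' % (name, getattr(module, '__file__', 'built-in')))
    except ImportError as exc:
        print('MISSING: %s (%s)' % (name, exc))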

Nir, would you take https://gerrit.ovirt.org/#/c/67727/ and rebuild them?
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel