Thanks, MattT, MattR and Steve. Since that last update, 4 runs have failed:

http://logs.openstack.org/20/396620/2/gate/gate-tempest-dsvm-neutron-full-ubuntu-xenial/e82ace8/
* tempest.api.compute.admin.test_migrations.MigrationsAdminTest - test_resize_server_revert_deleted_flavor
* tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON - test_create_list_show_delete_interfaces

http://logs.openstack.org/45/423645/19/gate/gate-grenade-dsvm-neutron-dvr-multinode-ubuntu-xenial/61dbd0e/
http://logs.openstack.org/45/423645/19/gate/gate-grenade-dsvm-neutron-dvr-multinode-ubuntu-xenial/61dbd0e/
* Both runs failed with the following:
  "Failed to fetch http://mirror.regionone.osic-cloud1.openstack.org/ubuntu/pool/main/o/openssl/openssl_1.0.2g-1ubuntu4.6_amd64.deb"

* http://logs.openstack.org/04/427004/2/gate/gate-keystone-python35-db/1502dbe/console.html
  35 minutes with no log output, then the job timed out
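
For anyone curious, these were captured the same way as the earlier batch, with
the script referenced as [1][2] in my mail quoted below. A minimal sketch of the
general approach (not the actual script, and assuming Zuul v2's public
status.json endpoint and field layout) looks roughly like this:

    # Sketch only: poll Zuul's status feed and list gate jobs whose result is
    # FAILURE. The endpoint and JSON field names are assumptions based on the
    # Zuul v2 status page; the real script is in the gist referenced below.
    import json
    import urllib.request

    STATUS_URL = "http://zuul.openstack.org/status.json"  # assumed endpoint

    def failed_gate_jobs():
        with urllib.request.urlopen(STATUS_URL) as resp:
            status = json.load(resp)
        for pipeline in status.get("pipelines", []):
            if pipeline.get("name") != "gate":
                continue
            for queue in pipeline.get("change_queues", []):
                for head in queue.get("heads", []):
                    for change in head:
                        for job in change.get("jobs", []):
                            if job.get("result") == "FAILURE":
                                yield change.get("id"), job.get("name"), job.get("url")

    if __name__ == "__main__":
        for change_id, job_name, log_url in failed_gate_jobs():
            print(change_id, job_name, log_url)

Treat that as illustrative only; the actual script in the gist is the source of
truth for how these lists were produced.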

Thanks,
Dims

On Tue, Jan 31, 2017 at 3:32 PM, Matt Riedemann <mriede...@gmail.com> wrote:
> On 1/31/2017 11:49 AM, Davanum Srinivas wrote:
>>
>> Folks,
>>
>> Here's the list of jobs that failed in the gate queue since around
>> 10:00 AM today, captured with my script[1][2]. All of these jobs
>> failed on just one bad test.
>>
>>
>> http://logs.openstack.org/48/426448/2/gate/gate-tempest-dsvm-neutron-full-ubuntu-xenial/ecb3d0a/
>>   - tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON
>>
>> http://logs.openstack.org/48/426448/2/gate/gate-tempest-dsvm-neutron-full-ssh/71f6c8c/
>>   - tempest.api.compute.admin.test_servers.ServersAdminTestJSON
>>
>> http://logs.openstack.org/48/376548/8/gate/gate-tempest-dsvm-neutron-full-ubuntu-xenial/cf3028b/
>>   - tempest.api.compute.servers.test_delete_server.DeleteServersTestJSON
>>
>> http://logs.openstack.org/68/417668/8/gate/gate-tempest-dsvm-neutron-full-ssh/27bda02/
>>   - tempest.api.compute.volumes.test_attach_volume.AttachVolumeShelveTestJSON
>>
>> http://logs.openstack.org/48/423548/11/gate/gate-keystone-python27-db-ubuntu-xenial/a1f55ca/
>>   - keystone.tests.unit.test_v3_auth.TestMFARules
>>
>> http://logs.openstack.org/61/424961/1/gate/gate-tempest-dsvm-cells-ubuntu-xenial/8a1f9e7/
>>   - tempest.api.compute.admin.test_servers.ServersAdminTestJSON
>>
>> http://logs.openstack.org/23/426823/3/gate/gate-tempest-dsvm-neutron-full-ubuntu-xenial/0204168/
>>   - tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps
>>
>> So our gate queue is now 36 deep, with changes repeatedly running for a
>> little more than 4 hours each. Can folks look deeper please?
>>
>> Thanks,
>> Dims
>>
>> [1] https://gist.github.com/dims/54b391bd5964d3d208113b16766ea85e
>> [2] http://paste.openstack.org/show/597071/
>>
>
> I know of two issues impacting the cells v1 job: one is fixed, and the
> other has a patch recently posted.
>
> The first is the one I posted about last night, a total blocker for the cells
> v1 job that was kicking Nova changes out of the gate; that one is fixed:
>
> https://review.openstack.org/#/c/427009/
>
> The other one, which isn't fixed yet (it was just identified today), has a
> patch up now:
>
> https://review.openstack.org/#/c/427394/
>
> --
>
> Thanks,
>
> Matt Riedemann
>
> __________________________________________________________________________
> OpenStack Development Mailing List (not for usage questions)
> Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
> http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev



-- 
Davanum Srinivas :: https://twitter.com/dims

__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
