Thanks for the replies from you and Jeremy. We will try the local mirror. Inline replies to your questions are below.
On Mon, Nov 7, 2016 at 4:30 PM, Clark Boylan <cboy...@sapwetik.org> wrote:
> On Mon, Nov 7, 2016, at 01:13 PM, Hongbin Lu wrote:
> > Hi infra team,
> >
> > I am working on the Zun project and we experienced random failures in
> > the gate. The error is as below:
> >
> > 2016-10-16 00:30:49.359 | ++
> > /opt/stack/new/zun/devstack/lib/zun:install_etcd_server:316 : curl -L
> > https://github.com/coreos/etcd/releases/download/v3.0.7/etcd-v3.0.7-linux-amd64.tar.gz
> > -o /opt/stack/new/zun/etcd/etcd-v3.0.7-linux-amd64.tar.gz
> > ....
> > curl: (7) Failed to connect to github.com port 443: Connection timed
> > out
> >
> > By searching logstash with the query (message:"Failed to connect to
> > github.com port 443: Connection timed out"), it looks like all the
> > failures happened on "ubuntu-*-osic-cloud1-*" nodes. Is that related
> > to anything specific to the osic cloud?
>
> OSIC is one of our ipv6 "only" clouds, so connections going out to
> ipv4-only hosts (like github.com) have to go through a shared NAT
> setup. It is possible that this is related to the problem.
>
> We have also seen issues where conflicts with routing tables break
> instances' ability to route packets to the router which performs NAT.
> Double check that you aren't overriding the routes for 10.0.0.0/8
> there.

I don't think we have any logic that overrides it.

> And generally the Internet is an unreliable place. We don't consume
> our git repos from github and instead mirror them ourselves; we also
> mirror ubuntu, debian, and centos package repos, pypi packages, and so
> on for this reason.
>
> My suggestion would be to use the etcd package out of the Xenial
> mirror, which should be far more reliable on any of our test hosts
> (mirrors are cloud-region-local caches of distributed filesystems).
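Until we switch to the mirror, a smaller band-aid for the flaky download would be to let curl retry transient failures instead of giving up on the first connection timeout. A rough sketch (the helper name and the retry/timeout values are my own illustration, not anything in devstack):

```shell
#!/bin/sh
# Hypothetical helper: fetch a URL with retries so a single transient
# "Connection timed out" does not fail the whole devstack run.
# Function name and retry/timeout numbers are illustrative assumptions.
fetch_with_retries() {
    url=$1
    out=$2
    # -f: fail on HTTP errors, -L: follow redirects,
    # --retry: retry curl's "transient" errors (timeouts included),
    # --connect-timeout: don't hang for minutes on a dead NAT path.
    curl -fL --connect-timeout 30 --retry 5 --retry-delay 10 \
        "$url" -o "$out"
}
```

That only papers over the symptom, though; installing etcd from the region-local package mirror avoids reaching github.com at all.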
> As a final note, it is always tremendously helpful to provide at
> least one example link to the job logs that experience problems when
> reporting issues to the infra team. There is a lot of data collected
> which is useful for debugging.

Here is a sample link:
http://logs.openstack.org/46/380646/24/check/gate-zun-devstack-dsvm/d7c7bab/logs/devstacklog.txt.gz#_2016-10-16_00_30_49_359

> Clark
__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev