Re: [openstack-dev] Are we ready to put stable/ocata into extended maintenance mode?

2018-09-21 Thread Előd Illés

Hi,

Here is an etherpad with the teams that have stable:follow-policy tag on 
their repos:


https://etherpad.openstack.org/p/ocata-final-release-before-em

At the links you can find reports on the open and unreleased changes, 
which could be useful input for the before-EM/final release.
Please have a look at the reports (and review the open patches, if 
there are any) so that a release can be made if necessary.


Thanks,

Előd


On 2018-09-21 00:53, Matt Riedemann wrote:

On 9/20/2018 12:08 PM, Előd Illés wrote:

Hi Matt,

About 1.: I think it is a good idea to cut a final release 
(especially as some vendors/operators would be glad even to see 
releases during Extended Maintenance, which most probably 
won't happen...) -- saying that without knowing how much of a burden 
this final release would be for the project teams...
After that, it sounds reasonable to tag the branches EM (as 
written in the mentioned resolution).


Do you have any plan for how to coordinate the 'final releases' and 
the EM tagging?


Thanks for raising these questions!

Cheers,

Előd


For anyone following along who cares about this (hopefully PTLs), 
Előd, Doug, Sean and I formulated a plan in IRC today [1].


[1] 
http://eavesdrop.openstack.org/irclogs/%23openstack-stable/%23openstack-stable.2018-09-20.log.html#t2018-09-20T17:10:56






__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] Are we ready to put stable/ocata into extended maintenance mode?

2018-09-20 Thread Előd Illés

Hi Matt,

About 1.: I think it is a good idea to cut a final release (especially 
as some vendors/operators would be glad even to see releases during 
Extended Maintenance, which most probably won't 
happen...) -- saying that without knowing how much of a burden this 
final release would be for the project teams...
After that, it sounds reasonable to tag the branches EM (as written 
in the mentioned resolution).


Do you have any plan for how to coordinate the 'final releases' and 
the EM tagging?


Thanks for raising these questions!

Cheers,

Előd


On 2018-09-18 21:27, Matt Riedemann wrote:
The release page says Ocata is planned to go into extended maintenance 
mode on Aug 27 [1]. There really isn't much to this except that it means 
we don't do releases for Ocata anymore [2]. There is a caveat that 
project teams that do not wish to maintain stable/ocata after this 
point can immediately end-of-life the branch for their project [3]. We 
can still run CI using tags, e.g. if keystone goes ocata-eol, devstack 
on stable/ocata can still install nova from stable/ocata and keystone 
from the ocata-eol tag. Having said that, if there is no undue burden 
on a project team keeping the lights on for stable/ocata, I would 
recommend not tagging the stable/ocata branch end of life at this 
point.
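The mixed branch/tag setup can be sketched as a devstack localrc fragment (a sketch only: the service names are illustrative, and it assumes devstack's standard *_BRANCH variables, which accept a tag as well as a branch name):

```shell
# Sketch of a devstack localrc for CI after a partial EOL:
# nova still tracks the live stable branch, while keystone is
# pinned to its (hypothetical) EOL tag.
NOVA_BRANCH=stable/ocata
KEYSTONE_BRANCH=ocata-eol
```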


So, questions that need answering are:

1. Should we cut a final release for projects with stable/ocata 
branches before going into extended maintenance mode? I tend to think 
"yes" to flush the queue of backports. In fact, [3] doesn't mention 
it, but the resolution said we'd tag the branch [4] to indicate it has 
entered the EM phase.


2. Are there any projects that would want to skip EM and go directly 
to EOL (yes this feels like a Monopoly question)?


[1] https://releases.openstack.org/
[2] 
https://docs.openstack.org/project-team-guide/stable-branches.html#maintenance-phases
[3] 
https://docs.openstack.org/project-team-guide/stable-branches.html#extended-maintenance
[4] 
https://governance.openstack.org/tc/resolutions/20180301-stable-branch-eol.html#end-of-life






__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] [TripleO] Regarding dropping Ocata related jobs from TripleO

2018-09-14 Thread Előd Illés

Hi,

just a comment: the Ocata release is not EOL [1][2]; rather, it is in 
Extended Maintenance. Do you really want to EOL TripleO stable/ocata?


[1] https://releases.openstack.org/
[2] 
https://governance.openstack.org/tc/resolutions/20180301-stable-branch-eol.html


Cheers,

Előd


On 2018-09-14 09:20, Juan Antonio Osorio Robles wrote:


On 09/14/2018 09:01 AM, Alex Schultz wrote:

On Fri, Sep 14, 2018 at 6:37 AM, Chandan kumar  wrote:

Hello,

The Ocata release already went EOL on 27-08-2018 [1].
In TripleO, we are running Ocata jobs in TripleO CI and in the promotion pipelines.
Can we drop all the Ocata-related jobs, or do we need to keep some jobs
to support upgrades in CI?


I think unless there are any objections around upgrades, we can drop
the promotion pipelines. It's likely that we'll also want to
officially EOL the tripleo ocata branches.

sounds good to me.

Thanks,
-Alex


Links:
[1.] https://releases.openstack.org/

Thanks,

Chandan Kumar

__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev

__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev




__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


[openstack-dev] [stable][networking-bgpvpn][infra] missing networking-odl repository

2018-06-06 Thread Előd Illés

Hi,

I'm trying to create a fix for the failing networking-bgpvpn stable 
periodic sphinx-docs job [1], but meanwhile it turned out that other 
"check" (and possibly "gate") jobs are failing on the networking-bgpvpn 
stable branches, too, because of a missing dependency: the 
networking-odl repository (for the pep8, py27, py35, cover and even 
sphinx jobs). I submitted a patch a couple of days ago for the stable 
periodic py27 job [2] and it solved the issue there. But now it seems 
that every other networking-bgpvpn job needs this fix when it runs 
against the stable branches (something like in this patch [3]).


Question: Is there a better way to fix these issues?


The common error message of the failing jobs:

**
ERROR! /home/zuul/src/git.openstack.org/openstack/networking-odl not found
In Zuul v3 all repositories used need to be declared
in the 'required-projects' parameter on the job.
To fix this issue, add:

  openstack/networking-odl

to 'required-projects'.

While you're at it, it's worth noting that zuul-cloner itself
is deprecated and this shim is only present for transition
purposes. Start thinking about how to rework job content to
just use the git repos that zuul will place into
/home/zuul/src/git.openstack.org directly.
**


[1] https://review.openstack.org/#/c/572368/
[2] https://review.openstack.org/#/c/569111/
[3] https://review.openstack.org/#/c/572495/


Thanks,

Előd


__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] [stable] [tooz] [ceilometer] cmd2 without upper constraints causes errors in tox-py27

2018-05-30 Thread Előd Illés

cmd2 says that:

"Python 2.7 support is EOL

Support for adding new features to the Python 2.7 release of cmd2 was 
discontinued on April 15, 2018. Bug fixes will be supported for Python 
2.7 via 0.8.x until August 31, 2018.


Supporting Python 2 was an increasing burden on our limited resources. 
Switching to support only Python 3 will allow us to clean up the 
codebase, remove some cruft, and focus on developing new features."


See: https://github.com/python-cmd2/cmd2

Előd

On 2018-05-30 14:42, Julien Danjou wrote:

On Wed, May 30 2018, Előd Illés wrote:


In the last two days the ceilometer [1] [2] [3] and tooz [4] [5] [6] tox-py27
periodic stable jobs have been failing. The root cause is the following:
* cmd2 released version 0.9.0, which requires python >=3.4 from now on.
These projects have a comment in their tox.ini that they do not consume
upper-constraints.txt (in which there is an upper constraint for cmd2).

My question is: could we use upper-constraints.txt on these projects as well,
or is there a reason why that isn't the case?
Of course an entry could be added to test-requirements.txt with "cmd2<0.9.0",
but wouldn't it be better to use upper-constraints.txt?

The question is: why does cmd2 0.9.0 not work, and how do we fix that?



__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


[openstack-dev] [stable] [tooz] [ceilometer] cmd2 without upper constraints causes errors in tox-py27

2018-05-30 Thread Előd Illés

Hi,

In the last two days the ceilometer [1] [2] [3] and tooz [4] [5] [6] 
tox-py27 periodic stable jobs have been failing. The root cause is the following:

* cmd2 released version 0.9.0, which requires python >=3.4 from now on.
These projects have a comment in their tox.ini that they do not consume 
upper-constraints.txt (in which there is an upper constraint for cmd2).


My question is: could we use upper-constraints.txt on these projects as 
well, or is there a reason why that isn't the case?
Of course an entry could be added to test-requirements.txt with 
"cmd2<0.9.0", but wouldn't it be better to use upper-constraints.txt?


Thanks in advance,

Előd

[1] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/ceilometer/stable/queens/openstack-tox-py27/b44c7cd/
[2] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/ceilometer/stable/pike/openstack-tox-py27/6c4fd5d/
[3] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/ceilometer/stable/ocata/openstack-tox-py27/4d2d0b3/


[4] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/tooz/stable/queens/openstack-tox-py27/37bd360/
[5] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/tooz/stable/pike/openstack-tox-py27/8bb8c29/
[6] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/tooz/stable/ocata/openstack-tox-py27/1016d56/
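For reference, consuming upper-constraints from tox generally means an install_command along these lines (a sketch of the common pattern; the exact env var and constraints URL each project uses may differ):

```ini
# Sketch: have pip honour upper-constraints.txt so a new cmd2
# release cannot break py27 jobs; the alternative is pinning
# cmd2<0.9.0 directly in test-requirements.txt.
[testenv]
install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt} {opts} {packages}
```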



__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] [Openstack-stable-maint] Stable check of openstack/networking-midonet failed

2018-04-10 Thread Előd Illés

Hi,

Thanks, too. I've prepared the remaining backport [1] for stable/ocata 
to solve the issue there as well [2].


[1] https://review.openstack.org/#/c/559940/
[2] 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/networking-midonet/stable/ocata/openstack-tox-py27/928af21/job-output.txt.gz#_2018-04-10_06_29_43_146966


Thanks,

Előd


On 2018-04-09 06:16, Tony Breeds wrote:

On Tue, Apr 03, 2018 at 02:05:35PM +0200, Előd Illés wrote:

Hi,

These patches probably solve the issue, if someone could review them:

https://review.openstack.org/#/c/557005/

and

https://review.openstack.org/#/c/557006/

Thanks,

Thanks for digging into that.  I've approved these even though they
don't have a +2 from the neutron stable team.  They look safe as they
only impact tests, unblock the gate and also have +1s from subject
matter experts.

Yours Tony.




__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] [Openstack-stable-maint] Stable check of openstack/networking-midonet failed

2018-04-03 Thread Előd Illés

Hi,

These patches probably solve the issue, if someone could review them:

https://review.openstack.org/#/c/557005/

and

https://review.openstack.org/#/c/557006/

Thanks,

Előd


On 2018-04-01 05:55, Tony Breeds wrote:

On Sat, Mar 31, 2018 at 06:17:41AM +, A mailing list for the OpenStack 
Stable Branch test reports. wrote:

Build failed.

- build-openstack-sphinx-docs 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/networking-midonet/stable/pike/build-openstack-sphinx-docs/b20c665/html/
 : SUCCESS in 5m 48s
- openstack-tox-py27 
http://logs.openstack.org/periodic-stable/git.openstack.org/openstack/networking-midonet/stable/pike/openstack-tox-py27/75db3fe/
 : FAILURE in 11m 49s
  


I'm not sure what's going on here but as with stable/ocata the
networking-midonet periodic-stable jobs have been failing like this for
close to a week.

Can someone from that team take a look?

Yours Tony.


__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev