Re: [openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3

2017-12-18 Thread Zane Bitter

On 12/12/17 08:40, Sean Dague wrote:

On 12/12/2017 08:08 AM, Thierry Carrez wrote:

Zane Bitter wrote:



Here's how the allegedly most complex function in Heat[5] shakes out,
for example:

   mccabe  py27  py36
   0.2.1     14     6
   0.3.1     23    23
   0.6.1     23    23

I don't have a solution to suggest, but I burned most of my day digging
into this so I just wanted to let y'all know how screwed we are.

Thanks for digging into this, Zane. I'd say we have two options, based
on how useful running those tests is:

1/ If they are useful, we should bump the version, at the same time
adjusting max-complexity per repo to match the most complex function
(with some slack factor). Encourage more projects to use cyclomatic
complexity checks and fix the biggest offenders using cycle goals.

2/ If they are not useful, just drop cyclomatic complexity tests overall
rather than relying on wrong results and wasting CPU cycles

So I'd like to hear from the 60+ repos on whether using those tests leads
to any good results :)


I don't know how useful these end up being (#1), though I've seen
patches from time to time with people trying to reduce them.


One problem with the theory is that the max complexity is pretty much 
always going to be dominated by a couple of tent-pole values, with 
everything else free to accumulate complexity under the radar. (I 
apologise for the preceding mixed metaphor.) If you could specify 
per-file exceptions then you could at least use it in the way that's 
intended (whether that's useful or not is still up for debate - 
personally I'm in Thomas's "not" camp), but afaik you can't.
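
For reference, the knob in question is a single repo-wide setting in the
[flake8] section of each project's tox.ini; a purely illustrative sketch
(the number 20 is made up, not any particular repo's value) looks like:

   [flake8]
   # mccabe's C901 check fires for any function whose score exceeds this
   # one repo-wide ceiling; there is no per-file or per-function override,
   # so in practice it sits just above the worst existing function.
   max-complexity = 20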



The current max in Nova is 35. Moving to mccabe 0.6.1 generates 2 errors:

Running flake8 on all files
./nova/compute/manager.py:763:5: C901 'ComputeManager._init_instance' is
too complex (37)
./nova/tests/functional/api_samples_test_base.py:144:5: C901
'ApiSampleTestBase._compare_result' is too complex (36)


There are some good fixes in 0.6.1 that are pushing the scores up, like 
now counting complexity that occurs inside of the 'else' part of 
try/except/else, but I suspect that the vast majority of the increase is 
due to one giant, honking regression:


https://github.com/PyCQA/mccabe/issues/27#issuecomment-352535619

so TBH I wouldn't recommend updating to any current release. Only if 
somebody is motivated to actually fix that bug might it be worth 
considering.


cheers,
Zane.


I honestly think this is one of those things where you declare a flag day a
couple of weeks out, warn everyone that they are going to have to adjust their
max, and move forward. It's a very easy fix once pep8 starts failing for people.

-Sean






Re: [openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3

2017-12-12 Thread Thomas Herve
On Tue, Dec 12, 2017 at 2:08 PM, Thierry Carrez wrote:
> Zane Bitter wrote:
>> We have around 60 repos using the 'mccabe' package to do cyclomatic
>> complexity checks in flake8 in their gate jobs.[1] Based on advice
>> received at the time[2], the version of mccabe that we use in the gate
>> is effectively pinned to 0.2.1 by hacking (the project, not... never
>> mind).[3]
>>
>> This is starting to cause problems, because version 0.2.1 does not work
>> with Python 3. By 'does not work' I don't mean that it refuses to run, I
>> mean that it returns completely different numbers. (As far as I can tell
>> they're always the same or lower.) tox creates the pep8 environment
>> using the version of python that tox itself is running in, so as distros
>> increasingly prefer to install packages using python3, people will
>> increasingly be running flake8 under python3, and they'll run into
>> situations like the one I had today where their pep8 tests pass locally
>> but fail in the gate (which still runs under python2). When the gate
>> switches to python3, as I assume it eventually must, we'll get bogus
>> results.
>>
>> Unfortunately, moving to a later version is complicated by the fact that
>> 0.2.1 was... not good software. There is exactly one regression test.[4]
>> And if you're about to ask whether it was to check the trivial case that
>> the function:
>>
>>   def foo():
>>       return None
>>
>> has a cyclomatic complexity of 1, then the joke is on you because it
>> actually returns 2 for that function.
>>
>> Later versions have more tests and appear to be consistent with each
>> other and across python versions, but they return completely different
>> results to 0.2.1. And the results are mostly higher, so if we bump the
>> requirements many of those 60 projects will break.
>>
>> (I could also go into details on how the numbers that later versions
>> report are also wrong in different ways, but I'll save it for another day.)
>>
>> Here's how the allegedly most complex function in Heat[5] shakes out,
>> for example:
>>
>>   mccabe  py27  py36
>>   0.2.1     14     6
>>   0.3.1     23    23
>>   0.6.1     23    23
>>
>> I don't have a solution to suggest, but I burned most of my day digging
>> into this so I just wanted to let y'all know how screwed we are.
> Thanks for digging into this, Zane. I'd say we have two options, based
> on how useful running those tests is:
>
> 1/ If they are useful, we should bump the version, at the same time
> adjusting max-complexity per repo to match the most complex function
> (with some slack factor). Encourage more projects to use cyclomatic
> complexity checks and fix the biggest offenders using cycle goals.
>
> 2/ If they are not useful, just drop cyclomatic complexity tests overall
> rather than relying on wrong results and wasting CPU cycles

Hugely in favor of doing that. Even if it were working properly, I
don't think it was ever a useful test for Heat. We end up either
bumping the max or moving a conditional in a function, which doesn't
solve anything.
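
As a contrived illustration (invented names, not Heat code), this is the
kind of shuffle that gets a function back under the limit without removing
any branching from the code base:

   def classify(n):
       # mccabe counts every if/elif here against classify() itself
       if n < 0:
           return 'negative'
       elif n == 0:
           return 'zero'
       else:
           return 'positive'

   def classify_refactored(n):
       # Same behaviour, but the branches now live in the helper below,
       # so this function scores as trivially simple.
       return _sign_label(n)

   def _sign_label(n):
       if n < 0:
           return 'negative'
       elif n == 0:
           return 'zero'
       else:
           return 'positive'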

-- 
Thomas



Re: [openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3

2017-12-12 Thread Sean Dague
On 12/12/2017 08:08 AM, Thierry Carrez wrote:
> Zane Bitter wrote:

>> Here's how the allegedly most complex function in Heat[5] shakes out,
>> for example:
>>
>>   mccabe  py27  py36
>>   0.2.1     14     6
>>   0.3.1     23    23
>>   0.6.1     23    23
>>
>> I don't have a solution to suggest, but I burned most of my day digging
>> into this so I just wanted to let y'all know how screwed we are.
> Thanks for digging into this, Zane. I'd say we have two options, based
> on how useful running those tests is:
> 
> 1/ If they are useful, we should bump the version, at the same time
> adjusting max-complexity per repo to match the most complex function
> (with some slack factor). Encourage more projects to use cyclomatic
> complexity checks and fix the biggest offenders using cycle goals.
> 
> 2/ If they are not useful, just drop cyclomatic complexity tests overall
> rather than relying on wrong results and wasting CPU cycles
> 
> So I'd like to hear from the 60+ repos on whether using those tests leads
> to any good results :)

I don't know how useful these end up being (#1), though I've seen
patches from time to time with people trying to reduce them.

The current max in Nova is 35. Moving to mccabe 0.6.1 generates 2 errors:

Running flake8 on all files
./nova/compute/manager.py:763:5: C901 'ComputeManager._init_instance' is
too complex (37)
./nova/tests/functional/api_samples_test_base.py:144:5: C901
'ApiSampleTestBase._compare_result' is too complex (36)

I honestly think this is one of those things where you declare a flag day a
couple of weeks out, warn everyone that they are going to have to adjust their
max, and move forward. It's a very easy fix once pep8 starts failing for people.

-Sean

-- 
Sean Dague
http://dague.net



Re: [openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3

2017-12-12 Thread Thierry Carrez
Zane Bitter wrote:
> We have around 60 repos using the 'mccabe' package to do cyclomatic
> complexity checks in flake8 in their gate jobs.[1] Based on advice
> received at the time[2], the version of mccabe that we use in the gate
> is effectively pinned to 0.2.1 by hacking (the project, not... never
> mind).[3]
> 
> This is starting to cause problems, because version 0.2.1 does not work
> with Python 3. By 'does not work' I don't mean that it refuses to run, I
> mean that it returns completely different numbers. (As far as I can tell
> they're always the same or lower.) tox creates the pep8 environment
> using the version of python that tox itself is running in, so as distros
> increasingly prefer to install packages using python3, people will
> increasingly be running flake8 under python3, and they'll run into
> situations like the one I had today where their pep8 tests pass locally
> but fail in the gate (which still runs under python2). When the gate
> switches to python3, as I assume it eventually must, we'll get bogus
> results.
> 
> Unfortunately, moving to a later version is complicated by the fact that
> 0.2.1 was... not good software. There is exactly one regression test.[4]
> And if you're about to ask whether it was to check the trivial case that
> the function:
> 
>   def foo():
>       return None
> 
> has a cyclomatic complexity of 1, then the joke is on you because it
> actually returns 2 for that function.
> 
> Later versions have more tests and appear to be consistent with each
> other and across python versions, but they return completely different
> results to 0.2.1. And the results are mostly higher, so if we bump the
> requirements many of those 60 projects will break.
> 
> (I could also go into details on how the numbers that later versions
> report are also wrong in different ways, but I'll save it for another day.)
> 
> Here's how the allegedly most complex function in Heat[5] shakes out,
> for example:
> 
>   mccabe  py27  py36
>   0.2.1     14     6
>   0.3.1     23    23
>   0.6.1     23    23
> 
> I don't have a solution to suggest, but I burned most of my day digging
> into this so I just wanted to let y'all know how screwed we are.
Thanks for digging into this, Zane. I'd say we have two options, based
on how useful running those tests is:

1/ If they are useful, we should bump the version, at the same time
adjusting max-complexity per repo to match the most complex function
(with some slack factor). Encourage more projects to use cyclomatic
complexity checks and fix the biggest offenders using cycle goals.

2/ If they are not useful, just drop cyclomatic complexity tests overall
rather than relying on wrong results and wasting CPU cycles

So I'd like to hear from the 60+ repos on whether using those tests leads
to any good results :)

-- 
Thierry Carrez (ttx)



[openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3

2017-12-11 Thread Zane Bitter
We have around 60 repos using the 'mccabe' package to do cyclomatic 
complexity checks in flake8 in their gate jobs.[1] Based on advice 
received at the time[2], the version of mccabe that we use in the gate 
is effectively pinned to 0.2.1 by hacking (the project, not... never 
mind).[3]


This is starting to cause problems, because version 0.2.1 does not work 
with Python 3. By 'does not work' I don't mean that it refuses to run, I 
mean that it returns completely different numbers. (As far as I can tell 
they're always the same or lower.) tox creates the pep8 environment 
using the version of python that tox itself is running in, so as distros 
increasingly prefer to install packages using python3, people will 
increasingly be running flake8 under python3, and they'll run into 
situations like the one I had today where their pep8 tests pass locally 
but fail in the gate (which still runs under python2). When the gate 
switches to python3, as I assume it eventually must, we'll get bogus 
results.
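
One local workaround, sketched below rather than taken from any particular
repo, is to pin the interpreter that tox uses for the pep8 env so local
runs match the python2-based gate:

   [testenv:pep8]
   # Pin this env's interpreter so local flake8 runs use the same Python
   # as the gate, regardless of which Python tox itself runs under.
   basepython = python2.7
   commands = flake8 {posargs}

That only papers over the mismatch, of course; it doesn't change what
mccabe reports.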


Unfortunately, moving to a later version is complicated by the fact that 
0.2.1 was... not good software. There is exactly one regression test.[4] 
And if you're about to ask whether it was to check the trivial case that 
the function:


  def foo():
      return None

has a cyclomatic complexity of 1, then the joke is on you because it 
actually returns 2 for that function.
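
If you want to reproduce these numbers yourself, a rough sketch using
mccabe's PathGraphingAstVisitor/complexity() interface (attribute names
may vary slightly between releases) is:

   import ast
   import textwrap

   import mccabe

   SNIPPET = textwrap.dedent("""
       def foo():
           return None
   """)

   tree = ast.parse(SNIPPET)
   visitor = mccabe.PathGraphingAstVisitor()
   visitor.preorder(tree, visitor)
   for graph in visitor.graphs.values():
       # per the above, 0.2.1 reports 2 for foo() rather than the expected 1
       print(graph.name, graph.complexity())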


Later versions have more tests and appear to be consistent with each 
other and across python versions, but they return completely different 
results to 0.2.1. And the results are mostly higher, so if we bump the 
requirements many of those 60 projects will break.


(I could also go into details on how the numbers that later versions 
report are also wrong in different ways, but I'll save it for another day.)


Here's how the allegedly most complex function in Heat[5] shakes out, 
for example:


  mccabe  py27  py36
  0.2.1     14     6
  0.3.1     23    23
  0.6.1     23    23

I don't have a solution to suggest, but I burned most of my day digging 
into this so I just wanted to let y'all know how screwed we are.


cheers,
Zane.

[1] 
https://github.com/search?q=org%3Aopenstack+max-complexity+extension%3A.ini+path%3A%2F&type=Code
[2] 
http://lists.openstack.org/pipermail/openstack-dev/2014-October/048701.html
[3] 
http://git.openstack.org/cgit/openstack-dev/hacking/tree/requirements.txt#n9

[4] https://github.com/PyCQA/mccabe/blob/0.2.1/test_mccabe.py
[5] 
http://git.openstack.org/cgit/openstack/heat/tree/heat/engine/scheduler.py#n299

