On Fri, Feb 23, 2018 at 11:31 AM, Steven D'Aprano
<steve+comp.lang.pyt...@pearwood.info> wrote:
> Big O analysis is never a substitute for actual timing measurements, and
> the assumption behind Big O analysis, namely that only some operations
> take time, and always constant time, is never correct. It is only an
> approximation to the truth, and as you can see, something which is
> algorithmically Big O logarithmic can turn out to scale worse than
> linearly once you actually measure the times.

I prefer to describe that as "hidden costs": any step that quietly
does non-constant work of its own becomes a hidden cost. That doesn't
mean Big O itself is limited; it means that when you analyze a
function, you have to think about more than just what's right in
front of you. For example:

def count_at_max(items):
    # Version 1: max(items) is recomputed on every iteration, so each
    # of the n loop passes does O(n) work of its own.
    count = 0
    for item in items:
        if item == max(items):
            count += 1
    return count

def count_at_max(items):
    # Version 2: max(items) is hoisted out of the loop and computed
    # exactly once.
    count = 0
    peak = max(items)
    for item in items:
        if item == peak:
            count += 1
    return count

So similar, but one of them runs in O(n²) and the other in O(n). The
max() call inside the loop is the "hidden cost". (Obviously real ones
will usually be subtler than that; this is a sledgehammer for
demonstrative purposes.) Big O *can* handle this; you just have to
take notice of everything each step does, not only the loop itself.
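If you want to watch that happen on the clock rather than on paper,
here's a minimal timing sketch. I've renamed the two versions so they
can coexist in one module; the input sizes and repeat count are
arbitrary choices for illustration:

import timeit

def count_at_max_slow(items):
    count = 0
    for item in items:
        if item == max(items):  # max() rescans items on every iteration
            count += 1
    return count

def count_at_max_fast(items):
    count = 0
    peak = max(items)  # computed once, outside the loop
    for item in items:
        if item == peak:
            count += 1
    return count

for n in (1000, 2000, 4000):
    data = list(range(n))
    slow = timeit.timeit(lambda: count_at_max_slow(data), number=5)
    fast = timeit.timeit(lambda: count_at_max_fast(data), number=5)
    print(f"n={n}: slow={slow:.3f}s fast={fast:.3f}s")

Each doubling of n should roughly quadruple the slow version's time
but only double the fast one's, which is the O(n²)-vs-O(n) gap made
visible.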

ChrisA