John wrote:
Thanks for the feedback. I wasn't aware that assert isn't intended
for production code.

That's not quite true. There is nothing wrong with using asserts in production code. The important thing is to use them *properly*. Asserts are for checking your internal program logic, and should never be used for validating data.

The reason is that asserts can be turned off. If the user runs Python with the -O (optimize) switch, all assertions are silently dropped from the code when it is compiled. Obviously this is disastrous if you are using assert to check input arguments and user data, since those checks simply won't happen.
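
For example, here is a rough sketch of the difference (set_age is just a made-up name for the sake of the example, not anything from your code):

def set_age(age):
    # BAD: using assert to validate user data.  Run this with
    # "python -O" and the check is compiled away, so a negative
    # age slips straight through.
    assert age >= 0, "age must not be negative"
    return age

def set_age_checked(age):
    # GOOD: validate data with an ordinary test, which always runs.
    if age < 0:
        raise ValueError("age must not be negative")
    return age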

But assertions can still be useful in production code. When and why? For checking *your* reasoning about the program.

The sad truth is that programmers are merely human, and we make mistakes. Sometimes we reason that our code does one thing, when in fact we've forgotten a case or misunderstood it, and the code does something different from what we expected. Or we fear that some function we rely on has a bug.

As they say, beware of any situation which you are sure can't possibly happen, because it surely will. To program defensively, you might write something like this:


n = some_function(arg)
if n > 0:
    do_something(n)
else:
    # This can never happen! But just in case it does...
    raise RuntimeError("unexpected internal error")


What does this tell us? It tells us that some_function promises to always return a positive number (this is sometimes called a contract), but we don't *quite* believe it. If we trusted it completely, we would just write this:

n = some_function(arg)
do_something(n)


and that's perfectly fine. 99.999% of your code should look like that: you trust the function to do what you expect.

But perhaps you have a little nagging doubt: "some_function is really complicated, I'm pretty sure it works correctly, but maybe I made a subtle mistake". Hey, you're only human. So you leave the test in *just in case*.

Assertions are for those "just in case" tests.

If you're sure that the "else" clause will never happen, why do the test every time? Because you're cautious. But maybe you don't need to be cautious all the time. Perhaps you'd like an option to run your code faster by skipping all those tests that you're sure will never fail. So you might start with a global variable:

DEBUG = True


and write code like this:


n = some_function(arg)
if DEBUG:
    # Just in case...
    if n <= 0:  # note we swap the test around.
        # This can never happen! But just in case it does...
        raise RuntimeError("unexpected internal error")
do_something(n)


That gives you the best of both worlds: if DEBUG is True (the default), you have the extra safety of the debugging test; if it's False, you skip the test, and your code is a little bit faster.
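
To see it in action, here's the pattern as a tiny self-contained sketch (the body of some_function here is just a stand-in I made up; yours would be the real, complicated one):

DEBUG = True

def some_function(arg):
    # Stand-in for your complicated function; by its contract
    # it should always return a positive number.
    return len(arg) + 1

def process(arg):
    n = some_function(arg)
    if DEBUG:
        # Just in case...
        if n <= 0:
            raise RuntimeError("unexpected internal error")
    return n

print(process([1, 2, 3]))  # prints 4
# Flip DEBUG to False and the extra test is skipped entirely.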

That's exactly what Python does for you *automatically*, using the assert statement and its built-in __debug__ constant. Just write your code like this instead:

n = some_function(arg)
assert n > 0, "unexpected internal error"
do_something(n)

and the Python compiler handles everything for you: in the standard "debug mode" the built-in __debug__ is True and the test runs; with the -O optimize switch __debug__ is False and the assert is dropped entirely.
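
If you want to see that for yourself, try this quick experiment (saved as, say, demo.py; the file name is just for illustration):

# demo.py
print("__debug__ is", __debug__)
n = -1  # pretend some_function returned something impossible
assert n > 0, "unexpected internal error"
print("got past the assert")

Running it as "python demo.py" prints "__debug__ is True" and then dies with an AssertionError (note that assert raises AssertionError, not RuntimeError), while "python -O demo.py" prints "__debug__ is False" and sails right past, because the assert was never compiled in.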



--
Steven
_______________________________________________
Tutor maillist  -  [email protected]
To unsubscribe or change subscription options:
http://mail.python.org/mailman/listinfo/tutor

Reply via email to