On Wed, Jul 16, 2008 at 2:03 PM, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> From: "Michael Foord" <[EMAIL PROTECTED]>
>> assertIn / assertNotIn I use very regularly for collection membership
>
> - self.assert_(func(x) in result_set)
> + self.assertIn(func(x), result_set)
>
> Yawn. The gain is zero. Actually, it's negative because the second
> doesn't read as nicely as the pure python expression.
I disagree. The reason we have assertEquals(x, y) is that the error message can show the values of x and y, whereas assert x == y can't show those. Showing the values can be tremendously useful for debugging the failure. (Doing an intelligent comparison, e.g. a string or list "diff" instead of showing the two values, can be even more useful, and I'd be in favor of that rather than adding new methods like assertListsEqual.)

(Titus asks if the assert statement could be adjusted to do better reporting. But that's not going to happen -- it would require a tremendous amount of compiler support that would have to be implemented in every Python implementation (last I counted there were at least five). In addition, python -O removes all asserts from your code -- that's why we use assertXxx functions in the first place.)

> Think bigger! No fat APIs. Do something cool! Check out the
> dynamic test creation in test_decimal to see if it can be generalized.
> Give me some cool test runners. Maybe find a way to automatically
> launch pdb or to dump the local variables at the time of failure.
> Maybe move the "test_*.py" search into the unittest module.

Those ideas are cool too.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
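[Editor's note: the error-message difference Guido describes can be seen by calling the two assert methods directly. This is a minimal sketch, not code from the thread; the _Sketch class and its runTest stub are just scaffolding so a TestCase can be instantiated outside a test runner.]

```python
import unittest

# Throwaway TestCase so the assert* methods can be called directly;
# the runTest stub exists only so the class can be instantiated.
class _Sketch(unittest.TestCase):
    def runTest(self):
        pass

tc = _Sketch()

# A bare boolean assertion: the failure message carries no values,
# so you can't see what was searched for or where.
try:
    tc.assertTrue(4 in {1, 2, 3})
except AssertionError as e:
    print(e)  # False is not true

# assertIn: the failure message shows both operands.
try:
    tc.assertIn(4, {1, 2, 3})
except AssertionError as e:
    print(e)  # 4 not found in {1, 2, 3}
```

The second message tells you immediately which member was missing and what the container held, which is the whole point of having the dedicated method.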