Serhiy Storchaka <storch...@gmail.com> added the comment:

> That would defeat the purpose of the test. We want to test whether
> __sizeof__ is correct, so we shouldn't use __sizeof__ in the test to
> compute the expected result. I understand that object.__sizeof__ is
> actually a different implementation, but still: there might be errors e.g.
> in the type definition that cancel out errors in the sizeof
> implementation. The more "directly" the expected result is computed, the
> better.
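For context, the "direct" computation argued for above means re-deriving the expected size by summing the C struct's fields by hand, typically via a struct format string. A hedged sketch of that style (the field layout here is purely illustrative, not an actual CPython struct):

```python
import struct

# Hand-computed expected size: a format string mirroring a C struct
# field by field ('n' = Py_ssize_t, 'P' = a pointer).
objheader = "nP"    # e.g. the ob_refcnt + ob_type header of a PyObject
fields = "n2P"      # illustrative extra fields of some hypothetical object
expected = struct.calcsize(objheader + fields)

# Any change to the C struct's layout silently invalidates the format
# string, so the test must be updated by hand whenever the struct changes.
print(expected)
```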
I do not think the purpose of this test is to test object.__sizeof__. Memory consumption consists of two parts -- memory for the C structure (for which the base object implementation works) and extra memory, for which we write a specialized __sizeof__ method. If we doubt object.__sizeof__, then we are obliged to implement and test __sizeof__ methods for all C-implemented classes, without relying on the base object implementation.

> I also realize that such tests will be fragile if the structures
> change. This is a good thing, IMO: anybody changing the layout of some
> object should *have* to verify that the size computation is still correct,
> so it's good that the test breaks if the structures change.

Such tests are too fragile. They force the programmer to write unnecessary code in cases where it can be done automatically. In C code we write sizeof(SomeStruct), not the sum of the sizes (plus padding) of the structure's fields. Let's focus on the differences -- on the extra memory usage that does not allow us to simply use the inherited base object implementation.

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue15402>
_______________________________________
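The pattern the reply advocates -- report the base layout via the inherited implementation and test only the extra memory -- can be sketched in Python. The Buffered class below is hypothetical, for illustration only:

```python
import sys

class Buffered:
    """Hypothetical class that owns extra memory beyond its instance layout."""
    def __init__(self, n):
        self._buf = bytearray(n)

    def __sizeof__(self):
        # Base layout from the inherited implementation, plus only the
        # extra memory this object manages itself.
        return object.__sizeof__(self) + self._buf.__sizeof__()

b = Buffered(64)
# A test in this style checks only the *difference* (the extra memory)
# instead of re-deriving the whole struct layout by hand:
assert b.__sizeof__() - object.__sizeof__(b) == b._buf.__sizeof__()
```

If the base layout changes, object.__sizeof__ picks it up automatically, just as sizeof(SomeStruct) does in C; only the extra-memory term needs a hand-written check.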