Although I am cautiously and tentatively in favour of setting limits 
if the benefits Mark suggests are correct, I have thought of at least 
one case where a million classes may not be enough.

I've seen people write code like this:

    from collections import namedtuple

    values = []
    for attributes in list_of_attributes:
        obj = namedtuple("Spam", "fe fi fo fum")(*attributes)
        values.append(obj)


not realising that every obj is a singleton instance of a unique class. 
They might end up with a million dynamically created classes, each with 
a single instance, when what they wanted was a single class with a 
million instances.
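
Presumably what they actually wanted was something closer to this 
(just a sketch, re-using the placeholder names from the example 
above):

    from collections import namedtuple

    # Define the class once, outside the loop...
    Spam = namedtuple("Spam", "fe fi fo fum")

    # ...then create many instances of that single class.
    values = [Spam(*attributes) for attributes in list_of_attributes]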

Could there be people doing this deliberately? If so, it must be nice 
to have so much RAM that we can afford to waste it so prodigiously: a 
namedtuple with ten items uses 64 bytes, but the associated class uses 
444 bytes, plus the sizes of the methods etc. But I suppose there could 
be a justification for such a design.

(Quoted sizes on my system running 3.5; YMMV.)
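
Anyone who wants to check on their own interpreter can do something 
like this (a rough sketch; the Ten class and its field names are made 
up for the purpose, and the exact numbers will differ by version and 
build):

    import sys
    from collections import namedtuple

    # A namedtuple class with ten fields, and one instance of it.
    Ten = namedtuple("Ten", ["field%d" % i for i in range(10)])
    obj = Ten(*range(10))

    print(sys.getsizeof(obj))  # size of the single instance
    print(sys.getsizeof(Ten))  # size of the class object itself,
                               # not counting its methods and dict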



-- 
Steven