A situation that often comes up is having to initialize several instance attributes that accept a default value. For a single class, passing the default values in __init__ is fine:
class Base(object):
    def __init__(self, x=0, y=None):
        self.x = x
        self.y = y

For inherited classes that need to override __init__ while keeping a compatible interface though, the default values have to be repeated:

class Derived(Base):
    def __init__(self, x=0, y=None, z=''):
        super(Derived, self).__init__(x, y)
        self.z = z

For just two attributes and two classes that's maybe not too bad, but for many attributes and/or derived classes that may span multiple modules, that doesn't seem to scale from a maintenance point of view, especially if the defaults change over time.

A pattern I've been using lately instead is to store the defaults in class attributes and let __init__ accept keyword arguments:

class Base(object):
    x = 0
    y = None

    def __init__(self, **kwds):
        setattrs(self, kwds)

where setattrs is:

def setattrs(self, attrvals, strict=True):
    if strict:
        # raise AttributeError if some attr doesn't exist already
        for attr in attrvals.iterkeys():
            getattr(self, attr)
    for attr, val in attrvals.iteritems():
        setattr(self, attr, val)

This way, only the new and overridden default attributes have to be repeated in derived classes:

class Derived(Base):
    x = 1
    z = ''

    def __init__(self, **kwds):
        super(Derived, self).__init__(**kwds)
        print 'In Derived.__init__'

Is this a good way of doing it? Is there a better pattern?

George
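
P.S. For concreteness, here's a minimal usage sketch of the pattern, assuming Base, Derived and setattrs are defined as above (the specific values are made up for illustration):

# Only the overridden attributes become instance attributes;
# everything else falls back to the class-attribute defaults.
d = Derived(x=5, z='spam')
print d.x, d.y, d.z        # -> 5 None spam  (y keeps Base's default)

b = Base()
print b.x, b.y             # -> 0 None

# With strict=True (the default), a mistyped or unknown attribute fails early:
try:
    Derived(w=42)
except AttributeError:
    print 'no such attribute w'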