Dexter Hill wrote:
> The idea is to have a `default_factory`-like argument (either in the `field` 
> function, or a new function entirely) that takes a function as an argument; 
> that function is called with the value provided to `__init__`, and its 
> return value is used as the value for the respective field. For example:
> ```py
> @dataclass
> class Foo:
>     x: str = field(init_fn=chr)
> f = Foo(65)
> f.x # "A"
> ```
> The `chr` function is called with the value `65`, and `x` is set to its 
> return value of `"A"`. I understand that both `__init__` and 
> `__post_init__` can be used for this purpose, but sometimes it isn't 
> ideal to override them. If you overrode `__init__` and were using 
> `__post_init__`, you would need to call it manually, and in my case 
> `__post_init__` is implemented on a base class that all other classes 
> inherit from, so overriding it would require re-implementing its logic 
> (and that's ignoring the fact that you also need to annotate the field with 
> `InitVar` to even have it passed to `__post_init__` in the first place).
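> For comparison, here is a rough sketch of what that existing workaround 
> looks like today with `InitVar` and `__post_init__` (the names are just 
> illustrative):
> ```py
> from dataclasses import InitVar, dataclass, field
>
> @dataclass
> class Foo:
>     x_raw: InitVar[int]          # accepted by __init__ but not stored
>     x: str = field(init=False)   # filled in by __post_init__ below
>
>     def __post_init__(self, x_raw):
>         # The conversion has to live here rather than on the field itself
>         self.x = chr(x_raw)
>
> f = Foo(65)
> f.x # "A"
> ```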
> I've created a proof of concept, shown below:
> ```py
> from dataclasses import field
>
> def initfn(fn, default=None):
>     class Inner:
>         def __set_name__(_, owner_cls, owner_name):
>             old_setattr = getattr(owner_cls, "__setattr__")
>
>             def __setattr__(self, attr_name, value):
>                 if attr_name == owner_name:
>                     # Bypass `__setattr__` and store the converted value
>                     self.__dict__[attr_name] = fac(value)
>                 else:
>                     old_setattr(self, attr_name, value)
>
>             setattr(owner_cls, "__setattr__", __setattr__)
>
>     def fac(value):
>         # No argument was passed: the sentinel default (an `Inner`
>         # instance) is being assigned, so use the provided default
>         if isinstance(value, Inner):
>             return default
>         return fn(value)
>
>     return field(default=Inner())
> ```
> It makes use of the fact that when `default` is passed to `field`, the 
> dataclass machinery checks the default value for a `__set_name__` method 
> and calls it with the class and field name as arguments. Overriding 
> `__setattr__` is just used to catch when a value is being assigned to a 
> field; if that field's name matches the name given to `__set_name__`, the 
> function is called on the value and the field is set to the result instead.
> It can be used like so:
> ```py
> @dataclass
> class Foo:
>     x: str = initfn(fn=chr, default="Z")
> f = Foo(65)
> f2 = Foo()
> f.x # "A"
> f2.x # "Z"
> ```
> This adds a little overhead, especially from having to override 
> `__setattr__`; however, I believe the overhead would be negligible if the 
> feature were implemented directly in the `dataclasses` module.
> Even in cases where overriding one of the init methods is possible, I still 
> think this would be a nice quality-of-life feature: calling a single 
> function on a value feels like too small a task to justify overriding those 
> methods, if that makes sense.
> Thanks.
> Dexter

What if, instead, the `init` parameter of `field` could accept either a 
boolean (as it does now) or a type? When given a type, that would mean: 
create the attribute and accept the argument, but pass the argument to 
`__post_init__` rather than using it to initialize the attribute directly. 
The type passed to `init` would become the type hint for the argument.
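
For example, usage under that proposal might look something like this 
(purely hypothetical; `field(init=int)` does not exist today and only 
illustrates the idea):
```py
from dataclasses import dataclass, field

@dataclass
class Foo:
    # Hypothetical: __init__ would accept an int argument for `x`, but
    # instead of assigning it to the attribute it would be forwarded to
    # __post_init__.
    x: str = field(init=int)

    def __post_init__(self, x):
        self.x = chr(x)

f = Foo(65)
f.x  # "A"
```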