Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/10205#issuecomment-197429014
> What's the difference between OptionalConfigEntry and something that has
default value null?
`null` is not a valid default value for lots of different types (e.g. `int`
or `long` or even `boolean`). Also, there's a semantic difference between a
config value that is not set and a config value that has a default value, and
this captures it. It's basically the difference between calling
`SparkConf.get(foo, default)` and `SparkConf.getOption(foo)`.
It was the intent from the beginning to have that distinction, to avoid the
currently common case of the default values being hardcoded in different places
where the configs are used.
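To make that concrete, here's what the two call styles look like with the
existing `SparkConf` API (the key is made up for illustration):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(loadDefaults = false)

// With a default, the caller can't tell an unset key from one that was
// explicitly set to the default value.
conf.getInt("spark.example.retries", 3)    // 3 -- defaulted or set? can't tell
// With getOption, "not set" stays observable as a distinct state, which a
// null default can't express for primitives like Int.
conf.getOption("spark.example.retries")    // None

conf.set("spark.example.retries", "5")
conf.getInt("spark.example.retries", 3)    // 5
conf.getOption("spark.example.retries")    // Some("5")
```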
> Why is the builder pattern better than just a ctor with default argument
values?
Because with default arguments, you have to copy & paste the argument
list for every type-specific builder method (look at the current SQLConf).
Also, you can't overload methods that use default arguments, should you
ever want to.
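For illustration, this is the shape of the duplication (hypothetical
signatures, but in the spirit of what SQLConf does today):

```scala
trait ConfigEntry[T]

// Every type-specific factory repeats the same trailing parameters, and
// adding a new one (say, a deprecation message) means touching them all.
def intConf(key: String, defaultValue: Option[Int] = None,
    doc: String = "", isPublic: Boolean = true): ConfigEntry[Int] = ???
def longConf(key: String, defaultValue: Option[Long] = None,
    doc: String = "", isPublic: Boolean = true): ConfigEntry[Long] = ???
def booleanConf(key: String, defaultValue: Option[Boolean] = None,
    doc: String = "", isPublic: Boolean = true): ConfigEntry[Boolean] = ???
```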
> Function naming is inconsistent. E.g. "withDefault" and "doc" vs "withDoc"
That's intentional, although maybe the naming could be a little better:
`doc` just modifies the internal state of the builder, while `withDefault`
actually builds a config entry.
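Roughly this (a simplified sketch, not the exact classes in the PR):

```scala
case class ConfigEntry[T](key: String, default: T, doc: String)

class ConfigBuilder[T](key: String) {
  private var _doc: String = ""

  // Only tweaks builder state and returns the builder, so calls chain.
  def doc(s: String): ConfigBuilder[T] = { _doc = s; this }

  // Terminal: consumes the accumulated state and builds the entry.
  def withDefault(default: T): ConfigEntry[T] = ConfigEntry(key, default, _doc)
}

// Everything before withDefault configures; withDefault constructs.
val retries = new ConfigBuilder[Int]("spark.example.retries")
  .doc("How many times to retry.")
  .withDefault(3)
```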
> Do we really need all these classes?
I'm open to suggestions, but I couldn't find a clean way to need fewer
classes, given how the types are propagated and the goal of easily reusing
parsing functions to generate the optional configs.
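To sketch the constraint (illustrative names, not the PR's exact classes):
the value type has to flow from the parser into the entry, and the optional
variant needs to reuse the same parser while shifting the entry's type to
`Option[T]`:

```scala
abstract class ConfigEntry[T](val key: String, val valueParser: String => T)

class ConfigEntryWithDefault[T](key: String, parser: String => T, val default: T)
  extends ConfigEntry[T](key, parser)

// An optional entry for T is really an entry for Option[T]; it wraps the
// same String => T parser instead of redefining the parsing logic.
class OptionalConfigEntry[T](key: String, parser: String => T)
  extends ConfigEntry[Option[T]](key, s => Some(parser(s)))

class TypedBuilder[T](key: String, parser: String => T) {
  def withDefault(d: T): ConfigEntry[T] =
    new ConfigEntryWithDefault(key, parser, d)
  def optional: OptionalConfigEntry[T] =
    new OptionalConfigEntry(key, parser)
}
```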