Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19760
That would be possible for configs declared in core, because there we can
force SparkConf to initialize those classes (e.g., referencing any config
constant causes the `o.a.s.internal.config` package object to be loaded and all
config constants to be initialized).
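A minimal sketch (not Spark's actual code) of why that works: every `val` in a Scala package object lives in one singleton, so touching any single constant initializes all of them. The `registered` set below is a hypothetical stand-in for the set of configs SparkConf would know about.

```scala
package object config {
  // Hypothetical registry standing in for SparkConf's known-config set.
  val registered = scala.collection.mutable.Set.empty[String]

  private def register(key: String): String = {
    registered += key
    key
  }

  // Declared after `registered`, so initialization order is safe.
  val EXECUTOR_MEMORY: String = register("spark.executor.memory")
  val DRIVER_CORES: String    = register("spark.driver.cores")
}

object Demo extends App {
  // Referencing one constant loads the whole package object,
  // so every key gets registered as a side effect.
  println(config.EXECUTOR_MEMORY)   // spark.executor.memory
  println(config.registered.size)   // 2
}
```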
The only way to do that for other modules would be to move their configs to
core, so that SparkConf can initialize them the same way. (Well, you could do
it with reflection, I guess, but ugh; a sketch of that follows.)
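For completeness, a hedged sketch of the reflection alternative: core would force-initialize another module's config holder by class name, avoiding a compile-time dependency. A Scala package object compiles to a `...package$` class, so initializing that class runs all the config `val`s; the class name in the usage comment is illustrative, not an actual Spark class.

```scala
object ConfigInit {
  def forceInit(className: String): Unit = {
    // initialize = true runs the static initializer, which for a Scala
    // object constructs the singleton and therefore all of its config vals.
    Class.forName(className, true, Thread.currentThread().getContextClassLoader)
  }
}

// Usage (hypothetical class name):
// ConfigInit.forceInit("org.apache.spark.sql.internal.config.package$")
```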