GitHub user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/3841#issuecomment-68378792
I'm a bit confused about this change, since it seems like changing the code
to read that value from system properties instead of SparkConf breaks our
ability to configure it via SparkConf.
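To illustrate what I mean (`spark.foo.bufferSize` is a made-up key here, standing in for whatever this patch actually touches):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(loadDefaults = false)
  .set("spark.foo.bufferSize", "64")

// Reading through SparkConf sees the user's setting:
conf.getInt("spark.foo.bufferSize", 32)   // => 64

// Reading the raw system property bypasses SparkConf, so the
// setting above is silently ignored:
sys.props.get("spark.foo.bufferSize")
  .map(_.toInt).getOrElse(32)             // => 32
```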
Can you add a failing unit test which demonstrates the problem / bug that
this patch addresses?
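Something of this shape would be great (sketch only; the suite name and key are placeholders, and the assertion should go through whatever component this patch changes rather than SparkConf itself):

```scala
import org.scalatest.FunSuite
import org.apache.spark.SparkConf

class ConfigOrderingSuite extends FunSuite {
  test("value set via SparkConf takes effect") {
    val conf = new SparkConf(loadDefaults = false)
      .set("spark.foo.bufferSize", "64")
    // Replace this read with a call into the code this patch touches;
    // it should observe 64, not the system-property default.
    assert(conf.getInt("spark.foo.bufferSize", 32) === 64)
  }
}
```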
If this issue has to do with initialization ordering, I'd like to see if we
can come up with a cleaner approach which doesn't involve things like
unexplained `lazy` keywords (since I'm concerned that such approaches will
inevitably break when the code is modified).
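Here's a contrived sketch of the kind of fragility I'm worried about:

```scala
object Settings {
  // `lazy` defers the read to first access, so this happens to pick up
  // a property set later during startup, but nothing enforces that
  // ordering.
  lazy val bufferSize: Int =
    sys.props.get("spark.foo.bufferSize").map(_.toInt).getOrElse(32)
}

// Works today only by accident of initialization order:
sys.props("spark.foo.bufferSize") = "64"
println(Settings.bufferSize)  // 64

// If a later refactor forces Settings.bufferSize before the property is
// set (say, from another object's initializer), it silently freezes at
// 32 and nothing flags the regression.
```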