Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22887
OK, that makes sense, in that I understand what you're saying, but I'm not sure it's what you actually want. Why shouldn't "set spark.hadoop.*" override spark-defaults.conf?
But, in any case, it seems like the patch for SPARK-26060 (#23031) should take care of this (by raising an error).
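For context, a minimal sketch of the interaction being discussed. The key name (fs.defaultFS), the values, and the exact exception type are illustrative assumptions on my part; the precise behavior depends on what #23031 ends up doing:

```scala
import org.apache.spark.sql.SparkSession

object HadoopConfPrecedence {
  def main(args: Array[String]): Unit = {
    // A spark.hadoop.* entry set at launch time (standing in here for a
    // spark-defaults.conf entry) gets copied into the session's Hadoop
    // Configuration.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("hadoop-conf-precedence")
      .config("spark.hadoop.fs.defaultFS", "file:///")
      .getOrCreate()

    // Trying to change the same entry at runtime with the SQL SET command.
    // With the SPARK-26060 patch applied, the expectation is that this
    // raises an error rather than silently diverging from the Hadoop
    // Configuration already built from spark-defaults.conf.
    try {
      spark.sql("SET spark.hadoop.fs.defaultFS=hdfs://example:8020")
    } catch {
      case e: org.apache.spark.sql.AnalysisException =>
        println(s"SET rejected: ${e.getMessage}")
    }

    spark.stop()
  }
}
```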