Github user mridulm commented on the issue:
https://github.com/apache/spark/pull/16098
That looks like a bug in ParquetRecordWriter (the contract of close() is
unambiguous) ... but then, I guess there is no point in fighting buggy
code: we have to integrate with a lot of Closeables.
If this is a common enough bug, my suggestion is a bad idea - thanks for
verifying.
Btw, if you are depending on writer being null to prevent a 'double close',
copy it to a local variable, null the field, and close the local variable. This
prevents an exception thrown by close() itself from masking the initial
exception.
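
A minimal sketch of that pattern in Scala (the field and class names here are
hypothetical, not taken from the PR):

```scala
import java.io.Closeable

class RecordWriterWrapper(private var writer: Closeable) extends Closeable {

  override def close(): Unit = {
    if (writer != null) {
      // Copy to a local and clear the field *before* closing; even if
      // local.close() throws, the field is already null, so a later
      // close() call is a no-op and cannot mask the original exception.
      val local = writer
      writer = null
      local.close()
    }
  }
}
```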