Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/15538
  
    If `ParquetFileFormat` had a static init block somehow, we'd be done, right? 
The logging config in that static initializer would have to execute, once, 
during classloading, and therefore before any use in a constructor or during 
serialization. But per above, you can't write such a block in Scala. So, make a 
dummy `ParquetLogHelper.java` class, and give `ParquetFileFormat` a dummy 
field of its type or something. `ParquetLogHelper` can have a static init block, 
and must load when `ParquetFileFormat` loads. I may be missing a reason this 
doesn't work, but it seems like the sort of thing to rule out before settling 
on a more complicated solution. It would also be a little simpler than the 
current code.
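    A minimal sketch of that idea, for illustration only (class and field names 
other than `ParquetLogHelper` are made up here, and the static block just 
records that it ran rather than actually reconfiguring Parquet's logging):

```java
// ParquetLogHelper.java -- hypothetical sketch of the proposed workaround.
// A static initializer runs exactly once, when the JVM first loads and
// initializes the class, so it is guaranteed to execute before any
// constructor call or deserialization that touches the class.
public class ParquetLogHelper {
    static boolean initialized = false;

    static {
        // The real fix would redirect Parquet's java.util.logging output
        // here; this sketch only records that initialization happened.
        initialized = true;
    }

    public static final ParquetLogHelper INSTANCE = new ParquetLogHelper();
}

// Stand-in for ParquetFileFormat (name is illustrative). Assigning the
// helper instance to a static field forces ParquetLogHelper to be loaded
// and initialized whenever this class is initialized.
class FileFormatStandIn {
    static final ParquetLogHelper HELPER = ParquetLogHelper.INSTANCE;
}
```

    The key property is that the JVM performs class initialization eagerly and 
exactly once, so there is no window in which a `FileFormatStandIn` instance 
exists but the helper's static block has not run.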
    
    I don't think it's so wrong to add the logic directly to Logging.scala as a 
fairly temporary workaround. It wouldn't entail an actual compile-time or 
runtime dependency on Parquet in core. At least to me, that seems simpler than 
the heroic attempt to intercept serialization and constructor paths. I'd hope 
it's sufficient to add it to Logging, which has to init very early, everywhere, 
and only once.

