Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/8499#issuecomment-135665482
I discovered this issue while trying to extend our `spark-avro` library to
publish a single artifact that is compatible with multiple Hadoop versions.
`spark-avro` itself works around these incompatibilities using the same
reflection tricks, but its tests failed because of calls in Spark that did not
go through reflection; see https://github.com/databricks/spark-avro/pull/79 for
additional context and discussion.
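
For anyone unfamiliar with the trick: the classic incompatibility is that
`TaskAttemptContext` is a concrete class in Hadoop 1.x but an interface in
Hadoop 2.x (with the implementation moved to `TaskAttemptContextImpl`), so a
single binary can only support both by constructing it reflectively. A minimal
sketch of that pattern is below; `newTaskAttemptContext` here is just an
illustrative standalone helper, not necessarily the exact code in Spark or
`spark-avro`:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.{TaskAttemptContext, TaskAttemptID}

def newTaskAttemptContext(conf: Configuration, id: TaskAttemptID): TaskAttemptContext = {
  // Prefer the Hadoop 2.x implementation class; fall back to the Hadoop 1.x
  // concrete class if it isn't on the classpath.
  val klass =
    try {
      Class.forName("org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl")
    } catch {
      case _: ClassNotFoundException =>
        Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext")
    }
  // Both versions expose a (Configuration, TaskAttemptID) constructor.
  val ctor = klass.getDeclaredConstructor(classOf[Configuration], classOf[TaskAttemptID])
  ctor.newInstance(conf, id).asInstanceOf[TaskAttemptContext]
}
```

Any direct `new TaskAttemptContextImpl(...)` (or similar non-reflective call)
elsewhere in the build defeats this and pins the artifact to one Hadoop line,
which is what this PR is trying to clean up.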
I grepped to find all of the places that use the non-reflective calls, but I
may have missed a callsite or two. I'm opening this PR now for testing and
initial feedback.