GitHub user gkossakowski commented on the pull request:
https://github.com/apache/spark/pull/1929#issuecomment-53536315
> @gkossakowski, thanks for the detailed reply. From my point of view, what
> we want when new JARs are added is for earlier JARs to take precedence. This is
> what makes the most sense. If you already instantiated an object from the
> previous version of the class and stored it in a variable, it's not possible
> for it to suddenly change class. So instead the effect should be the same as
> tacking on another JAR at the end of your classpath -- only classes that are
> not found in earlier JARs come from there. Would these semantics be possible to
> implement for 2.11?
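For illustration, here is a minimal sketch of how these semantics map onto standard JVM class loading (not code from this PR; the class name `ReplClassLoaderChain` and its methods are hypothetical): each added JAR becomes a child `URLClassLoader`, and parent-first delegation guarantees that earlier JARs win.

```scala
import java.net.{URL, URLClassLoader}

// Hypothetical sketch: each addJar wraps the current loader in a new
// child URLClassLoader. Parent-first delegation means a class is served
// by the earliest loader (i.e. earliest JAR) that defines it, so a newly
// added JAR can only contribute classes no earlier JAR already provides.
class ReplClassLoaderChain(initial: ClassLoader) {
  private var current: ClassLoader = initial

  def addJar(jar: URL): Unit =
    current = new URLClassLoader(Array(jar), current)

  def loadClass(name: String): Class[_] =
    current.loadClass(name)
}
```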
We agree on the semantics. I called changing an existing class "shadowing",
but we mean the same thing: changes to existing classes should not be allowed.
Adding JARs to the classpath means just adding new classes that were not
previously available. For that we need merging of packages, as I explained
earlier. It's possible to implement this kind of API for 2.11, but it doesn't
exist yet.
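As a rough illustration of what "merging of packages" means here (a sketch under assumptions, not the actual compiler API): the set of names visible in a package becomes the union across classpath entries, with earlier entries winning on any clash.

```scala
// Hypothetical sketch: merge the members of one package across classpath
// entries. Map's ++ keeps the right-hand value on a key clash, so members
// from earlier JARs take precedence over newly added ones.
def mergePackageMembers(
    earlier: Map[String, Class[_]],
    added: Map[String, Class[_]]): Map[String, Class[_]] =
  added ++ earlier
```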
I hope we can figure out how to merge your changes and work on the API on
the compiler side. The current approach of going deep into the internals of
`Global`, as seen in this PR, is fine as short-term experimentation so you can
quickly deliver a fix to your users. The long-term solution would be migrating
most of Spark's code that talks to compiler internals into the Scala code base.