>
> You can imagine all manner of jar-hell created by shading.  For instance:
>
> - library L1 shades library ShadedA-1.0 and ShadedB-1.1.
> - library L2 shades library ShadedA-1.1 and ShadedB-1.0.
> - An app wants to use L1, L2, ShadedA-1.1, and ShadedB-1.1 but it can't no
> matter what classpath ordering it uses.
> - An app wants to use L1, L2, ShadedA-1.0, and ShadedB-1.0 but it can't no
> matter what classpath ordering it uses.
>

Sorry, but I cannot follow the problems you are trying to show.

L1 has its own ShadedA and ShadedB.
L2 has its own ShadedA and ShadedB.
Each carries its own relocated copy, which doesn't clash with and doesn't
know anything about the other one.
The app does not see ShadedA or ShadedB from L1 or L2 (unless it
deliberately reaches into the hidden, relocated package of L1, which would
be stupid).
There are no clashes, and every library uses the version it needs.
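
To make that concrete: with the maven-shade-plugin, each library relocates
its copies under its own package prefix, so the app's direct dependencies
are untouched. A rough sketch only - the coordinates and prefixes
(com.example.*, org.example.l1/l2) are made up for illustration, and the
surrounding plugin wiring is shown in the asm example further down:

  <!-- fragment for L1's maven-shade-plugin <configuration> -->
  <relocations>
    <relocation>
      <pattern>com.example.shadeda</pattern>
      <shadedPattern>org.example.l1.shaded.shadeda</shadedPattern>
    </relocation>
    <relocation>
      <pattern>com.example.shadedb</pattern>
      <shadedPattern>org.example.l1.shaded.shadedb</shadedPattern>
    </relocation>
  </relocations>

  <!-- fragment for L2's maven-shade-plugin <configuration>,
       same idea but under L2's own prefix -->
  <relocations>
    <relocation>
      <pattern>com.example.shadeda</pattern>
      <shadedPattern>org.example.l2.shaded.shadeda</shadedPattern>
    </relocation>
    <relocation>
      <pattern>com.example.shadedb</pattern>
      <shadedPattern>org.example.l2.shaded.shadedb</shadedPattern>
    </relocation>
  </relocations>

The app can then still declare ShadedA-1.1 and ShadedB-1.1 (or the 1.0
versions) directly; those classes keep their original package names and
never collide with the relocated copies hidden inside L1 and L2.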

To be more explicit: I am maintaining org.vafer.jdependency, which uses
org.objectweb.asm.
If you look at the final jar you will find

  org/vafer/jdeb/shaded/objectweb/asm/*.class

The shaded classes are relocated and become part of the package namespace
of the library that is doing the shading.
Shading hell is just not possible as long as there are no classpath
problems in the library itself.
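
For reference, this is roughly how such a relocation is configured with
the maven-shade-plugin. Treat it as a sketch - the actual jdependency
build may be set up differently:

  <!-- goes into the <build><plugins> section of the library's pom.xml -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <relocations>
            <!-- rewrites org.objectweb.asm.* classes, and all bytecode
                 references to them, into the library's own package -->
            <relocation>
              <pattern>org.objectweb.asm</pattern>
              <shadedPattern>org.vafer.jdeb.shaded.objectweb.asm</shadedPattern>
            </relocation>
          </relocations>
        </configuration>
      </execution>
    </executions>
  </plugin>

An application that depends on this jar plus its own org.objectweb.asm
sees two unrelated package trees, so there is nothing to clash.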

It's like copy & pasting the code into your own package.
Bytecode manipulation is really not as bad as you make it out to be.

Just creating uberjars (without relocation) - that's a whole different
story.
That should only ever be done for the final application artifact.
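
A plain uberjar would be a shade execution without any <relocations>,
just merging the dependencies and, for example, setting the Main-Class of
the final application. Again only a sketch, with a made-up main class:

  <!-- inside the application's maven-shade-plugin <configuration> -->
  <configuration>
    <transformers>
      <!-- makes the merged jar runnable; no relocation, so every
           dependency keeps its original package names -->
      <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
        <mainClass>org.example.app.Main</mainClass>
      </transformer>
    </transformers>
  </configuration>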
