Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/90#issuecomment-36827214
  
    Hey @sryza, I have a question here. If we shade like this for the `package` 
target of core, then as far as I understand it, this will embed a shaded copy 
of `asm` inside our packaged Spark jars, similar to what Parquet does with 
Thrift [1], and it will rename all references to asm inside Spark to use that 
new name.
    
    However, in Spark's case we bumped asm to 4.0 because a Spark dependency, 
Kryo, depends on asm 4.0. So if someone is, say, building an application 
against Spark that they run locally, they will still pull in the Kryo 
dependency and, transitively, asm 4.0, and they will still have the problem. 
Right?
    
    I think what we probably want here is to do the shading inside the 
assembly `pom.xml`, so that at least assembly builds of Spark done with Maven 
will work correctly.
    
    Unfortunately, I think the only way to make this work correctly in the 
transitive sense would be to build our own version of Kryo that depends on a 
shaded asm and then depend on that, a la what we did with akka/protobuf.
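    
    For illustration, downstream builds would then pull in something like the 
following instead of stock Kryo (these coordinates are made up, nothing like 
this is published):
    
    ```xml
    <!-- hypothetical artifact: Kryo rebuilt against a shaded asm and
         republished under our own groupId, like the akka/protobuf builds -->
    <dependency>
      <groupId>org.spark-project</groupId>
      <artifactId>kryo-shaded</artifactId>
      <version>2.21</version>
    </dependency>
    ```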
    
    Is there a better way?
    
    [1] see contents: 
http://mvnrepository.com/artifact/com.twitter/parquet-format/2.0.0

