[ https://issues.apache.org/jira/browse/SPARK-6305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15289508#comment-15289508 ]

Charles Allen commented on SPARK-6305:
--------------------------------------

For what it's worth, I went through a similar exercise for druid.io recently. 
Here's my resulting list of "Hadoop go away" exclusions: 
https://github.com/druid-io/druid/blob/druid-0.9.0/extensions-core/hdfs-storage/pom.xml#L49

Getting the <scope>provided</scope>, the exclusions, and <optional>true</optional> 
sorted out for each dependency is not trivial. And one of the frustrating things 
about Spark is its "screw it, make it a shaded assembly" approach to 
dependencies (anyone know how to get the new s3a stuff from the hadoop storage 
extension to work?). I'm not sure whether there is an overall epic for "handle 
jar dependencies better", but I think this ask fits better under that umbrella 
than under a simple blanket update of which slf4j impl Spark wants to use.
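For illustration, the kind of per-dependency cleanup being described looks roughly like this in a pom.xml (a sketch, not the actual Druid exclusion list; hadoop-client and slf4j-log4j12 are real coordinates, the version property is a placeholder):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>${hadoop.version}</version>
  <!-- provided: the cluster supplies Hadoop at runtime, so keep it out of the assembly -->
  <scope>provided</scope>
  <exclusions>
    <!-- keep Hadoop's log4j 1.x binding and logger from leaking onto the classpath -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

The tedious part is that every transitive dependency pulling in a conflicting logging binding needs its own exclusion, which is why the linked Druid pom's list is so long.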

> Add support for log4j 2.x to Spark
> ----------------------------------
>
>                 Key: SPARK-6305
>                 URL: https://issues.apache.org/jira/browse/SPARK-6305
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>            Reporter: Tal Sliwowicz
>            Priority: Minor
>
> log4j 2 requires replacing the slf4j binding and adding the log4j jars in the 
> classpath. Since there are shaded jars, it must be done during the build.
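In Maven terms, the binding replacement the description refers to typically looks like this (a sketch using the standard log4j 2 coordinates; the version property is a placeholder):

```xml
<!-- Drop the slf4j-log4j12 (log4j 1.x) binding, then route slf4j through log4j 2 -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>${log4j2.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>${log4j2.version}</version>
</dependency>
```

Because Spark ships shaded assembly jars, swapping these at deploy time is not enough; the substitution has to happen when the assembly is built.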



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
