Hi Anubhav,

This is happening because you're using the configuration generated
for CDH with upstream Spark. The CDH configuration adds extra jars
that we don't include in our build of Spark; the upstream assembly
already bundles them, so you end up with duplicate classes on the
classpath (here, two SLF4J bindings).

You can either use a separate Spark configuration directory for your
upstream version, or use the "hadoop provided" distribution of Apache
Spark; note that the latter doesn't include certain features, such as
Hive support.
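
For example (untested sketch; the paths are placeholders, but
SPARK_CONF_DIR and SPARK_DIST_CLASSPATH are the standard knobs for
this):

  # Point upstream Spark at its own conf dir instead of CDH's:
  export SPARK_CONF_DIR=/opt/spark/conf
  /opt/spark/bin/spark-submit ...

  # Or, with the "hadoop provided" build, add the cluster's Hadoop
  # jars in conf/spark-env.sh instead of bundling them:
  export SPARK_DIST_CLASSPATH=$(hadoop classpath)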


On Wed, May 18, 2016 at 10:59 AM, Anubhav Agarwal <anubha...@gmail.com> wrote:
> Hi,
> I am having log4j trouble while running Spark using YARN as cluster manager
> in CDH 5.3.3.
> I get the following error:-
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/data/12/yarn/nm/filecache/34/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/hadoop/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>
>
> I know this is not a world-ending situation, but I still need to remove one
> of the StaticLoggerBinder bindings from the classpath, as neither MDC nor
> Marker works.
>
> I need custom fields in my log.
>
>
> Has anybody had any success with this? Any workarounds or suggestions
> are welcome.
>
>
> Thank You,
>
> Anu
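
Re: MDC: once only one binding is left on the classpath, MDC should
work through slf4j-log4j12. A minimal sketch; the "requestId" key and
the pattern below are just examples, not something Spark sets up for
you:

  import org.slf4j.{LoggerFactory, MDC}

  object MdcExample {
    private val log = LoggerFactory.getLogger(getClass)

    def main(args: Array[String]): Unit = {
      MDC.put("requestId", "req-42")  // hypothetical custom field
      try {
        log.info("processing")        // logged with the field attached
      } finally {
        MDC.remove("requestId")
      }
    }
  }

and in log4j.properties, %X{requestId} pulls the MDC value into each
log line:

  log4j.appender.console.layout.ConversionPattern=%d %p %X{requestId} %m%n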



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
