Hi all, I have tried a lot, in vain, so please help...

./run-example org.apache.spark.examples.SparkPi local

*1) SLF4J: Class path contains multiple SLF4J bindings.*
SLF4J: Found binding in
[jar:file:/home/sparkcluster/spark-0.8.1-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.1-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/sparkcluster/spark-0.8.1-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]


*2) log4j:WARN No appenders could be found for logger
(org.apache.spark.util.Utils).*
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
Pi is roughly 3.14068

Error 1) I know that some exclusion needs to be added in *pom.xml*. But
under which dependency, and what should the *group id and artifact id* be?
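For reference, the binding reported is Log4jLoggerFactory, which comes from the slf4j-log4j12 artifact. A sketch of what such an exclusion typically looks like in a pom.xml (the dependency shown here is an assumption; put the exclusion under whichever dependency pulls in the duplicate binding):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.9.3</artifactId>
  <version>0.8.1-incubating</version>
  <exclusions>
    <!-- exclude the log4j binding so only one SLF4J binding stays on the classpath -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Note that in this particular run the duplicate comes from having both the examples assembly jar and the main assembly jar on the classpath, so the warning may persist regardless of the pom until one of the jars is dropped.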

Error 2) I have initialized the log4j root logger to INFO, FILE.
I think Spark is not able to find *log4j.properties*. What should be done?
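Spark looks for log4j.properties on the classpath; the usual fix is to copy conf/log4j.properties.template to conf/log4j.properties in the Spark directory. A minimal console configuration, as a sketch:

```properties
# conf/log4j.properties -- minimal example logging to the console
log4j.rootLogger=INFO, console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

If you want INFO going to a file instead, swap ConsoleAppender for org.apache.log4j.FileAppender and add a log4j.appender.FILE.File property pointing at the log path.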



-- 
*Sai Prasanna. AN*
*II M.Tech (CS), SSSIHL*


*Entire water in the ocean can never sink a ship, Unless it gets inside.
All the pressures of life can never hurt you, Unless you let them in.*
