Hi, I'm trying to use a custom log4j appender configured in my log4j.properties. It was working perfectly under Spark 1.3.1, but it is now broken under Spark 1.4.1.
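For reference, the relevant part of my log4j.properties looks roughly like this (reconstructed from what the configurator prints below; the redis appender's own options are omitted here as placeholders):

log4j.rootLogger=WARN, console, redis

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# the custom appender that fails to load (connection options left out)
log4j.appender.redis=com.ryantenney.log4j.FailoverRedisAppender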
The appender is shaded/bundled in my fat jar. Note: I've seen that Spark 1.3.1 was using a different class loader; see my SO post: http://stackoverflow.com/questions/31856532/spark-unable-to-load-custom-log4j-properties-from-fat-jar-resources

Here is the log4j debug output and the resulting stack trace:

log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@4e25154f.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@4e25154f class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@4e25154f.
log4j: Using URL [file:/C:/Apps/Spark/spark-1.4.1-bin-hadoop2.6/conf/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/C:/Apps/Spark/spark-1.4.1-bin-hadoop2.6/conf/log4j.properties
log4j: Parsing for [root] with value=[WARN, console, redis].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "console".
log4j: Parsing layout options for "console".
log4j: Setting property [conversionPattern] to [%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n].
log4j: End of parsing for "console".
log4j: Setting property [target] to [System.err].
log4j: Parsed "console" options.
log4j: Parsing appender named "redis".
log4j:ERROR Could not instantiate class [com.ryantenney.log4j.FailoverRedisAppender].
java.lang.ClassNotFoundException: com.ryantenney.log4j.FailoverRedisAppender
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
    at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
    at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:275)
    at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$18.apply(Utils.scala:2262)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$18.apply(Utils.scala:2262)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.util.SparkShutdownHookManager.install(Utils.scala:2262)
    at org.apache.spark.util.Utils$.<init>(Utils.scala:88)
    at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
    at org.apache.spark.deploy.SparkSubmitArguments.handleUnknown(SparkSubmitArguments.scala:432)
    at org.apache.spark.launcher.SparkSubmitOptionParser.parse(SparkSubmitOptionParser.java:174)
    at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:91)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:107)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:ERROR Could not instantiate appender named "redis".
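From the stack trace it looks like log4j is being configured from LogManager.<clinit> while SparkSubmit is still parsing its arguments, i.e. before my fat jar is on the application class loader, which would explain why sun.misc.Launcher$AppClassLoader cannot see the shaded appender class. For completeness, this is roughly how I launch the job (master, main class and jar names below are placeholders, not my real values):

rem current invocation: the appender class only exists inside the fat jar
bin\spark-submit --master local[*] --class com.example.MyApp target\my-app-assembly-1.0.jar

rem variant I could try: also put the (unshaded) appender jar on the driver class path
bin\spark-submit --driver-class-path lib\log4j-redis-appender.jar --master local[*] --class com.example.MyApp target\my-app-assembly-1.0.jar

If there is a recommended way to make the appender class visible to that early class loader (e.g. --driver-class-path / spark.driver.extraClassPath instead of relying on the fat jar), I'd be glad to hear it.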