You have a Jackson version conflict somewhere; it may come from other
libraries you include in your application.
I am not sure Spark 2.3 works with Hadoop 3.1, so that may also be the
issue. Make sure your Hadoop version matches the one your Spark build
expects, and/or use the latest versions of both.
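
You can see which library drags in the older jackson-databind with
`mvn dependency:tree -Dincludes=com.fasterxml.jackson.core`. As a sketch
(assuming a Maven build, and assuming a transitive dependency such as
hadoop-client is the source of the old copy), you can pin the Jackson
version that jackson-module-scala 2.9.6 accepts in `<dependencyManagement>`:

```xml
<!-- Sketch: force a jackson-databind version in the >= 2.9.0, < 2.10.0
     range that the error message demands, overriding older transitive
     copies. The 2.9.6 value is taken from the exception text; adjust it
     to whatever your Spark distribution actually bundles. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.9.6</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.9.6</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

If you cannot control the cluster's classpath, relocating Jackson with the
Maven Shade plugin is another way to avoid the clash.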

On Thu, Jul 9, 2020 at 8:23 AM Julian Jiang <juli...@synnex.com> wrote:
>
> When I run it in my IDE it works well, but when I submit it to the
> cluster, this problem appears. Thanks for helping me.
>
>   My versions are as follows:
>
>        <scala.version>2.11.8</scala.version>
>        <hadoop.version>3.1.1</hadoop.version>
>        <spark.version>2.3.2</spark.version>
>        <clickhouse-jdbc.version>0.2.4</clickhouse-jdbc.version>
>
> My code is as follows:
>
> val spark: SparkSession = SparkSession
>   .builder()
>   .appName("CkConnect")
>   .master("local[2]")
>   .getOrCreate()
> val properties = new Properties()
> // set the username and password
> properties.setProperty("user", "*")
> properties.setProperty("password", "*")
> val dataFrame: DataFrame = spark.read.jdbc("jdbc:clickhouse://*", "stu", properties)
>
> When reading via JDBC, it doesn't work.
>
>
>
> Exception in thread "main" java.lang.ExceptionInInitializerError
>        at org.apache.spark.scheduler.EventLoggingListener$.initEventLog(EventLoggingListener.scala:303)
>        at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:128)
>        at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
>        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
>        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)
>        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
>        at scala.Option.getOrElse(Option.scala:121)
>        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
>        at org.synnex.WordCount.main(WordCount.java:25)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:498)
>        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
>        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
>        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
>        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
>        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.9.6 requires Jackson Databind version >= 2.9.0 and < 2.10.0
>        at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:61)
>        at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:18)
>        at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:722)
>        at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:59)
>        at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
>        ... 19 more
