[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14603751#comment-14603751 ]

Jonathan Kelly commented on SPARK-8332:
---------------------------------------

I was running into the same issue, so I made sure to get rid of all Jackson 2.2 
jars from my Hadoop classpath, but now I've run into a different issue:

{code}
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
 at [Source: {"id":"0","name":"parallelize"}; line: 1, column: 1]
        at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
        at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
        at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:409)
        at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:358)
        at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:265)
        at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:245)
        at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
        at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
        at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3558)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2578)
        at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:82)
        at org.apache.spark.rdd.RDD$$anonfun$34.apply(RDD.scala:1486)
        at org.apache.spark.rdd.RDD$$anonfun$34.apply(RDD.scala:1486)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.rdd.RDD.<init>(RDD.scala:1486)
        at org.apache.spark.rdd.ParallelCollectionRDD.<init>(ParallelCollectionRDD.scala:85)
        at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:697)
        at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:695)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:681)
        at org.apache.spark.SparkContext.parallelize(SparkContext.scala:695)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}

I'm just running: MASTER=yarn-client /usr/lib/spark/bin/run-example SparkPi 10
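
In case it helps narrow this down, here is a small standalone sketch (my own code, not Spark's; "Scope" and "FromJsonRepro" are made-up names) of roughly what RDDOperationScope.fromJson boils down to. With a consistent jackson-databind / jackson-module-scala pair it round-trips the same JSON shown in the error message; with mismatched versions on the classpath it can reproduce the "Could not find creator property" failure outside of Spark:

{code}
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical stand-in for org.apache.spark.rdd.RDDOperationScope.
case class Scope(id: String, name: String)

object FromJsonRepro {
  def main(args: Array[String]): Unit = {
    // Same mapper setup Spark appears to use: a plain ObjectMapper plus the Scala module.
    val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
    // With matching Jackson versions this prints Scope(0,parallelize); with a
    // databind / module-scala mismatch it can throw the JsonMappingException above.
    val scope = mapper.readValue("""{"id":"0","name":"parallelize"}""", classOf[Scope])
    println(scope)
  }
}
{code}

Running that with the exact classpath spark-submit assembles should show whether the mismatch comes from the Spark assembly or from the Hadoop jars.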

> NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8332
>                 URL: https://issues.apache.org/jira/browse/SPARK-8332
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>         Environment: spark 1.4 & hadoop 2.3.0-cdh5.0.0
>            Reporter: Tao Li
>            Priority: Critical
>              Labels: 1.4.0, NoSuchMethodError, com.fasterxml.jackson
>
> I compiled the new Spark 1.4.0 version.
> But when I run a simple WordCount demo, it throws a NoSuchMethodError:
> {code}
> java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
> {code}
> I found out that the default "fasterxml.jackson.version" is 2.4.4.
> Is there something wrong with, or a conflict in, the Jackson version?
> Or is there possibly some project Maven dependency pulling in the wrong
> version of Jackson?
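
On the Maven-dependency question quoted above: one quick runtime check (again only a sketch of mine, not anything in Spark) is to ask the JVM where it actually loaded each Jackson artifact from, which makes a stray 2.2.x jar pulled in transitively or from the Hadoop classpath easy to spot:

{code}
import com.fasterxml.jackson.core.JsonFactory
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object WhichJackson {
  def main(args: Array[String]): Unit = {
    // Print the jar location each Jackson class was loaded from.
    Seq(
      "jackson-core"         -> classOf[JsonFactory],
      "jackson-databind"     -> classOf[ObjectMapper],
      "jackson-module-scala" -> DefaultScalaModule.getClass
    ).foreach { case (label, cls) =>
      val location = Option(cls.getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString)
        .getOrElse("(unknown code source)")
      println(s"$label -> $location")
    }
  }
}
{code}

If the three locations don't all point at the same Jackson version, that mismatch is the first thing I'd fix.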


