[ https://issues.apache.org/jira/browse/SPARK-14919?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15259740#comment-15259740 ]
Sean Owen commented on SPARK-14919:
-----------------------------------
I dunno, shading is a not-uncommon thing to do, and it's easy in Maven. I'm not
sure it's a Jackson problem: if 2.6 adds a feature not present in 2.5 and you
use it, that's all fine, except that it fails if you encounter 2.5 at runtime.
Shading Jackson in Spark would probably help, if not fully resolve it, since I
suspect we'd find that something somewhere in the assembly's classpath still
has unshaded Jackson. Still, see https://issues.apache.org/jira/browse/SPARK-13022
for an (abandoned) effort to do this. I think it'd be fine to pick that up
again.
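
For the record, relocating Jackson with the maven-shade-plugin would look
roughly like the sketch below. The relocated package name is only illustrative,
not necessarily what SPARK-13022 proposed.

{code:xml}
<!-- Sketch only: relocate com.fasterxml.jackson classes into a private
     package so they cannot clash with a newer Jackson on the app classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <!-- hypothetical relocated package name -->
            <shadedPattern>org.spark_project.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}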
> Spark Cannot be used with software that requires jackson-databind 2.6+:
> RDDOperationScope
> -----------------------------------------------------------------------------------------
>
> Key: SPARK-14919
> URL: https://issues.apache.org/jira/browse/SPARK-14919
> Project: Spark
> Issue Type: Bug
> Components: Input/Output
> Affects Versions: 1.6.1
> Environment: Linux, OSX
> Reporter: John Ferguson
>
> When using Spark 1.4.x or Spark 1.6.1 in an application whose front end
> requires jackson-databind 2.6+, we see the following exceptions:
> Subset of stack trace:
> ==================
> com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
> at [Source: {"id":"0","name":"textFile"}; line: 1, column: 1]
> at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
> at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
> at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
> at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
> at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
> at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:405)
> at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:354)
> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:262)
> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:242)
> at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
> at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
> at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3664)
> at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3556)
> at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2576)
> at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:85)
> at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
> at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
> at scala.Option.map(Option.scala:145)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:136)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
> at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
> at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1011)
> at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
> at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
> at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
> at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
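
A common application-side workaround, separate from shading Spark itself, is to
force one consistent set of Jackson artifacts, since the error above is the
usual symptom of jackson-databind and jackson-module-scala getting out of sync.
A sketch only, with illustrative version numbers and a Scala suffix you would
match to your own build:

{code:xml}
<!-- Illustrative only: pin jackson-databind, jackson-core, and
     jackson-module-scala to one version so they cannot drift apart.
     2.6.5 and the _2.10 suffix are examples, not a recommendation. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.10</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}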