Thanks!

I found that the issue was our explicit dependency on hadoop-client. After 
dropping it in favor of the one provided transitively by spark-core, we no 
longer run into the Jackson classpath problem.
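
For anyone hitting the same thing, the fix amounts to roughly the following POM
change (a minimal sketch assuming a Maven build like ours; the version numbers
and Scala suffix are illustrative, not copied from the PR):

    <!-- Before: an explicit hadoop-client pin that dragged in its own Jackson -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>3.2.0</version>
    </dependency>

    <!-- After: drop the pin and rely on the hadoop-client that spark-core
         pulls in transitively, so Jackson versions stay aligned with Spark -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.2.0</version>
      <scope>provided</scope>
    </dependency>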


> On Aug 22, 2021, at 1:29 PM, Sean Owen <sro...@gmail.com> wrote:
> 
> Jackson was bumped from 2.10.x to 2.12.x, which could well explain it if 
> you're exposed to the Spark classpath and have your own, different Jackson dependency.
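> 
> A quick way to check which Jackson versions your build actually resolves (a
> diagnostic sketch, not something from the PR):
> 
>   $ mvn dependency:tree -Dincludes=com.fasterxml.jackson.core
> 
> If jackson-annotations resolves to anything older than 2.12, the JsonKey
> annotation (new in 2.12) won't be on the classpath, which would match the
> NoClassDefFoundError below.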
> 
> On Sun, Aug 22, 2021 at 1:21 PM Michael Heuer <heue...@gmail.com> wrote:
> We're seeing runtime classpath issues with Avro 1.10.2, Parquet 1.12.0, and 
> Spark 3.2.0 RC1.
> 
> Our dependency tree is deep though, and will require further investigation.
> 
> https://github.com/bigdatagenomics/adam/pull/2289
> 
> $ mvn test
> ...
> *** RUN ABORTED ***
>   java.lang.NoClassDefFoundError: com/fasterxml/jackson/annotation/JsonKey
>   at com.fasterxml.jackson.databind.introspect.JacksonAnnotationIntrospector.hasAsKey(JacksonAnnotationIntrospector.java:1080)
>   at com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.hasAsKey(AnnotationIntrospectorPair.java:611)
>   at com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.hasAsKey(AnnotationIntrospectorPair.java:611)
>   at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector._addFields(POJOPropertiesCollector.java:495)
>   at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector.collectAll(POJOPropertiesCollector.java:421)
>   at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector.getJsonValueAccessor(POJOPropertiesCollector.java:270)
>   at com.fasterxml.jackson.databind.introspect.BasicBeanDescription.findJsonValueAccessor(BasicBeanDescription.java:258)
>   at com.fasterxml.jackson.databind.ser.BasicSerializerFactory.findSerializerByAnnotations(BasicSerializerFactory.java:391)
>   at com.fasterxml.jackson.databind.ser.BeanSerializerFactory._createSerializer2(BeanSerializerFactory.java:220)
>   at com.fasterxml.jackson.databind.ser.BeanSerializerFactory.createSerializer(BeanSerializerFactory.java:169)
>   at com.fasterxml.jackson.databind.SerializerProvider._createUntypedSerializer(SerializerProvider.java:1473)
>   at com.fasterxml.jackson.databind.SerializerProvider._createAndCacheUntypedSerializer(SerializerProvider.java:1421)
>   at com.fasterxml.jackson.databind.SerializerProvider.findValueSerializer(SerializerProvider.java:520)
>   at com.fasterxml.jackson.databind.SerializerProvider.findTypedValueSerializer(SerializerProvider.java:798)
>   at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:308)
>   at com.fasterxml.jackson.databind.ObjectMapper._writeValueAndClose(ObjectMapper.java:4487)
>   at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:3742)
>   at org.apache.spark.rdd.RDDOperationScope.toJson(RDDOperationScope.scala:52)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:145)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.SparkContext.withScope(SparkContext.scala:789)
>   at org.apache.spark.SparkContext.newAPIHadoopFile(SparkContext.scala:1239)
>   at org.bdgenomics.adam.ds.ADAMContext.readVcfRecords(ADAMContext.scala:2668)
>   at org.bdgenomics.adam.ds.ADAMContext.loadVcf(ADAMContext.scala:2686)
>   at org.bdgenomics.adam.ds.ADAMContext.loadVariants(ADAMContext.scala:3608)
>   at org.bdgenomics.adam.ds.variant.VariantDatasetSuite.$anonfun$new$1(VariantDatasetSuite.scala:128)
>   at org.bdgenomics.utils.misc.SparkFunSuite.$anonfun$sparkTest$1(SparkFunSuite.scala:111)
> 
> 
> 
>> On Aug 22, 2021, at 10:58 AM, Sean Owen <sro...@gmail.com> wrote:
>> 
>> So far, I've tested Java 8 with Scala 2.12 and Scala 2.13, and the results 
>> look good as usual.
>> Good to see Scala 2.13 artifacts! Unless I've forgotten something, we're OK 
>> for Scala 2.13 now, and for Java 11 (and, IIRC, Java 14 works fine minus some 
>> very minor corners of the project's deps).
>> 
>> I think we're going to have to have this fix, which just missed the 3.2 RC:
>> https://github.com/apache/spark/commit/c441c7e365cdbed4bae55e9bfdf94fa4a118fb21
>> I think that means we shouldn't release this RC, but, of course, let's test.
>> 
>> 
>> 
>> On Fri, Aug 20, 2021 at 12:05 PM Gengliang Wang <ltn...@gmail.com> wrote:
>> Please vote on releasing the following candidate as Apache Spark version 
>> 3.2.0.
>> 
>> The vote is open until 11:59pm Pacific time Aug 25 and passes if a majority 
>> of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> 
>> [ ] +1 Release this package as Apache Spark 3.2.0
>> [ ] -1 Do not release this package because ...
>> 
>> To learn more about Apache Spark, please see http://spark.apache.org/
>> 
>> The tag to be voted on is v3.2.0-rc1 (commit 
>> 6bb3523d8e838bd2082fb90d7f3741339245c044):
>> https://github.com/apache/spark/tree/v3.2.0-rc1
>> 
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/
>> 
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
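>> 
>> For example, to check a downloaded artifact against those keys (a sketch; 
>> the exact file names under the -bin/ directory above may differ):
>> 
>>   $ wget https://dist.apache.org/repos/dist/dev/spark/KEYS
>>   $ gpg --import KEYS
>>   $ gpg --verify spark-3.2.0-bin-hadoop3.2.tgz.asc spark-3.2.0-bin-hadoop3.2.tgz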
>> 
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1388
>> 
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-docs/
>> 
>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>> 
>> This release uses the release script from the tag v3.2.0-rc1.
>> 
>> 
>> FAQ
>> 
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>> 
>> If you're working in PySpark, you can set up a virtual env, install the
>> current RC, and see if anything important breaks. In Java/Scala, you can
>> add the staging repository to your project's resolvers and test with the
>> RC (make sure to clean up the artifact cache before/after so you don't
>> end up building with an out-of-date RC going forward).
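>> 
>> For example (sketches only; the exact PySpark tarball name under the -bin/
>> directory may differ):
>> 
>>   $ python -m venv spark-rc-test && source spark-rc-test/bin/activate
>>   $ pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/pyspark-3.2.0.tar.gz
>> 
>> and for a Maven build, point a resolver at the staging repository listed above:
>> 
>>   <repositories>
>>     <repository>
>>       <id>spark-3.2.0-rc1-staging</id>
>>       <url>https://repository.apache.org/content/repositories/orgapachespark-1388</url>
>>     </repository>
>>   </repositories>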
>> 
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.0?
>> ===========================================
>> The current list of open tickets targeted at 3.2.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> Version/s" = 3.2.0
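>> 
>> For example, a JQL query along these lines should list them (a sketch; 
>> "Target Version/s" is the custom field name in the Spark JIRA):
>> 
>>   project = SPARK AND "Target Version/s" = "3.2.0" AND status in (Open, Reopened, "In Progress")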
>> 
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Please retarget everything else to an
>> appropriate release.
>> 
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is a regression that has not been
>> correctly targeted, please ping me or a committer to help target the issue.
>> 
> 
