Has anyone else been testing Parquet with Hadoop 3.4.0? The Hadoop release is out and it all compiled cleanly, but I'm seeing what look like Jackson complaints in TestInputOutput:
17:54:03.642 [Thread-417] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local1837703063_0014
java.lang.Exception: java.lang.RuntimeException: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Java 8 optional type `java.util.Optional<java.lang.Long>` not supported by default: add Module "com.fasterxml.jackson.datatype:jackson-datatype-jdk8" to enable handling (through reference chain: org.apache.parquet.hadoop.metadata.ParquetMetadata["blocks"]->java.util.ArrayList[0]->org.apache.parquet.hadoop.metadata.BlockMetaData["columns"]->java.util.Collections$UnmodifiableRandomAccessList[0]->org.apache.parquet.hadoop.metadata.IntColumnChunkMetaData["sizeStatistics"]->org.apache.parquet.column.statistics.SizeStatistics["unencodedByteArrayDataBytes"])
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:492)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:552)
Caused by: java.lang.RuntimeException: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Java 8 optional type `java.util.Optional<java.lang.Long>` not supported by default: add Module "com.fasterxml.jackson.datatype:jackson-datatype-jdk8" to enable handling (through reference chain: org.apache.parquet.hadoop.metadata.ParquetMetadata["blocks"]->java.util.ArrayList[0]->org.apache.parquet.hadoop.metadata.BlockMetaData["columns"]->java.util.Collections$UnmodifiableRandomAccessList[0]->org.apache.parquet.hadoop.metadata.IntColumnChunkMetaData["sizeStatistics"]->org.apache.parquet.column.statistics.SizeStatistics["unencodedByteArrayDataBytes"])
	at org.apache.parquet.hadoop.metadata.ParquetMetadata.toJSON(ParquetMetadata.java:68)
	at org.apache.parquet.hadoop.metadata.ParquetMetadata.toPrettyJSON(ParquetMetadata.java:48)
	at org.apache.parquet.format.converter.ParquetMetadataConverter.readParquetMetadata(ParquetMetadataConverter.java:1592)
	at org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:622)
	at org.apache.parquet.hadoop.ParquetFileReader.<init>(ParquetFileReader.java:895)
	at org.apache.parquet.hadoop.ParquetFileReader.open(ParquetFileReader.java:703)
	at org.apache.parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:159)
	at org.apache.parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:136)
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Java 8 optional type `java.util.Optional<java.lang.Long>` not supported by default: add Module "com.fasterxml.jackson.datatype:jackson-datatype-jdk8" to enable handling (through reference chain: org.apache.parquet.hadoop.metadata.ParquetMetadata["blocks"]->java.util.ArrayList[0]->org.apache.parquet.hadoop.metadata.BlockMetaData["columns"]->java.util.Collections$UnmodifiableRandomAccessList[0]->org.apache.parquet.hadoop.metadata.IntColumnChunkMetaData["sizeStatistics"]->org.apache.parquet.column.statistics.SizeStatistics["unencodedByteArrayDataBytes"])
	at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77)
	at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1330)
	at com.fasterxml.jackson.databind.ser.impl.UnsupportedTypeSerializer.serialize(UnsupportedTypeSerializer.java:35)
	at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:732)
	at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:770)
	at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:183)
	at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:732)
	at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:770)

I'll stick up a WIP PR just to see what happens on the Jenkins builds, and I'm trying it out on Java 17 to see what difference that makes.
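FWIW, the Jackson side of this reproduces outside Parquet entirely: serializing any bean with an Optional-valued getter fails the same way unless the jackson-datatype-jdk8 module is registered on the mapper. A minimal sketch below; FakeStats is a hypothetical stand-in for SizeStatistics, not the actual Parquet class, and I'm only guessing that registering the module on the mapper in ParquetMetadata.toJSON would be the fix:

```java
// Minimal reproduction of the Jackson failure from the trace, plus the
// presumed fix. Requires jackson-databind and jackson-datatype-jdk8 on
// the classpath. FakeStats is a hypothetical stand-in for Parquet's
// SizeStatistics bean with its Optional<Long> getter.
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.InvalidDefinitionException;
import com.fasterxml.jackson.datatype.jdk8.Jdk8Module;
import java.util.Optional;

public class Jdk8ModuleDemo {
    // Stand-in for a metadata bean exposing an Optional-valued property.
    public static class FakeStats {
        public Optional<Long> getUnencodedByteArrayDataBytes() {
            return Optional.of(42L);
        }
    }

    public static void main(String[] args) throws Exception {
        // A plain mapper refuses Optional: UnsupportedTypeSerializer throws
        // InvalidDefinitionException, matching the trace above.
        ObjectMapper plain = new ObjectMapper();
        try {
            plain.writeValueAsString(new FakeStats());
            System.out.println("plain mapper: serialized (unexpected)");
        } catch (InvalidDefinitionException e) {
            System.out.println("plain mapper: InvalidDefinitionException, as in the trace");
        }

        // Registering Jdk8Module teaches the mapper to unwrap Optional.
        ObjectMapper fixed = new ObjectMapper().registerModule(new Jdk8Module());
        System.out.println("with Jdk8Module: " + fixed.writeValueAsString(new FakeStats()));
    }
}
```

If that's the actual cause, the interesting question is why it only surfaces against Hadoop 3.4.0; maybe a Jackson version bump on the classpath, since older jackson-databind silently serialized Optional as a POJO instead of rejecting it.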