gszadovszky commented on code in PR #1329:
URL: https://github.com/apache/parquet-mr/pull/1329#discussion_r1579800270


##########
pom.xml:
##########
@@ -580,29 +580,11 @@
             </excludeModules>
             <excludes>
               <exclude>${shade.prefix}</exclude>
-              <exclude>org.apache.parquet.hadoop.CodecFactory</exclude> <!-- change field type from Configuration to ParquetConfiguration -->
-              <exclude>org.apache.parquet.hadoop.ParquetReader</exclude> <!-- change field type from Configuration to ParquetConfiguration -->
-              <exclude>org.apache.parquet.thrift.projection.deprecated.PathGlobPattern</exclude>
-              <!-- japicmp is overly aggressive on interface types in signatures, a type was changed to a supertype but this still triggers it -->
-              <exclude>org.apache.parquet.hadoop.ColumnChunkPageWriteStore</exclude>
-              <exclude>org.apache.parquet.hadoop.ParquetRecordWriter</exclude>
-              <!-- likely japicmp bug, triggers on new interface methods after updating to 0.18.1 -->
-              <exclude>org.apache.parquet.conf.PlainParquetConfiguration#getClass(java.lang.String,java.lang.Class,java.lang.Class)</exclude>
-              <exclude>org.apache.parquet.conf.ParquetConfiguration#getClass(java.lang.String,java.lang.Class,java.lang.Class)</exclude>
-              <exclude>org.apache.parquet.hadoop.util.SerializationUtil#readObjectFromConfAsBase64(java.lang.String,org.apache.parquet.conf.ParquetConfiguration)</exclude>
-              <exclude>org.apache.parquet.conf.HadoopParquetConfiguration#getClass(java.lang.String,java.lang.Class,java.lang.Class)</exclude>
-              <exclude>org.apache.parquet.avro.AvroParquetReader#builder(org.apache.parquet.io.InputFile,org.apache.parquet.conf.ParquetConfiguration)</exclude>
-              <exclude>org.apache.parquet.hadoop.thrift.TBaseWriteSupport#setThriftClass(org.apache.parquet.conf.ParquetConfiguration,java.lang.Class)</exclude>
-              <exclude>org.apache.parquet.proto.ProtoParquetReader#builder(org.apache.hadoop.fs.Path,boolean)</exclude>
-              <exclude>org.apache.parquet.proto.ProtoParquetReader#builder(org.apache.parquet.io.InputFile,boolean)</exclude>
-
+              <!-- Due to protected field type change from Configuration to ParquetConfiguration -->
+              <exclude>org.apache.parquet.hadoop.CodecFactory#configuration</exclude>

Review Comment:
   Well, if the field is protected, then technically we introduce a breaking change by altering it. I think the question is how easy it would be to fix in the code. If it requires big changes or a redesign, I am fine with keeping the exclusions; otherwise, let's fix it in the code. We should avoid the practice of adding exclusions instead of trying to keep the code backward compatible.
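To make the compatibility concern concrete: changing the declared type of a `protected` field is a binary-incompatible change, because subclasses compiled against the old field type fail at link time with `NoSuchFieldError`. A common backward-compatible alternative is to keep the existing field and introduce the new abstraction alongside it. The sketch below uses simplified stand-in classes (not the real parquet-mr or Hadoop types) to illustrate the pattern:

```java
// Stand-ins for the real types; names mirror the PR but the bodies are hypothetical.
class Configuration {}                        // plays the role of Hadoop's Configuration
interface ParquetConfiguration {}             // plays the role of the new abstraction
class HadoopParquetConfiguration implements ParquetConfiguration {
    private final Configuration conf;
    HadoopParquetConfiguration(Configuration conf) { this.conf = conf; }
    Configuration getConfiguration() { return conf; }
}

// Breaking variant (what japicmp flags): change the field type in place.
//   class CodecFactory { protected ParquetConfiguration configuration; }
// Subclasses compiled against the old Configuration-typed field would then
// fail at runtime with NoSuchFieldError.

// Backward-compatible variant: keep the old field, add the new one beside it.
class CodecFactory {
    protected final Configuration configuration;   // preserved for binary compatibility
    protected final ParquetConfiguration conf;     // new abstraction, wraps the old field

    CodecFactory(Configuration configuration) {
        this.configuration = configuration;
        this.conf = new HadoopParquetConfiguration(configuration);
    }
}
```

Existing subclasses keep linking against `configuration`, while new code can work through `conf`; the old field can later be deprecated and removed in a major release.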



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

