Ted-Jiang commented on code in PR #2111:
URL: https://github.com/apache/arrow-rs/pull/2111#discussion_r925727366


##########
parquet/src/column/reader/decoder.rs:
##########
@@ -307,10 +307,28 @@ impl ColumnLevelDecoder for ColumnLevelDecoderImpl {
 impl DefinitionLevelDecoder for ColumnLevelDecoderImpl {
     fn skip_def_levels(
         &mut self,
-        _num_levels: usize,
-        _max_def_level: i16,
+        num_levels: usize,
+        max_def_level: i16,
     ) -> Result<(usize, usize)> {
-        Err(nyi_err!("https://github.com/apache/arrow-rs/issues/1792"))
+        // For now only support max_def_level == 1
+        if max_def_level == 1 {

Review Comment:
   @tustvold Thanks for your kind explanation!
   > Unfortunately max_def_level = 1 is an insufficient condition for using a PackedDecoder, as the nulls may not exist at the leaf level.
   
   You are right, I found this when adding the integration tests 😂.
   
   > The way this currently works is the ArrayReader implementation will pass either a packed or full DefinitionLevelBuffer to RecordReader, which will in turn pass this to DefinitionLevelBufferDecoder. This is a bit of a hack but avoided leaking arrow-specific details into the GenericRecordReader.
    
   Got it! I think after [7a2fa01](https://github.com/apache/arrow-rs/pull/2111/commits/7a2fa01e3f048447e61740aa02a12cd768ae11b6) adds skip_def in `ColumnLevelDecoderImpl`, it should be enough to decode all column types with `R:0 D:1`? If I missed something, please tell me 😭
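   For `R:0 D:1`, the bookkeeping I have in mind is roughly the sketch below. It is a minimal, hypothetical standalone illustration, not the actual `ColumnLevelDecoderImpl` code; the function name is made up, and the `(values_skipped, levels_skipped)` return shape just mirrors the signature in the diff above:
   
   ```rust
   // Hypothetical sketch of skipping definition levels for a leaf column with
   // max_def_level == 1 (R:0 D:1). Not the parquet crate implementation; it only
   // illustrates the counting: every level equal to max_def_level marks a
   // non-null leaf value, so that many values must also be skipped downstream.
   fn skip_def_levels_sketch(
       levels: &[i16],
       num_levels: usize,
       max_def_level: i16,
   ) -> (usize, usize) {
       let levels_skipped = num_levels.min(levels.len());
       let values_skipped = levels[..levels_skipped]
           .iter()
           .filter(|&&level| level == max_def_level)
           .count();
       (values_skipped, levels_skipped)
   }
   
   fn main() {
       // Six levels for an optional leaf column: 1 = value present, 0 = null.
       let levels = [1, 0, 1, 1, 0, 1];
       // Skipping 4 levels skips 3 actual values.
       assert_eq!(skip_def_levels_sketch(&levels, 4, 1), (3, 4));
   }
   ```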
   


