[ https://issues.apache.org/jira/browse/PARQUET-2052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17351192#comment-17351192 ]

ASF GitHub Bot commented on PARQUET-2052:
-----------------------------------------

sunchao commented on pull request #910:
URL: https://github.com/apache/parquet-mr/pull/910#issuecomment-848039158


   Thanks @gszadovszky and @shangxinli for taking a look. 
   
   > It is another question if all the encoders used for writing data pages for binary values are prepared for similar situations.
   
   After falling back from dictionary encoding, I think it will fail later [while writing the data page](https://github.com/apache/parquet-mr/blob/apache-parquet-1.12.0/parquet-hadoop/src/main/java/org/apache/parquet/hadoop/ColumnChunkPageWriteStore.java#L163). I think this is still better than generating a corrupted page.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


> Integer overflow when writing huge binary using dictionary encoding
> -------------------------------------------------------------------
>
>                 Key: PARQUET-2052
>                 URL: https://issues.apache.org/jira/browse/PARQUET-2052
>             Project: Parquet
>          Issue Type: Bug
>            Reporter: Chao Sun
>            Assignee: Chao Sun
>            Priority: Major
>
> To check whether it should fall back to plain encoding, 
> {{DictionaryValuesWriter}} currently uses two variables, 
> {{dictionaryByteSize}} and {{maxDictionaryByteSize}}, both of which are 
> ints. This causes an issue when one first writes a relatively small 
> binary within the threshold and then writes a huge string, which causes 
> {{dictionaryByteSize}} to overflow and become negative.
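The overflow described above can be sketched in a few lines. This is not the actual parquet-mr code; the variable names mirror the issue description, and the threshold value is an arbitrary assumption for illustration:

```java
// Sketch of the reported overflow: accumulating byte sizes in an int
// silently wraps past Integer.MAX_VALUE, so the fallback check
// "dictionaryByteSize > maxDictionaryByteSize" never fires.
public class DictOverflowSketch {
    public static void main(String[] args) {
        int maxDictionaryByteSize = 1_048_576;   // hypothetical 1 MiB threshold
        int dictionaryByteSize = 1_000;          // a small binary fits fine

        int hugeBinaryLength = Integer.MAX_VALUE - 100;
        dictionaryByteSize += hugeBinaryLength;  // wraps around to a negative value

        System.out.println(dictionaryByteSize < 0);                      // true
        System.out.println(dictionaryByteSize > maxDictionaryByteSize);  // false: fallback skipped

        // Widening the accumulator to long avoids the wraparound:
        long safeSize = 1_000L + hugeBinaryLength;
        System.out.println(safeSize > maxDictionaryByteSize);            // true: fallback triggers
    }
}
```

Because the comparison sees a negative size, the writer keeps using dictionary encoding long past the threshold, which is how the corrupted page arises.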



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
