GitHub user wgtmac commented on a diff in the pull request:

    https://github.com/apache/orc/pull/245#discussion_r181950612
  
    --- Diff: site/_docs/encodings.md ---
    @@ -109,10 +109,20 @@ DIRECT_V2     | PRESENT         | Yes      | Boolean RLE
     Decimal was introduced in Hive 0.11 with infinite precision (the total
     number of digits). In Hive 0.13, the definition was changed to limit
     the precision to a maximum of 38 digits, which conveniently uses 127
    -bits plus a sign bit. The current encoding of decimal columns stores
    -the integer representation of the value as an unbounded length zigzag
    -encoded base 128 varint. The scale is stored in the SECONDARY stream
    -as a signed integer.
    +bits plus a sign bit.
    +
    +DIRECT and DIRECT_V2 encodings of decimal columns store the integer
    +representation of the value as an unbounded length zigzag encoded base
    +128 varint. The scale is stored in the SECONDARY stream as a signed
    +integer.
    +
    +In ORC 2.0, DECIMAL_V1 and DECIMAL_V2 encodings are introduced and
    --- End diff ---
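
    For readers skimming the thread, here is a minimal Java sketch of the
    zigzag base-128 varint encoding the diff describes. It is illustrative
    only: it uses a long, whereas ORC's decimal values are unbounded-length
    integers, and the class and method names here are made up, not ORC's
    actual implementation.

        import java.io.ByteArrayOutputStream;

        public class ZigZagVarintSketch {
            // Zigzag-encode a signed long so values of small magnitude map
            // to small unsigned values: 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, ...
            static long zigZag(long n) {
                return (n << 1) ^ (n >> 63);
            }

            // Write a value as a base-128 varint: seven data bits per byte,
            // least-significant group first, high bit set on every byte
            // except the last.
            static void writeVarint(ByteArrayOutputStream out, long u) {
                while ((u & ~0x7fL) != 0) {
                    out.write((int) ((u & 0x7f) | 0x80));
                    u >>>= 7;
                }
                out.write((int) u);
            }

            public static void main(String[] args) {
                // The unscaled value of decimal 12.34 (scale 2) is 1234;
                // the scale itself would be stored separately in the
                // SECONDARY stream.
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                writeVarint(out, zigZag(1234L));
                for (byte b : out.toByteArray()) {
                    System.out.printf("%02x ", b);  // prints "a4 13"
                }
                System.out.println();
            }
        }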
    
    As I have said, I don't see any obvious benefit to using RLEv2 over
    RLEv1 in our experiments. So I don't think it is a good idea to drop
    the option to choose an RLE version.
