ajantha-bhat commented on a change in pull request #3887:
URL: https://github.com/apache/carbondata/pull/3887#discussion_r477058306



##########
File path: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/adaptive/AdaptiveDeltaFloatingCodec.java
##########
@@ -282,6 +288,12 @@ public void decodeAndFillVector(byte[] pageData, ColumnVectorInfo vectorInfo, Bi
           for (int i = 0; i < size; i += DataTypes.INT.getSizeInBytes()) {
             vector.putFloat(rowId++, (max - ByteUtil.toIntLittleEndian(pageData, i)) / floatFactor);
           }
+        } else if (pageDataType == DataTypes.LONG) {

Review comment:
       I have checked this.
   If there is no complex type (i.e. it is just a primitive type), the same values go to DirectCompress, not adaptive. But for a complex primitive it goes to adaptive because of the code below. And since min/max is stored at double precision, LONG is chosen for this case.
   
   
   `DefaultEncodingFactory#selectCodecByAlgorithmForFloating()`
   
   ```
   } else if (decimalCount < 0 && !isComplexPrimitive) {
         return new DirectCompressCodec(DataTypes.DOUBLE);
       } else {
         return getColumnPageCodec(stats, isComplexPrimitive, columnSpec, srcDataType, maxValue,
             minValue, decimalCount, absMaxValue);
       }
   ```
   
   I don't remember why a complex primitive should not enter direct compress, or why that check was explicitly added.
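   To make the LONG-branch discussion concrete, here is a minimal, self-contained sketch of the adaptive delta floating round trip implied by the diff: values are stored as `max*factor - value*factor` deltas in little-endian form, and decoded as `(max - delta) / factor`, mirroring the `decodeAndFillVector` loop quoted above. The class name, buffer handling, and the INT-width delta are my assumptions for illustration, not CarbonData's actual encoder.

   ```java
   import java.nio.ByteBuffer;
   import java.nio.ByteOrder;

   // Hypothetical round trip illustrating the adaptive delta floating scheme.
   // Not CarbonData code: names and layout here are assumptions for illustration.
   public class AdaptiveDeltaFloatingSketch {
     public static void main(String[] args) {
       float[] values = {1.25f, 3.50f, 2.75f};
       double factor = 100d;             // 10^decimalCount; here 2 decimal places
       double max = 3.50d * factor;      // min/max kept at double precision

       // Encode: delta from max, written as little-endian ints
       // (the INT branch in the diff reads with ByteUtil.toIntLittleEndian)
       ByteBuffer page = ByteBuffer.allocate(values.length * 4)
           .order(ByteOrder.LITTLE_ENDIAN);
       for (float v : values) {
         page.putInt((int) (max - Math.round(v * factor)));
       }

       // Decode: (max - storedDelta) / factor, matching the quoted loop
       page.flip();
       float[] decoded = new float[values.length];
       for (int i = 0; i < values.length; i++) {
         decoded[i] = (float) ((max - page.getInt()) / factor);
       }
       for (int i = 0; i < values.length; i++) {
         if (decoded[i] != values[i]) {
           throw new AssertionError("mismatch at index " + i);
         }
       }
       System.out.println("round trip ok");
     }
   }
   ```

   If a value's delta from max exceeds the range of the chosen integer width, a wider type is needed, which is presumably why LONG is selected when min/max is held at double precision.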
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
