Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/3029#discussion_r244959393
  
    --- Diff: processing/src/main/java/org/apache/carbondata/processing/sort/sortdata/SingleThreadFinalSortFilesMerger.java ---
    @@ -114,6 +113,31 @@ public void startFinalMerge() throws CarbonDataWriterException {
         startSorting(filesToMerge);
       }
     
    +  /**
    +   * Adds the in-memory raw result iterators to the priority queue.
    +   * This is called during compaction, when both sorted and unsorted
    +   * carbon data files are being compacted.
    +   * Each sorted file's RawResultIterator is added to the priority queue
    +   * wrapped in an InMemorySortTempChunkHolder.
    +   *
    +   * @param sortedRawResultMergerList
    +   * @param segmentProperties
    +   * @param noDicAndComplexColumns
    +   * @param measureDataType
    +   * @throws CarbonSortKeyAndGroupByException
    +   */
    +  public void addInMemoryRawResultIterator(List<RawResultIterator> sortedRawResultMergerList,
    +      SegmentProperties segmentProperties, CarbonColumn[] noDicAndComplexColumns,
    +      DataType[] measureDataType)
    +      throws CarbonSortKeyAndGroupByException {
    +    for (RawResultIterator rawResultIterator : sortedRawResultMergerList) {
    +      InMemorySortTempChunkHolder inMemorySortTempChunkHolder =
    +          new InMemorySortTempChunkHolder(rawResultIterator, segmentProperties,
    +              noDicAndComplexColumns, sortParameters, measureDataType);
    +      inMemorySortTempChunkHolder.readRow();
    --- End diff --
    
    Don't we need to check hasNext here before reading the row for the first time?
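    The guard being suggested can be sketched as follows. This is a minimal, self-contained illustration of the hasNext-before-read pattern only; `RowSource` and `readFirstRowOrNull` are hypothetical stand-ins, not CarbonData APIs, and the real fix would wrap the `readRow()` call on `InMemorySortTempChunkHolder` instead.

    ```java
    import java.util.Iterator;
    import java.util.List;

    // Hypothetical stand-in for a raw result iterator over decoded rows.
    class RowSource {
        private final Iterator<int[]> rows;

        RowSource(List<int[]> rows) {
            this.rows = rows.iterator();
        }

        boolean hasNext() {
            return rows.hasNext();
        }

        int[] next() {
            return rows.next();
        }
    }

    class GuardedReadDemo {
        // Reads the first row only when one exists, mirroring the
        // hasNext check proposed before the initial readRow() call.
        static int[] readFirstRowOrNull(RowSource source) {
            return source.hasNext() ? source.next() : null;
        }

        public static void main(String[] args) {
            RowSource empty = new RowSource(List.of());
            RowSource oneRow = new RowSource(List.of(new int[] {1, 2}));

            // An unguarded read on the empty source would throw; the
            // guarded version degrades to null instead.
            System.out.println(readFirstRowOrNull(empty) == null); // true
            System.out.println(readFirstRowOrNull(oneRow)[0]);     // 1
        }
    }
    ```

    The point of the check is that an iterator backed by an empty (or fully drained) segment would otherwise fail on the very first read.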

