Github user akashrn5 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2533#discussion_r204282454
  
    --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonRelation.scala ---
    @@ -173,15 +175,38 @@ case class CarbonRelation(
                 .getValidAndInvalidSegments.getValidSegments.asScala
               var size = 0L
               // for each segment calculate the size
    -          segments.foreach {validSeg =>
    -            // for older store
    -            if (null != validSeg.getLoadMetadataDetails.getDataSize &&
    -                null != validSeg.getLoadMetadataDetails.getIndexSize) {
    -              size = size + validSeg.getLoadMetadataDetails.getDataSize.toLong +
    -                     validSeg.getLoadMetadataDetails.getIndexSize.toLong
    -            } else {
    -              size = size + FileFactory.getDirectorySize(
    -                CarbonTablePath.getSegmentPath(tablePath, validSeg.getSegmentNo))
    +          if (carbonTable.getTableInfo.getFactTable.getTableProperties.asScala
    +                .get(CarbonCommonConstants.FLAT_FOLDER).isDefined &&
    +              carbonTable.getTableInfo.getFactTable.getTableProperties.asScala
    --- End diff --
    
    If `validSeg.getLoadMetadataDetails.getDataSize` or
    `validSeg.getLoadMetadataDetails.getIndexSize` is null, the code falls back to
    computing the size from the segment path. That path does not exist in the flat
    folder case, so a "segment does not exist" exception is thrown (I hit this
    exception myself), which is why we need to handle it like this.
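
The fallback logic being discussed can be sketched roughly as below. This is a simplified, hypothetical illustration, not the actual CarbonData code: `LoadDetails`, `segmentSize`, and the `flatFolder` flag are stand-ins for `LoadMetadataDetails`, the size loop in `CarbonRelation`, and the `FLAT_FOLDER` table property. The point is that the directory-size fallback must be skipped when the table uses flat folders, because the per-segment path does not exist there.

```scala
// Older stores may record null data/index sizes, forcing a fallback.
case class LoadDetails(dataSize: String, indexSize: String)

// directorySize is by-name so the (possibly failing) directory scan
// is only evaluated when it is actually needed.
def segmentSize(details: LoadDetails,
                flatFolder: Boolean,
                directorySize: => Long): Long = {
  if (details.dataSize != null && details.indexSize != null) {
    // Preferred path: use the sizes recorded in the load metadata.
    details.dataSize.toLong + details.indexSize.toLong
  } else if (!flatFolder) {
    // Fallback is safe only when a per-segment directory exists.
    directorySize
  } else {
    // Flat folder: no segment path to scan, so avoid the
    // "segment does not exist" exception entirely.
    0L
  }
}
```

With recorded sizes the directory scan is never touched; without them it runs only for non-flat-folder tables.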

