Github user gvramana commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1641#discussion_r156276866
  
    --- Diff: integration/spark2/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala ---
    @@ -486,6 +486,21 @@ object CarbonDataRDDFactory {
           // if segment is empty then fail the data load
           if (!carbonLoadModel.getCarbonDataLoadSchema.getCarbonTable.isChildDataMap &&
               !CarbonLoaderUtil.isValidSegment(carbonLoadModel, carbonLoadModel.getSegmentId.toInt)) {
    +
    +        if (overwriteTable && dataFrame.isDefined) {
    +          carbonLoadModel.getLoadMetadataDetails.asScala.foreach {
    +            loadDetails =>
    +              if (loadDetails.getSegmentStatus.equals(SegmentStatus.SUCCESS)) {
    +                loadDetails.setSegmentStatus(SegmentStatus.MARKED_FOR_DELETE)
    +              }
    +          }
    +          val carbonTablePath = CarbonStorePath
    --- End diff --
    
    1) loadTablePreStatusUpdateEvent is not fired.
    2) What about the old dictionary; should it be overwritten as well?
    3) The updatestatus file also needs to be handled accordingly.
    Suggest following the original flow for handling the empty-segment case.
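    For reference, the status-marking step in the diff can be illustrated in isolation. The sketch below uses minimal stand-in types (`SegmentStatus`, `LoadMetadataDetails`) that mimic the names in the diff; they are illustrative assumptions, not the real CarbonData classes:

```scala
// Stand-ins for the CarbonData types referenced in the diff above
// (the real classes live under org.apache.carbondata.core; these
// minimal versions exist only to make the sketch self-contained).
object SegmentStatus extends Enumeration {
  val SUCCESS, MARKED_FOR_DELETE = Value
}

final class LoadMetadataDetails(private var status: SegmentStatus.Value) {
  def getSegmentStatus: SegmentStatus.Value = status
  def setSegmentStatus(s: SegmentStatus.Value): Unit = status = s
}

object OverwriteSketch {
  // For insert-overwrite, every previously successful segment is marked
  // for delete so that only the newly loaded data remains visible.
  def markForOverwrite(details: Seq[LoadMetadataDetails]): Unit =
    details.foreach { d =>
      if (d.getSegmentStatus == SegmentStatus.SUCCESS) {
        d.setSegmentStatus(SegmentStatus.MARKED_FOR_DELETE)
      }
    }
}
```

    As the review points out, mutating the in-memory details is only part of the job: the corresponding table-status file and pre-status-update events still have to be handled by the surrounding load flow.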

