[CARBONDATA-2802][BloomDataMap] Remove clearing cache after rebuilding index
datamap

There is no need to clear the cache after rebuilding an index datamap, for
the following reasons:

1. Currently it clears the caches of all index datamaps, not only the one
being rebuilt.
2. The life cycle of the table data and the index datamap data is the same,
so there is no need to clear it. (Once the index datamap is created, or
once the main table is loaded, the datamap's data is generated too
-- in both scenarios, the datamap's data is up to date with the main
table.)
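The first reason above can be illustrated with a toy sketch. This is not
CarbonData's actual API -- `DataMapCacheSketch` and its methods are
hypothetical -- but it models why a global clear (analogous to the removed
`DataMapStoreManager.getInstance().clearDataMaps(tableIdentifier)` call) is
too coarse: it evicts the cached data of every index datamap on the table,
not just the one that was rebuilt.

```scala
import scala.collection.mutable

// Toy cache keyed by datamap name (hypothetical; for illustration only).
object DataMapCacheSketch {
  private val cache = mutable.Map[String, Seq[String]]()

  // Simulate loading a datamap's data into the cache.
  def load(name: String, data: Seq[String]): Unit = cache(name) = data

  // Coarse-grained clear: drops every datamap's cached data,
  // including entries that are still up to date.
  def clearAll(): Unit = cache.clear()

  def size: Int = cache.size
}
```

Rebuilding one datamap and then calling `clearAll()` would also evict the
other, still-valid caches -- which is why dropping the clear entirely is
safe here, given that datamap data stays in sync with the main table.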

This closes #2597


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/26f72d23
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/26f72d23
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/26f72d23

Branch: refs/heads/branch-1.4
Commit: 26f72d23b8b896a6cb5a48bfd353a4e6e63278d5
Parents: 5718382
Author: xuchuanyin <xuchuan...@hust.edu.cn>
Authored: Thu Aug 2 10:45:17 2018 +0800
Committer: ravipesala <ravi.pes...@gmail.com>
Committed: Thu Aug 9 23:43:36 2018 +0530

----------------------------------------------------------------------
 .../org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala      | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/carbondata/blob/26f72d23/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
----------------------------------------------------------------------
diff --git a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
index 2d684bf..f92ed6c 100644
--- a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
+++ b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
@@ -131,7 +131,6 @@ object IndexDataMapRebuildRDD {
     if (failedSegments.nonEmpty) {
       throw new Exception(s"Failed to refresh datamap ${ schema.getDataMapName }")
     }
-    DataMapStoreManager.getInstance().clearDataMaps(tableIdentifier)
 
     val buildDataMapPostExecutionEvent = new BuildDataMapPostExecutionEvent(sparkSession,
       tableIdentifier)
