[CARBONDATA-2802][BloomDataMap] Remove clearing cache after rebuilding index
datamap

There is no need to clear the cache after rebuilding an index datamap, for
the following reasons:

1.Currently it clears the caches for all index datamaps, not only
for the one being rebuilt.
2.The life cycle of table data and index datamap data is the same,
so there is no need to clear it. (Once the index datamap is created, or
once the main table is loaded, the data of the datamap is generated too
-- in both scenarios, the data of the datamap is up to date with the main
table.)

This closes #2597


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/26d9f3d8
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/26d9f3d8
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/26d9f3d8

Branch: refs/heads/external-format
Commit: 26d9f3d8e4cbba1242768eec46e8b119b6678bfe
Parents: 38384cb
Author: xuchuanyin <xuchuan...@hust.edu.cn>
Authored: Thu Aug 2 10:45:17 2018 +0800
Committer: Jacky Li <jacky.li...@qq.com>
Committed: Fri Aug 3 11:55:24 2018 +0800

----------------------------------------------------------------------
 .../org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala      | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/carbondata/blob/26d9f3d8/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
----------------------------------------------------------------------
diff --git 
a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
 
b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
index 2d684bf..f92ed6c 100644
--- 
a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
+++ 
b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
@@ -131,7 +131,6 @@ object IndexDataMapRebuildRDD {
     if (failedSegments.nonEmpty) {
       throw new Exception(s"Failed to refresh datamap ${ schema.getDataMapName 
}")
     }
-    DataMapStoreManager.getInstance().clearDataMaps(tableIdentifier)
 
     val buildDataMapPostExecutionEvent = new 
BuildDataMapPostExecutionEvent(sparkSession,
       tableIdentifier)
