Jenkins build is still unstable: carbondata-master-spark-2.2 #836

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build became unstable: carbondata-master-spark-2.2 » Apache CarbonData :: Spark Common Test #836

2018-08-02 Thread Apache Jenkins Server
See 




Build failed in Jenkins: carbondata-master-spark-2.1 #2757

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 23376, done.
remote: Compressing objects: 0% .. 50% (3865/7729) [remaining fetch progress output truncated]

carbondata git commit: [CARBONDATA-2802][BloomDataMap] Remove clearing cache after rebuilding index datamap

2018-08-02 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/master 38384cb9f -> 26d9f3d8e


[CARBONDATA-2802][BloomDataMap] Remove clearing cache after rebuilding index datamap

There is no need to clear the cache after rebuilding an index datamap, for the
following reasons:

1. Currently it clears the caches of all index datamaps, not only the one
being rebuilt.
2. The life cycle of the table data and the index datamap data is the same,
so there is no need to clear it. (Once the index datamap is created, or once
the main table is loaded, the data of the datamap is generated as well -- in
both scenarios the data of the datamap is up to date with the main table.)

This closes #2597


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/26d9f3d8
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/26d9f3d8
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/26d9f3d8

Branch: refs/heads/master
Commit: 26d9f3d8e4cbba1242768eec46e8b119b6678bfe
Parents: 38384cb
Author: xuchuanyin 
Authored: Thu Aug 2 10:45:17 2018 +0800
Committer: Jacky Li 
Committed: Fri Aug 3 11:55:24 2018 +0800

--
 .../org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala  | 1 -
 1 file changed, 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/26d9f3d8/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
--
diff --git 
a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
 
b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
index 2d684bf..f92ed6c 100644
--- 
a/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
+++ 
b/integration/spark2/src/main/scala/org/apache/carbondata/datamap/IndexDataMapRebuildRDD.scala
@@ -131,7 +131,6 @@ object IndexDataMapRebuildRDD {
     if (failedSegments.nonEmpty) {
       throw new Exception(s"Failed to refresh datamap ${ schema.getDataMapName }")
     }
-    DataMapStoreManager.getInstance().clearDataMaps(tableIdentifier)
 
     val buildDataMapPostExecutionEvent = new BuildDataMapPostExecutionEvent(sparkSession,
       tableIdentifier)



Jenkins build is back to normal : carbondata-master-spark-2.1 #2756

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : carbondata-master-spark-2.1 #2754

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : carbondata-master-spark-2.2 #833

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is unstable: carbondata-master-spark-2.1 #2748

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #2748

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Processing #2748

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : carbondata-master-spark-2.1 #2750

2018-08-02 Thread Apache Jenkins Server
See 




Build failed in Jenkins: carbondata-master-spark-2.1 #2755

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 23364, done.
remote: Compressing objects: 0% .. 50% (3860/7719) [remaining fetch progress output truncated]

carbondata git commit: [CARBONDATA-2813] Fixed code to get data size from LoadDetails if size is written there

2018-08-02 Thread manishgupta88
Repository: carbondata
Updated Branches:
  refs/heads/master f2e898ac5 -> 38384cb9f


[CARBONDATA-2813] Fixed code to get data size from LoadDetails if size is 
written there

Problem:
In 1.3.x, when index files are merged to form a mergeindex file, a mapping of
which index files were merged into which mergeindex file is kept in the
segments file. In 1.4.x, both the index and merge index files are scanned to
calculate the size of segments for major compaction. Because the index files
had already been deleted in the 1.3.x store, 1.4.x was throwing an "Unable to
get file status" exception.

Solution:
Try to get the size of the segments from LoadMetadataDetails. If it is not
present, read the size from the index files.
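
A minimal sketch of that fallback, using a hypothetical helper with simplified
parameters (the actual change lives in CarbonDataMergerUtil, shown in the diff
below):

    import org.apache.commons.lang.StringUtils;

    final class SegmentSizeFallback {
      // dataSizeFromLoadDetails is the size recorded in LoadMetadataDetails;
      // it may be null or empty for segments written by older (1.3.x) stores.
      static long resolveSegmentSize(String dataSizeFromLoadDetails, long sizeFromIndexFiles) {
        if (!StringUtils.isEmpty(dataSizeFromLoadDetails)) {
          // size already recorded at load time, no need to touch the (merge)index files
          return Long.parseLong(dataSizeFromLoadDetails);
        }
        // fall back to the size computed by scanning the index files
        return sizeFromIndexFiles;
      }
    }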

This closes #2600


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/38384cb9
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/38384cb9
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/38384cb9

Branch: refs/heads/master
Commit: 38384cb9f309cc7eb83e61e85c48dd8583921004
Parents: f2e898a
Author: kunal642 
Authored: Thu Aug 2 11:44:20 2018 +0530
Committer: manishgupta88 
Committed: Thu Aug 2 18:14:56 2018 +0530

--
 .../processing/merger/CarbonDataMergerUtil.java | 12 ++--
 1 file changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/38384cb9/processing/src/main/java/org/apache/carbondata/processing/merger/CarbonDataMergerUtil.java
--
diff --git 
a/processing/src/main/java/org/apache/carbondata/processing/merger/CarbonDataMergerUtil.java
 
b/processing/src/main/java/org/apache/carbondata/processing/merger/CarbonDataMergerUtil.java
index 1162fc2..e3da86d 100644
--- 
a/processing/src/main/java/org/apache/carbondata/processing/merger/CarbonDataMergerUtil.java
+++ 
b/processing/src/main/java/org/apache/carbondata/processing/merger/CarbonDataMergerUtil.java
@@ -49,6 +49,8 @@ import org.apache.carbondata.core.writer.CarbonDeleteDeltaWriterImpl;
 import org.apache.carbondata.processing.loading.model.CarbonLoadModel;
 import org.apache.carbondata.processing.util.CarbonLoaderUtil;
 
+import org.apache.commons.lang.StringUtils;
+
 /**
  * utility class for load merging.
  */
@@ -649,8 +651,14 @@ public final class CarbonDataMergerUtil {
       // variable to store one  segment size across partition.
       long sizeOfOneSegmentAcrossPartition;
       if (segment.getSegmentFile() != null) {
-        sizeOfOneSegmentAcrossPartition = CarbonUtil.getSizeOfSegment(
-            carbonTable.getTablePath(), new Segment(segId, segment.getSegmentFile()));
+        // If LoadMetaDataDetail already has data size no need to calculate the data size from
+        // index files. If not there then read the index file and calculate size.
+        if (!StringUtils.isEmpty(segment.getDataSize())) {
+          sizeOfOneSegmentAcrossPartition = Long.parseLong(segment.getDataSize());
+        } else {
+          sizeOfOneSegmentAcrossPartition = CarbonUtil.getSizeOfSegment(carbonTable.getTablePath(),
+              new Segment(segId, segment.getSegmentFile()));
+        }
       } else {
         sizeOfOneSegmentAcrossPartition = getSizeOfSegment(carbonTable.getTablePath(), segId);
       }



Build failed in Jenkins: carbondata-master-spark-2.1 #2753

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H27 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 23249, done.
remote: Compressing objects: 0% .. 50% (3847/7693) [remaining fetch progress output truncated]

carbondata git commit: [CARBONDATA-2812] Implement freeMemory for complex pages

2018-08-02 Thread kunalkapoor
Repository: carbondata
Updated Branches:
  refs/heads/master a2928e314 -> f2e898ac5


[CARBONDATA-2812] Implement freeMemory for complex pages

Problem:
The memory used by the ColumnPageWrapper (for complex data types) is not
cleared, so loading and querying require more memory than necessary.

Solution:
Clear the used memory in the freeMemory method.

This closes #2599


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/f2e898ac
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/f2e898ac
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/f2e898ac

Branch: refs/heads/master
Commit: f2e898ac585458b6c99e08c8fac0e47bec93fee0
Parents: a2928e3
Author: dhatchayani 
Authored: Thu Aug 2 08:30:32 2018 +0530
Committer: kunal642 
Committed: Thu Aug 2 17:49:26 2018 +0530

--
 .../core/datastore/chunk/store/ColumnPageWrapper.java   | 5 -
 1 file changed, 4 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/f2e898ac/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/ColumnPageWrapper.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/ColumnPageWrapper.java
 
b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/ColumnPageWrapper.java
index 180b3a2..a5d5917 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/ColumnPageWrapper.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/ColumnPageWrapper.java
@@ -163,7 +163,10 @@ public class ColumnPageWrapper implements DimensionColumnPage {
 
   @Override
   public void freeMemory() {
-
+    if (null != columnPage) {
+      columnPage.freeMemory();
+      columnPage = null;
+    }
   }
 
   public boolean isAdaptiveComplexPrimitive() {



Build failed in Jenkins: carbondata-master-spark-2.2 #832

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22543, done.
remote: Compressing objects: 0% .. 50% (3734/7468) [remaining fetch progress output truncated]

carbondata git commit: [Documentation] [Unsafe Configuration] Added carbon.unsafe.driver.working.memory.in.mb parameter to differentiate between driver and executor unsafe memory

2018-08-02 Thread ravipesala
Repository: carbondata
Updated Branches:
  refs/heads/master 7e93d7b87 -> a2928e314


[Documentation] [Unsafe Configuration] Added 
carbon.unsafe.driver.working.memory.in.mb parameter to differentiate between 
driver and executor unsafe memory

Added carbon.unsafe.driver.working.memory.in.mb parameter to differentiate 
between driver and executor unsafe memory

Usually, in production scenarios, driver memory is smaller than executor
memory. Unsafe memory is now used for caching the block/blocklet dataMap in
the driver, but the unsafe memory configured for the executors was also being
used for the driver, which is not a good idea. Therefore the driver and
executor unsafe memory need to be configured separately.
Spark follows the same approach: it provides separate parameters for
configuring driver and executor memory overhead to control unsafe memory
usage, spark.yarn.driver.memoryOverhead and spark.yarn.executor.memoryOverhead.
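
As a hedged illustration (not part of this patch), the two properties could be
set through the existing CarbonProperties API, giving the driver a smaller
unsafe working memory than the executors; the values below are made up:

    import org.apache.carbondata.core.util.CarbonProperties;

    public final class UnsafeMemoryConfigExample {
      public static void main(String[] args) {
        CarbonProperties props = CarbonProperties.getInstance();
        // executor-side unsafe working memory (existing property)
        props.addProperty("carbon.unsafe.working.memory.in.mb", "2048");
        // driver-side unsafe working memory (the new property introduced here)
        props.addProperty("carbon.unsafe.driver.working.memory.in.mb", "512");
      }
    }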

This closes #2595


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/a2928e31
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/a2928e31
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/a2928e31

Branch: refs/heads/master
Commit: a2928e314a4c45dd35923d7d29b75508e401dd3f
Parents: 7e93d7b
Author: manishgupta88 
Authored: Wed Aug 1 19:38:30 2018 +0530
Committer: ravipesala 
Committed: Thu Aug 2 17:07:16 2018 +0530

--
 .../core/constants/CarbonCommonConstants.java   |  4 
 .../core/memory/UnsafeMemoryManager.java| 25 
 docs/configuration-parameters.md|  2 ++
 3 files changed, 27 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/a2928e31/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
index 6d7215e..e480007 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
@@ -1276,6 +1276,10 @@ public final class CarbonCommonConstants {
   @CarbonProperty
   public static final String UNSAFE_WORKING_MEMORY_IN_MB = "carbon.unsafe.working.memory.in.mb";
   public static final String UNSAFE_WORKING_MEMORY_IN_MB_DEFAULT = "512";
+
+  @CarbonProperty
+  public static final String UNSAFE_DRIVER_WORKING_MEMORY_IN_MB =
+      "carbon.unsafe.driver.working.memory.in.mb";
   /**
    * Sorts the data in batches and writes the batch data to store with index file.
    */

http://git-wip-us.apache.org/repos/asf/carbondata/blob/a2928e31/core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java 
b/core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java
index 2115f82..9133f0f 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java
@@ -41,11 +41,28 @@ public class UnsafeMemoryManager {
       CarbonCommonConstants.ENABLE_OFFHEAP_SORT_DEFAULT));
   private static Map> taskIdToMemoryBlockMap;
   static {
-    long size;
+    long size = 0L;
     try {
-      size = Long.parseLong(CarbonProperties.getInstance()
-          .getProperty(CarbonCommonConstants.UNSAFE_WORKING_MEMORY_IN_MB,
-              CarbonCommonConstants.UNSAFE_WORKING_MEMORY_IN_MB_DEFAULT));
+      // check if driver unsafe memory is configured and JVM process is in driver. In that case
+      // initialize unsafe memory configured for driver
+      boolean isDriver = Boolean.parseBoolean(CarbonProperties.getInstance()
+          .getProperty(CarbonCommonConstants.IS_DRIVER_INSTANCE, "false"));
+      boolean initializedWithUnsafeDriverMemory = false;
+      if (isDriver) {
+        String driverUnsafeMemorySize = CarbonProperties.getInstance()
+            .getProperty(CarbonCommonConstants.UNSAFE_DRIVER_WORKING_MEMORY_IN_MB);
+        if (null != driverUnsafeMemorySize) {
+          size = Long.parseLong(CarbonProperties.getInstance()
+              .getProperty(CarbonCommonConstants.UNSAFE_DRIVER_WORKING_MEMORY_IN_MB,
+                  CarbonCommonConstants.UNSAFE_WORKING_MEMORY_IN_MB_DEFAULT));
+          initializedWithUnsafeDriverMemory = true;
+        }
+      }
+      if (!initializedWithUnsafeDriverMemory) {
+        size = 

Build failed in Jenkins: carbondata-master-spark-2.2 #831

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22524, done.
remote: Compressing objects: 0% .. 50% (3728/7456) [remaining fetch progress output truncated]

carbondata git commit: [CARBONDATA-2803] Fix wrong data size calculation, refactor for better readability, and handle local dictionary for older tables

2018-08-02 Thread ravipesala
Repository: carbondata
Updated Branches:
  refs/heads/master bd6abbbff -> 7e93d7b87


[CARBONDATA-2803] Fix wrong data size calculation, refactor for better
readability, and handle local dictionary for older tables

Changes in this PR:
1. The data size was calculated wrongly: the index map contains duplicate
paths because it stores all blocklets, so remove the duplicates and keep
unique block paths for a correct data size calculation (a small sketch of the
idea follows below).
2. Refactored code in CarbonTableInputFormat for better readability.
3. If the table properties contain the local dictionary enable property as
null, it is an old table, so put the value false in the properties map.
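
A small, self-contained sketch of the dedup idea in point 1, using a
LinkedHashSet to keep each block path once while preserving insertion order
(the names here are illustrative, not the actual CarbonData types):

    import java.util.ArrayList;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    final class BlockPathDedup {
      // blockletPaths may contain the same block file path once per blocklet;
      // collapse them to unique block paths before summing sizes.
      static List<String> uniqueBlockPaths(List<String> blockletPaths) {
        Set<String> unique = new LinkedHashSet<>(blockletPaths);
        return new ArrayList<>(unique);
      }
    }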

This closes #2583


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/7e93d7b8
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/7e93d7b8
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/7e93d7b8

Branch: refs/heads/master
Commit: 7e93d7b8707c36bf3f8d1f153b67a8cb997fa0f4
Parents: bd6abbb
Author: akashrn5 
Authored: Mon Jul 30 19:41:34 2018 +0530
Committer: ravipesala 
Committed: Thu Aug 2 17:00:05 2018 +0530

--
 .../indexstore/blockletindex/BlockDataMap.java  |  2 +-
 .../core/metadata/SegmentFileStore.java | 23 ++--
 .../core/metadata/schema/table/CarbonTable.java | 10 ++--
 .../hadoop/api/CarbonTableInputFormat.java  | 61 
 .../FlatFolderTableLoadingTestCase.scala| 31 ++
 5 files changed, 81 insertions(+), 46 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/7e93d7b8/core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockDataMap.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockDataMap.java
 
b/core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockDataMap.java
index f4bb58e..0875e75 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockDataMap.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockDataMap.java
@@ -588,7 +588,7 @@ public class BlockDataMap extends CoarseGrainDataMap
 
   private boolean useMinMaxForExecutorPruning(FilterResolverIntf filterResolverIntf) {
     boolean useMinMaxForPruning = false;
-    if (this instanceof BlockletDataMap) {
+    if (!isLegacyStore && this instanceof BlockletDataMap) {
       useMinMaxForPruning = BlockletDataMapUtil
           .useMinMaxForBlockletPruning(filterResolverIntf, getMinMaxCacheColumns());
     }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/7e93d7b8/core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java 
b/core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java
index 111e444..1acf0ea 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java
@@ -16,22 +16,9 @@
  */
 package org.apache.carbondata.core.metadata;
 
-import java.io.BufferedReader;
-import java.io.BufferedWriter;
-import java.io.DataInputStream;
-import java.io.DataOutputStream;
-import java.io.File;
-import java.io.IOException;
-import java.io.InputStreamReader;
-import java.io.OutputStreamWriter;
-import java.io.Serializable;
+import java.io.*;
 import java.nio.charset.Charset;
-import java.util.ArrayList;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
+import java.util.*;
 
 import org.apache.carbondata.common.logging.LogService;
 import org.apache.carbondata.common.logging.LogServiceFactory;
@@ -511,11 +498,13 @@ public class SegmentFileStore {
     for (Map.Entry entry : carbonIndexMap.entrySet()) {
       List indexInfo =
           fileFooterConverter.getIndexInfo(entry.getKey(), entry.getValue());
-      List blocks = new ArrayList<>();
+      // carbonindex file stores blocklets so block filename will be duplicated, use set to remove
+      // duplicates
+      Set blocks = new LinkedHashSet<>();
       for (DataFileFooter footer : indexInfo) {
         blocks.add(footer.getBlockInfo().getTableBlockInfo().getFilePath());
       }
-      indexFilesMap.put(entry.getKey(), blocks);
+      indexFilesMap.put(entry.getKey(), new ArrayList<>(blocks));
       boolean added = false;
       for (Map.Entry> mergeFile : indexFileStore
           .getCarbonMergeFileToIndexFilesMap().entrySet()) {


Build failed in Jenkins: carbondata-master-spark-2.1 #2749

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H22 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22558, done.
remote: Compressing objects: 0% .. 50% (3736/7471) [remaining fetch progress output truncated]

Build failed in Jenkins: carbondata-master-spark-2.2 #830

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22484, done.
remote: Compressing objects: 0% .. 50% (3714/7428) [remaining fetch progress output truncated]

carbondata git commit: [CARBONDATA-2799][BloomDataMap] Fix bugs in querying with bloom datamap on preagg with dictionary column

2018-08-02 Thread kunalkapoor
Repository: carbondata
Updated Branches:
  refs/heads/master b65bf9bc7 -> bd6abbbff


[CARBONDATA-2799][BloomDataMap] Fix bugs in querying with bloom datamap on 
preagg with dictionary column

For a preaggregate table, if the group-by column is a dictionary column in the
parent table, the preaggregate table inherits the dictionary encoding as well
as the dictionary file from the parent table.

So for dictionary columns, during a query with bloom, the plain filter value
needs to be converted to the dictionary-encoded value based on the parent
table's dictionary file.

This closes #2580


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/bd6abbbf
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/bd6abbbf
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/bd6abbbf

Branch: refs/heads/master
Commit: bd6abbbffd36b5ca0aaad9d937d401982d1d60eb
Parents: b65bf9b
Author: xuchuanyin 
Authored: Mon Jul 30 17:50:51 2018 +0800
Committer: kunal642 
Committed: Thu Aug 2 16:55:59 2018 +0530

--
 .../datamap/bloom/BloomCoarseGrainDataMap.java  | 21 -
 .../BloomCoarseGrainDataMapFunctionSuite.scala  | 97 
 2 files changed, 117 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/bd6abbbf/datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMap.java
--
diff --git 
a/datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMap.java
 
b/datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMap.java
index be531d6..71b1c55 100644
--- 
a/datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMap.java
+++ 
b/datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMap.java
@@ -47,10 +47,12 @@ import 
org.apache.carbondata.core.devapi.DictionaryGenerationException;
 import org.apache.carbondata.core.indexstore.Blocklet;
 import org.apache.carbondata.core.indexstore.PartitionSpec;
 import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
+import org.apache.carbondata.core.metadata.CarbonMetadata;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.metadata.datatype.DataTypes;
 import org.apache.carbondata.core.metadata.encoder.Encoding;
 import org.apache.carbondata.core.metadata.schema.table.CarbonTable;
+import org.apache.carbondata.core.metadata.schema.table.RelationIdentifier;
 import org.apache.carbondata.core.metadata.schema.table.column.CarbonColumn;
 import org.apache.carbondata.core.scan.expression.ColumnExpression;
 import org.apache.carbondata.core.scan.expression.Expression;
@@ -108,6 +110,7 @@ public class BloomCoarseGrainDataMap extends CoarseGrainDataMap {
     for (CarbonColumn col : indexedColumn) {
       this.name2Col.put(col.getColName(), col);
     }
+    String parentTablePath = getAncestorTablePath(carbonTable);
 
     try {
       this.name2Converters = new HashMap<>(indexedColumn.size());
@@ -129,7 +132,7 @@ public class BloomCoarseGrainDataMap extends CoarseGrainDataMap {
         dataField.setTimestampFormat(tsFormat);
         FieldConverter fieldConverter = FieldEncoderFactory.getInstance()
             .createFieldEncoder(dataField, absoluteTableIdentifier, i, nullFormat, null, false,
-                localCaches[i], false, carbonTable.getTablePath());
+                localCaches[i], false, parentTablePath);
         this.name2Converters.put(indexedColumn.get(i).getColName(), fieldConverter);
       }
     } catch (IOException e) {
@@ -140,6 +143,22 @@ public class BloomCoarseGrainDataMap extends CoarseGrainDataMap {
     this.badRecordLogHolder.setLogged(false);
   }
 
+  /**
+   * recursively find the ancestor's table path. This is used for dictionary scenario
+   * where preagg will use the dictionary of the parent table.
+   */
+  private String getAncestorTablePath(CarbonTable currentTable) {
+    if (!currentTable.isChildDataMap()) {
+      return currentTable.getTablePath();
+    }
+
+    RelationIdentifier parentIdentifier =
+        currentTable.getTableInfo().getParentRelationIdentifiers().get(0);
+    CarbonTable parentTable = CarbonMetadata.getInstance().getCarbonTable(
+        parentIdentifier.getDatabaseName(), parentIdentifier.getTableName());
+    return getAncestorTablePath(parentTable);
+  }
+
   @Override
   public List prune(FilterResolverIntf filterExp, SegmentProperties segmentProperties,
       List partitions) throws IOException {

http://git-wip-us.apache.org/repos/asf/carbondata/blob/bd6abbbf/integration/spark2/src/test/scala/org/apache/carbondata/datamap/bloom/BloomCoarseGrainDataMapFunctionSuite.scala

Build failed in Jenkins: carbondata-master-spark-2.2 #829

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22460, done.
remote: Compressing objects: 0% .. 50% (3707/7413) [remaining fetch progress output truncated]

[2/2] carbondata git commit: [HOTFIX][PR 2575] Fixed modular plan creation only if valid datamaps are available

2018-08-02 Thread kunalkapoor
[HOTFIX][PR 2575] Fixed modular plan creation only if valid datamaps are 
available

The update query fails in a Spark 2.2 cluster when the MV jars are present, because
the catalogs are not empty when datamaps have been created for other tables as well,
so isValidPlan() inside MVAnalyzerRule returns true.

This closes #2579


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/b65bf9bc
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/b65bf9bc
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/b65bf9bc

Branch: refs/heads/master
Commit: b65bf9bc7104cbcfad1277c99090853d9e7b0386
Parents: f52c133
Author: ravipesala 
Authored: Mon Jul 30 15:00:00 2018 +0530
Committer: kunal642 
Committed: Thu Aug 2 16:52:21 2018 +0530

--
 .../carbondata/core/datamap/DataMapCatalog.java |  4 +-
 .../carbondata/mv/datamap/MVAnalyzerRule.scala  | 57 
 .../mv/rewrite/SummaryDatasetCatalog.scala  |  9 +++-
 .../mv/rewrite/MVCreateTestCase.scala   |  4 ++
 4 files changed, 60 insertions(+), 14 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/b65bf9bc/core/src/main/java/org/apache/carbondata/core/datamap/DataMapCatalog.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapCatalog.java 
b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapCatalog.java
index 89f2838..5dd4871 100644
--- a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapCatalog.java
+++ b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapCatalog.java
@@ -38,10 +38,10 @@ public interface DataMapCatalog<T> {
   void unregisterSchema(String dataMapName);
 
   /**
-   * List all registered schema catalogs
+   * List all registered valid schema catalogs
    * @return
    */
-  T[] listAllSchema();
+  T[] listAllValidSchema();
 
   /**
    * It reloads/removes all registered schema catalogs

http://git-wip-us.apache.org/repos/asf/carbondata/blob/b65bf9bc/datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVAnalyzerRule.scala
--
diff --git 
a/datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVAnalyzerRule.scala
 
b/datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVAnalyzerRule.scala
index 483780f..9e0f8e5 100644
--- 
a/datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVAnalyzerRule.scala
+++ 
b/datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVAnalyzerRule.scala
@@ -16,8 +16,11 @@
  */
 package org.apache.carbondata.mv.datamap
 
+import scala.collection.JavaConverters._
+
 import org.apache.spark.sql.SparkSession
 import org.apache.spark.sql.catalyst.analysis.{UnresolvedAlias, UnresolvedAttribute}
+import org.apache.spark.sql.catalyst.catalog.CatalogTable
 import org.apache.spark.sql.catalyst.expressions.{Alias, ScalaUDF}
 import org.apache.spark.sql.catalyst.plans.logical.{Command, DeserializeToObject, LogicalPlan}
 import org.apache.spark.sql.catalyst.rules.Rule
@@ -79,27 +82,59 @@ class MVAnalyzerRule(sparkSession: SparkSession) extends Rule[LogicalPlan] {
     }
   }
 
+  /**
+   * Whether the plan is valid for doing modular plan matching and datamap replacing.
+   */
   def isValidPlan(plan: LogicalPlan, catalog: SummaryDatasetCatalog): Boolean = {
-    !plan.isInstanceOf[Command] && !isDataMapExists(plan, catalog.listAllSchema()) &&
-    !plan.isInstanceOf[DeserializeToObject]
+    if (!plan.isInstanceOf[Command]  && !plan.isInstanceOf[DeserializeToObject]) {
+      val catalogs = extractCatalogs(plan)
+      !isDataMapReplaced(catalog.listAllValidSchema(), catalogs) &&
+      isDataMapExists(catalog.listAllValidSchema(), catalogs)
+    } else {
+      false
+    }
+
   }
   /**
    * Check whether datamap table already updated in the query.
    *
-   * @param plan
-   * @param mvs
-   * @return
+   * @param mvdataSetArray Array of available mvdataset which include modular plans
+   * @return Boolean whether already datamap replaced in the plan or not
    */
-  def isDataMapExists(plan: LogicalPlan, mvs: Array[SummaryDataset]): Boolean = {
-    val catalogs = plan collect {
-      case l: LogicalRelation => l.catalogTable
-    }
-    catalogs.isEmpty || catalogs.exists { c =>
-      mvs.exists { mv =>
+  def isDataMapReplaced(
+      mvdataSetArray: Array[SummaryDataset],
+      catalogs: Seq[Option[CatalogTable]]): Boolean = {
+    catalogs.exists { c =>
+      mvdataSetArray.exists { mv =>
         val identifier = mv.dataMapSchema.getRelationIdentifier
         identifier.getTableName.equals(c.get.identifier.table) &&
         identifier.getDatabaseName.equals(c.get.database)
       }
     }
   }
+
+  /**
+   * Check 
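
(The diff is cut off at this point in the archive. Purely for context, below is a rough sketch of how the two helpers referenced by the new isValidPlan, extractCatalogs and isDataMapExists, could be written against the signatures shown above. This is an illustration only, not the committed code; getParentTables() is an assumed accessor on DataMapSchema and may differ from the real field name.)

  // Illustrative sketch only (not the committed code). Assumes the imports added in
  // the diff above (JavaConverters, CatalogTable) are in scope.

  // Collect the catalog tables referenced by the logical plan.
  private def extractCatalogs(plan: LogicalPlan): Seq[Option[CatalogTable]] = {
    plan collect {
      case l: LogicalRelation => l.catalogTable
    }
  }

  // A plan is worth matching only when at least one table it reads is a parent
  // table of a registered datamap schema.
  def isDataMapExists(
      mvdataSetArray: Array[SummaryDataset],
      catalogs: Seq[Option[CatalogTable]]): Boolean = {
    catalogs.exists { c =>
      mvdataSetArray.exists { mv =>
        // getParentTables() is an assumed accessor returning the MV's source tables.
        mv.dataMapSchema.getParentTables.asScala.exists { parent =>
          parent.getTableName.equals(c.get.identifier.table) &&
          parent.getDatabaseName.equals(c.get.database)
        }
      }
    }
  }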

[1/2] carbondata git commit: [HOTFIX][PR 2575] Fixed modular plan creation only if valid datamaps are available

2018-08-02 Thread kunalkapoor
Repository: carbondata
Updated Branches:
  refs/heads/master b483a5746 -> b65bf9bc7


[HOTFIX][PR 2575] Fixed modular plan creation only if valid datamaps are 
available

The update query fails in a Spark 2.2 cluster when the MV jars are present, because
the catalogs are not empty when datamaps have been created for other tables as well,
so isValidPlan() inside MVAnalyzerRule returns true.

This closes #2579


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/f52c1338
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/f52c1338
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/f52c1338

Branch: refs/heads/master
Commit: f52c13380828ac7f388273cb2460971fc1a5eed1
Parents: b483a57
Author: rahul 
Authored: Mon Jul 30 12:01:49 2018 +0530
Committer: kunal642 
Committed: Thu Aug 2 16:46:16 2018 +0530

--
 .../carbondata/mv/rewrite/MVCreateTestCase.scala  | 14 ++
 1 file changed, 14 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/f52c1338/datamap/mv/core/src/test/scala/org/apache/carbondata/mv/rewrite/MVCreateTestCase.scala
--
diff --git 
a/datamap/mv/core/src/test/scala/org/apache/carbondata/mv/rewrite/MVCreateTestCase.scala
 
b/datamap/mv/core/src/test/scala/org/apache/carbondata/mv/rewrite/MVCreateTestCase.scala
index 6adb14e..0b96202 100644
--- 
a/datamap/mv/core/src/test/scala/org/apache/carbondata/mv/rewrite/MVCreateTestCase.scala
+++ 
b/datamap/mv/core/src/test/scala/org/apache/carbondata/mv/rewrite/MVCreateTestCase.scala
@@ -18,6 +18,7 @@ package org.apache.carbondata.mv.rewrite
 
 import java.io.File
 
+import org.apache.spark.sql.Row
 import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
 import org.apache.spark.sql.execution.datasources.LogicalRelation
 import org.apache.spark.sql.test.util.QueryTest
@@ -885,6 +886,19 @@ class MVCreateTestCase extends QueryTest with BeforeAndAfterAll {
     sql("drop datamap if exists datamap_subqry")
   }
 
+  test("basic scenario") {
+
+    sql("drop table if exists mvtable1")
+    sql("create table mvtable1(name string,age int,salary int) stored by 'carbondata'")
+    sql(" insert into mvtable1 select 'n1',12,12")
+    sql("  insert into mvtable1 select 'n1',12,12")
+    sql(" insert into mvtable1 select 'n3',12,12")
+    sql(" insert into mvtable1 select 'n4',12,12")
+    sql("update mvtable1 set(name) = ('updatedName')").show()
+    checkAnswer(sql("select count(*) from mvtable1 where name = 'updatedName'"),Seq(Row(4)))
+    sql("drop table if exists mvtable1")
+  }
+
   def verifyMVDataMap(logicalPlan: LogicalPlan, dataMapName: String): Boolean = {
     val tables = logicalPlan collect {
       case l: LogicalRelation => l.catalogTable.get



Build failed in Jenkins: carbondata-master-spark-2.2 #828

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22409, done.
remote: Compressing objects:   0% (1/7386) ... remote: Compressing objects:  51% [flattened progress output truncated here]

Build failed in Jenkins: carbondata-master-spark-2.2 #827

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22409, done.
remote: Compressing objects:   0% (1/7386) ... remote: Compressing objects:  51% [flattened progress output truncated here]

carbondata git commit: [CARBONDATA-2792][schema restructure] Create external table fails post schema restructure.

2018-08-02 Thread kunalkapoor
Repository: carbondata
Updated Branches:
  refs/heads/master 625a2efa7 -> b483a5746


[CARBONDATA-2792][schema restructure] Create external table fails post schema 
restructure.

Problem
Once the table schema has been restructured (a column dropped and a column added), the API
org.apache.carbondata.spark.util.CarbonSparkUtil.getRawSchema(carbonRelation: CarbonRelation): String,
which returns the raw schema string of the visible columns in ascending order of their
ordinal value, throws an ArrayIndexOutOfBoundsException while creating an external table.
The API builds an array of the raw column schemas of the visible columns in ascending order
of schema ordinal, using the schemaOrdinal as the array index; a column whose schemaOrdinal
is greater than the visible column count therefore causes the ArrayIndexOutOfBoundsException.

Solution
Filter the visible and valid columns,
sort them by schema ordinal, and
prepare the raw column schema from the sorted columns' indices.

This closes #2571
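
(To make the failure mode concrete, here is a small self-contained sketch, illustrative only and not the CarbonSparkUtil code; the Column case class and helper names are invented for this example. It shows why indexing by schemaOrdinal overflows after a drop/add restructure and how filtering and sorting by ordinal avoids it.)

// Illustration only; types and names below are invented for this sketch.
case class Column(name: String, dataType: String, schemaOrdinal: Int, invisible: Boolean = false)

// Old behaviour: use schemaOrdinal as an index into an array sized to the
// visible-column count. After "drop c2; add c4", c4 has ordinal 2 but the
// array has size 2, so this throws ArrayIndexOutOfBoundsException.
def rawSchemaBuggy(columns: Seq[Column]): String = {
  val visible = columns.filterNot(_.invisible)
  val arr = new Array[String](visible.size)
  visible.foreach(c => arr(c.schemaOrdinal) = s"${c.name} ${c.dataType}")
  arr.mkString(", ")
}

// Fixed behaviour: filter the visible/valid columns, sort them by schema
// ordinal, and build the raw schema from the sorted sequence itself.
def rawSchemaFixed(columns: Seq[Column]): String =
  columns
    .filterNot(_.invisible)
    .sortBy(_.schemaOrdinal)
    .map(c => s"${c.name} ${c.dataType}")
    .mkString(", ")

// Example mirroring the rstest1 table from the test below:
// rawSchemaBuggy(cols) throws; rawSchemaFixed(cols) == "c1 string, c4 string"
val cols = Seq(
  Column("c1", "string", 0),
  Column("c2", "int", 1, invisible = true), // dropped column
  Column("c4", "string", 2))                // newly added column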


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/b483a574
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/b483a574
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/b483a574

Branch: refs/heads/master
Commit: b483a57464a860fe9dbd2d074a3e08fd59141edc
Parents: 625a2ef
Author: mohammadshahidkhan 
Authored: Fri Jul 27 12:51:49 2018 +0530
Committer: kunal642 
Committed: Thu Aug 2 16:38:02 2018 +0530

--
 .../createTable/TestCreateExternalTable.scala | 14 ++
 .../carbondata/spark/util/CarbonSparkUtil.scala   | 14 +-
 2 files changed, 23 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/b483a574/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateExternalTable.scala
--
diff --git 
a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateExternalTable.scala
 
b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateExternalTable.scala
index a9b8d57..6fb24c7 100644
--- 
a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateExternalTable.scala
+++ 
b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateExternalTable.scala
@@ -32,6 +32,8 @@ class TestCreateExternalTable extends QueryTest with BeforeAndAfterAll {
 
   override def beforeAll(): Unit = {
     sql("DROP TABLE IF EXISTS origin")
+    sql("drop table IF EXISTS rsext")
+    sql("drop table IF EXISTS rstest1")
     // create carbon table and insert data
     sql("CREATE TABLE origin(key INT, value STRING) STORED BY 'carbondata'")
     sql("INSERT INTO origin select 100,'spark'")
@@ -41,6 +43,8 @@ class TestCreateExternalTable extends QueryTest with BeforeAndAfterAll {
 
   override def afterAll(): Unit = {
     sql("DROP TABLE IF EXISTS origin")
+    sql("drop table IF EXISTS rsext")
+    sql("drop table IF EXISTS rstest1")
   }
 
   test("create external table with existing files") {
@@ -111,5 +115,15 @@ class TestCreateExternalTable extends QueryTest with BeforeAndAfterAll {
     }
     assert(exception.getMessage().contains("Create external table as select"))
   }
+  test("create external table with post schema resturcture") {
+    sql("create table rstest1 (c1 string,c2 int) STORED BY 'org.apache.carbondata.format'")
+    sql("Alter table rstest1 drop columns(c2)")
+    sql(
+      "Alter table rstest1 add columns(c4 string) TBLPROPERTIES('DICTIONARY_EXCLUDE'='c4', " +
+      "'DEFAULT.VALUE.c4'='def')")
+    sql(s"""CREATE EXTERNAL TABLE rsext STORED BY 'carbondata' LOCATION '$storeLocation/rstest1'""")
+    sql("insert into rsext select 'shahid', 1")
+    checkAnswer(sql("select * from rstest1"),  sql("select * from rsext"))
+  }
 
 }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/b483a574/integration/spark2/src/main/scala/org/apache/carbondata/spark/util/CarbonSparkUtil.scala
--
diff --git 
a/integration/spark2/src/main/scala/org/apache/carbondata/spark/util/CarbonSparkUtil.scala
 
b/integration/spark2/src/main/scala/org/apache/carbondata/spark/util/CarbonSparkUtil.scala
index b9e2442..a0c0545 100644
--- 
a/integration/spark2/src/main/scala/org/apache/carbondata/spark/util/CarbonSparkUtil.scala
+++ 
b/integration/spark2/src/main/scala/org/apache/carbondata/spark/util/CarbonSparkUtil.scala
@@ -18,13 +18,14 @@
 package org.apache.carbondata.spark.util
 
 import scala.collection.JavaConverters._
+import scala.collection.mutable
 
 import 

Jenkins build is back to stable : carbondata-master-spark-2.2 » Apache CarbonData :: Spark2 #826

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : carbondata-master-spark-2.2 #826

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : carbondata-master-spark-2.1 #2746

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #2746

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is unstable: carbondata-master-spark-2.2 #824

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build became unstable: carbondata-master-spark-2.2 » Apache CarbonData :: Spark2 #824

2018-08-02 Thread Apache Jenkins Server
See 




Build failed in Jenkins: carbondata-master-spark-2.2 #825

2018-08-02 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on H33 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
 > git --version # timeout=10
using GIT_SSH to set credentials 
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/carbondata.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/carbondata.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/carbondata.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: remote: Counting objects: 22388, done.
remote: Compressing objects:   0% (1/7362) ... remote: Compressing objects:  51% [flattened progress output truncated here]

Jenkins build became unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #2745

2018-08-02 Thread Apache Jenkins Server
See 




Jenkins build is unstable: carbondata-master-spark-2.1 #2745

2018-08-02 Thread Apache Jenkins Server
See