Jenkins build is back to stable : carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #1629

2017-11-18 Thread Apache Jenkins Server

Jenkins build is back to stable : carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1629

2017-11-18 Thread Apache Jenkins Server

Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1628

2017-11-18 Thread Apache Jenkins Server

Jenkins build became unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #1628

2017-11-18 Thread Apache Jenkins Server

Jenkins build is still unstable: carbondata-master-spark-2.1 #1628

2017-11-18 Thread Apache Jenkins Server

Jenkins build is still unstable: carbondata-master-spark-2.1 #1627

2017-11-18 Thread Apache Jenkins Server

Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1627

2017-11-18 Thread Apache Jenkins Server

Jenkins build is back to stable : carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #1627

2017-11-18 Thread Apache Jenkins Server

[1/2] carbondata git commit: [CARBONDATA-1734] Ignore empty line while reading CSV

2017-11-18 Thread kumarvishal09
Repository: carbondata
Updated Branches:
  refs/heads/master c2528975a -> 198e5b689


[CARBONDATA-1734] Ignore empty line while reading CSV


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/40f06084
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/40f06084
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/40f06084

Branch: refs/heads/master
Commit: 40f06084a9969e5dd7e14055633efdba8ea2190d
Parents: c252897
Author: dhatchayani 
Authored: Fri Nov 17 12:41:49 2017 +0530
Committer: kumarvishal 
Committed: Sat Nov 18 21:17:58 2017 +0530

--
 .../core/constants/CarbonCommonConstants.java   |  8 +-
 .../constants/CarbonLoadOptionConstants.java|  6 ++
 .../src/test/resources/emptylines.csv   |  7 ++
 .../testsuite/emptyrow/TestSkipEmptyLines.scala | 99 
 .../carbondata/spark/util/CommonUtil.scala  |  1 +
 .../carbondata/spark/util/DataLoadingUtil.scala |  9 ++
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala | 11 ++-
 .../loading/DataLoadProcessBuilder.java |  2 +
 .../constants/DataLoadProcessorConstants.java   |  2 +
 .../loading/csvinput/CSVInputFormat.java| 28 +-
 .../loading/model/CarbonLoadModel.java  | 15 +++
 11 files changed, 185 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/40f06084/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
index 762ef6d..4046538 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
@@ -1379,7 +1379,6 @@ public final class CarbonCommonConstants {
   public static final String CARBON_MERGE_INDEX_IN_SEGMENT_DEFAULT = "true";
 
   public static final String AGGREGATIONDATAMAPSCHEMA = 
"AggregateDataMapHandler";
-
   /*
* The total size of carbon data
*/
@@ -1406,6 +1405,13 @@ public final class CarbonCommonConstants {
 
   public static final String LAST_UPDATE_TIME = "Last Update Time";
 
+  /**
+   * this will be used to skip / ignore empty lines while loading
+   */
+  @CarbonProperty public static final String CARBON_SKIP_EMPTY_LINE = 
"carbon.skip.empty.line";
+
+  public static final String CARBON_SKIP_EMPTY_LINE_DEFAULT = "false";
+
   private CarbonCommonConstants() {
   }
 }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/40f06084/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
index e78d125..30d5959 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
@@ -46,6 +46,12 @@ public final class CarbonLoadOptionConstants {
   public static final String CARBON_OPTIONS_IS_EMPTY_DATA_BAD_RECORD_DEFAULT = 
"false";
 
   /**
+   * option to specify whether to skip empty lines in load
+   */
+  @CarbonProperty public static final String CARBON_OPTIONS_SKIP_EMPTY_LINE =
+  "carbon.options.is.empty.data.bad.record";
+
+  /**
* option to specify the dateFormat in load option for all date columns in 
table
*/
   @CarbonProperty

http://git-wip-us.apache.org/repos/asf/carbondata/blob/40f06084/integration/spark-common-test/src/test/resources/emptylines.csv
--
diff --git a/integration/spark-common-test/src/test/resources/emptylines.csv 
b/integration/spark-common-test/src/test/resources/emptylines.csv
new file mode 100644
index 000..67f3cfe
--- /dev/null
+++ b/integration/spark-common-test/src/test/resources/emptylines.csv
@@ -0,0 +1,7 @@
+name,age
+a,25
+
+b,22
+
+c,23
+
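The new test resource above interleaves blank rows with data rows. As a minimal, self-contained sketch of the intended behavior (not CarbonData's actual CSVInputFormat code), the effect of enabling `carbon.skip.empty.line` can be modeled as a filter applied while reading: blank lines are dropped instead of surfacing as null or bad records.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

public class SkipEmptyLinesSketch {
    // Models carbon.skip.empty.line=true: blank rows are silently dropped.
    static List<String> readCsv(String csv, boolean skipEmptyLine) {
        List<String> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new StringReader(csv))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (skipEmptyLine && line.trim().isEmpty()) {
                    continue; // ignore empty line while reading CSV
                }
                rows.add(line);
            }
        } catch (IOException e) { // cannot happen for an in-memory reader
            throw new UncheckedIOException(e);
        }
        return rows;
    }

    public static void main(String[] args) {
        String csv = "name,age\na,25\n\nb,22\n\nc,23\n\n";
        System.out.println(readCsv(csv, true)); // [name,age, a,25, b,22, c,23]
    }
}
```

With the flag off, the same input yields seven rows (three of them empty), which is what previously produced spurious records.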

http://git-wip-us.apache.org/repos/asf/carbondata/blob/40f06084/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/emptyrow/TestSkipEmptyLines.scala
--
diff --git 
a/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/emptyrow/TestSkipEmptyLines.scala
 

[2/2] carbondata git commit: [CARBONDATA-1734] Ignore empty line while reading CSV This closes #1520

2017-11-18 Thread kumarvishal09
[CARBONDATA-1734] Ignore empty line while reading CSV This closes #1520


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/198e5b68
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/198e5b68
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/198e5b68

Branch: refs/heads/master
Commit: 198e5b689b721bb959c779e35609d463e9f171e5
Parents: c252897 40f0608
Author: kumarvishal 
Authored: Sat Nov 18 21:18:41 2017 +0530
Committer: kumarvishal 
Committed: Sat Nov 18 21:18:41 2017 +0530

--
 .../core/constants/CarbonCommonConstants.java   |  8 +-
 .../constants/CarbonLoadOptionConstants.java|  6 ++
 .../src/test/resources/emptylines.csv   |  7 ++
 .../testsuite/emptyrow/TestSkipEmptyLines.scala | 99 
 .../carbondata/spark/util/CommonUtil.scala  |  1 +
 .../carbondata/spark/util/DataLoadingUtil.scala |  9 ++
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala | 11 ++-
 .../loading/DataLoadProcessBuilder.java |  2 +
 .../constants/DataLoadProcessorConstants.java   |  2 +
 .../loading/csvinput/CSVInputFormat.java| 28 +-
 .../loading/model/CarbonLoadModel.java  | 15 +++
 11 files changed, 185 insertions(+), 3 deletions(-)
--




carbondata git commit: [CARBONDATA-1765] Remove repeat code of Boolean

2017-11-18 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/master 0a972e0a3 -> c2528975a


[CARBONDATA-1765] Remove repeat code of Boolean

Remove repeat code of Boolean

This closes #1529


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/c2528975
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/c2528975
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/c2528975

Branch: refs/heads/master
Commit: c2528975ae76ff61a1bce73701a9e9162421fcfa
Parents: 0a972e0
Author: xubo245 <601450...@qq.com>
Authored: Sat Nov 18 17:39:37 2017 +0800
Committer: Jacky Li 
Committed: Sat Nov 18 23:46:34 2017 +0800

--
 .../main/java/org/apache/carbondata/core/util/DataTypeUtil.java  | 4 
 1 file changed, 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/c2528975/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java 
b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
index 3a25988..d8c13a3 100644
--- a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
+++ b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
@@ -307,8 +307,6 @@ public final class DataTypeUtil {
   return ByteUtil.toBytes(BooleanConvert.parseBoolean(dimensionValue));
 } else if (actualDataType == DataTypes.STRING) {
   return ByteUtil.toBytes(dimensionValue);
-} else if (actualDataType == DataTypes.BOOLEAN) {
-  return ByteUtil.toBytes(Boolean.parseBoolean(dimensionValue));
 } else if (actualDataType == DataTypes.SHORT) {
   return ByteUtil.toBytes(Short.parseShort(dimensionValue));
 } else if (actualDataType == DataTypes.INT) {
@@ -354,8 +352,6 @@ public final class DataTypeUtil {
 return ByteUtil.toBoolean(dataInBytes);
   } else if (actualDataType == DataTypes.STRING) {
 return getDataTypeConverter().convertFromByteToUTF8String(dataInBytes);
-  } else if (actualDataType == DataTypes.BOOLEAN) {
-return ByteUtil.toBoolean(dataInBytes);
   } else if (actualDataType == DataTypes.SHORT) {
 return ByteUtil.toShort(dataInBytes, 0, dataInBytes.length);
   } else if (actualDataType == DataTypes.INT) {
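The branches removed in this diff were unreachable: each if/else chain already tests `DataTypes.BOOLEAN` in an earlier condition, so an identical test later in the same chain can never fire. A toy illustration (hypothetical `Kind` enum standing in for the real `DataTypes`):

```java
public class DeadBranchDemo {
    enum Kind { BOOLEAN, STRING, SHORT }

    static String dispatch(Kind k) {
        if (k == Kind.BOOLEAN) {
            return "first";     // every BOOLEAN value is handled here
        } else if (k == Kind.STRING) {
            return "string";
        } else if (k == Kind.BOOLEAN) {
            return "duplicate"; // dead code: the condition was consumed above
        }
        return "other";
    }

    public static void main(String[] args) {
        System.out.println(dispatch(Kind.BOOLEAN)); // first
    }
}
```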



[12/28] carbondata git commit: [CARBONDATA-1326] Fixed high priority findbug issue

2017-11-18 Thread jackylk
[CARBONDATA-1326] Fixed high priority findbug issue

 Fixed high priority findbug issue

This closes #1507


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/0f46ef04
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/0f46ef04
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/0f46ef04

Branch: refs/heads/fgdatamap
Commit: 0f46ef04d66a513f0987b05ace393016c151fd1c
Parents: 5fc7f06
Author: dhatchayani 
Authored: Thu Nov 16 16:52:18 2017 +0530
Committer: Jacky Li 
Committed: Fri Nov 17 14:51:58 2017 +0800

--
 .../core/cache/dictionary/ManageDictionaryAndBTree.java | 12 +---
 .../datastore/page/encoding/bool/BooleanConvert.java|  4 +++-
 .../core/statusmanager/SegmentStatusManager.java|  8 ++--
 .../org/apache/carbondata/core/util/CarbonUtil.java |  3 ++-
 4 files changed, 16 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/0f46ef04/core/src/main/java/org/apache/carbondata/core/cache/dictionary/ManageDictionaryAndBTree.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/cache/dictionary/ManageDictionaryAndBTree.java
 
b/core/src/main/java/org/apache/carbondata/core/cache/dictionary/ManageDictionaryAndBTree.java
index f8d2495..4f8f724 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/cache/dictionary/ManageDictionaryAndBTree.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/cache/dictionary/ManageDictionaryAndBTree.java
@@ -102,14 +102,12 @@ public class ManageDictionaryAndBTree {
 // clear Btree cache from LRU cache
 LoadMetadataDetails[] loadMetadataDetails =
 
SegmentStatusManager.readLoadMetadata(carbonTable.getMetaDataFilepath());
-if (null != loadMetadataDetails) {
-  String[] segments = new String[loadMetadataDetails.length];
-  int i = 0;
-  for (LoadMetadataDetails loadMetadataDetail : loadMetadataDetails) {
-segments[i++] = loadMetadataDetail.getLoadName();
-  }
-  invalidateBTreeCache(carbonTable.getAbsoluteTableIdentifier(), segments);
+String[] segments = new String[loadMetadataDetails.length];
+int i = 0;
+for (LoadMetadataDetails loadMetadataDetail : loadMetadataDetails) {
+  segments[i++] = loadMetadataDetail.getLoadName();
 }
+invalidateBTreeCache(carbonTable.getAbsoluteTableIdentifier(), segments);
 // clear dictionary cache from LRU cache
 List dimensions =
 carbonTable.getDimensionByTableName(carbonTable.getTableName());

http://git-wip-us.apache.org/repos/asf/carbondata/blob/0f46ef04/core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/bool/BooleanConvert.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/bool/BooleanConvert.java
 
b/core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/bool/BooleanConvert.java
index b373adf..10a9767 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/bool/BooleanConvert.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/bool/BooleanConvert.java
@@ -17,6 +17,8 @@
 
 package org.apache.carbondata.core.datastore.page.encoding.bool;
 
+import java.util.Locale;
+
 /**
  * convert tools for boolean data type
  */
@@ -51,7 +53,7 @@ public class BooleanConvert {
* @return Boolean type data
*/
   public static Boolean parseBoolean(String input) {
-String value = input.toLowerCase();
+String value = input.toLowerCase(Locale.getDefault());
 if (("false").equals(value)) {
   return Boolean.FALSE;
 } else if (("true").equals(value)) {
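The findbugs warning fixed here flags locale-dependent case conversion: `String.toLowerCase()` without an explicit `Locale` uses the JVM default, which can change results (the classic example is Turkish, where 'I' lower-cases to dotless 'ı'). Passing `Locale.getDefault()` makes the dependency explicit; `Locale.ROOT` would go further and make the conversion locale-independent, though for the literals "true"/"false" (which contain no 'I') either choice behaves the same. A small demonstration:

```java
import java.util.Locale;

public class LocaleCaseDemo {
    public static void main(String[] args) {
        Locale turkish = Locale.forLanguageTag("tr");
        // Locale-sensitive lower-casing: Turkish maps 'I' to dotless 'ı'.
        System.out.println("FILE".toLowerCase(turkish));     // fıle
        // Locale.ROOT is stable regardless of the JVM's default locale.
        System.out.println("FILE".toLowerCase(Locale.ROOT)); // file
        // "TRUE" contains no 'I', so the boolean parser is safe either way.
        System.out.println("TRUE".toLowerCase(turkish));     // true
    }
}
```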

http://git-wip-us.apache.org/repos/asf/carbondata/blob/0f46ef04/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
index e3dbfed..1944f96 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
@@ -120,6 +120,10 @@ public class SegmentStatusManager {
 BufferedReader buffReader =
 new BufferedReader(new InputStreamReader(dataInputStream, 
"UTF-8"));
 loadFolderDetailsArray = gson.fromJson(buffReader, 
LoadMetadataDetails[].class);
+// if loadFolderDetailsArray is null, assign an empty array
+if (null == 

[04/28] carbondata git commit: [CARBONDATA-1608]Support Column Comment for Create Table This closes #1432

2017-11-18 Thread jackylk
[CARBONDATA-1608]Support Column Comment for Create Table This closes #1432


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/808a334f
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/808a334f
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/808a334f

Branch: refs/heads/fgdatamap
Commit: 808a334f03ea00856e7c6556418835e6d29c1e50
Parents: 17892b1 9c9521b
Author: kumarvishal 
Authored: Thu Nov 16 20:42:10 2017 +0530
Committer: kumarvishal 
Committed: Thu Nov 16 20:42:10 2017 +0530

--
 .../TestCreateTableWithColumnComment.scala  | 54 
 .../CarbonDescribeFormattedCommand.scala| 19 ---
 .../sql/parser/CarbonSpark2SqlParser.scala  | 14 +++--
 .../BooleanDataTypesInsertTest.scala| 40 +++
 4 files changed, 115 insertions(+), 12 deletions(-)
--




[07/28] carbondata git commit: [CARBONDATA-1732] Add S3 support in FileFactory

2017-11-18 Thread jackylk
[CARBONDATA-1732] Add S3 support in FileFactory

This closes #1504


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/733bb516
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/733bb516
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/733bb516

Branch: refs/heads/fgdatamap
Commit: 733bb516dc3fc4a1e2be02b6574c70aafa7d3b9d
Parents: 6551620
Author: Jacky Li 
Authored: Thu Nov 16 17:27:21 2017 +0800
Committer: QiangCai 
Committed: Fri Nov 17 09:54:56 2017 +0800

--
 .../core/constants/CarbonCommonConstants.java   | 21 +++---
 .../core/datastore/impl/FileFactory.java| 30 +---
 .../apache/carbondata/core/util/CarbonUtil.java | 11 ---
 .../core/util/path/HDFSLeaseUtils.java  |  1 +
 .../carbondata/hadoop/util/SchemaReader.java|  1 +
 .../spark/rdd/CarbonDataRDDFactory.scala| 20 +
 6 files changed, 61 insertions(+), 23 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/733bb516/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
index aeca19f..0a7dfdd 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
@@ -147,19 +147,21 @@ public final class CarbonCommonConstants {
* Load Folder Name
*/
   public static final String LOAD_FOLDER = "Segment_";
-  /**
-   * HDFSURL_PREFIX
-   */
+
   public static final String HDFSURL_PREFIX = "hdfs://";
-  /**
-   * VIEWFSURL_PREFIX
-   */
+
+  public static final String LOCAL_FILE_PREFIX = "file://";
+
   public static final String VIEWFSURL_PREFIX = "viewfs://";
 
-  /**
-   * ALLUXIO_PREFIX
-   */
   public static final String ALLUXIOURL_PREFIX = "alluxio://";
+
+  public static final String S3_PREFIX = "s3://";
+
+  public static final String S3N_PREFIX = "s3n://";
+
+  public static final String S3A_PREFIX = "s3a://";
+
   /**
* FS_DEFAULT_FS
*/
@@ -1261,7 +1263,6 @@ public final class CarbonCommonConstants {
 
   public static final String MAJOR = "major";
 
-  public static final String LOCAL_FILE_PREFIX = "file://";
   @CarbonProperty
   public static final String CARBON_CUSTOM_BLOCK_DISTRIBUTION = 
"carbon.custom.block.distribution";
   public static final String CARBON_CUSTOM_BLOCK_DISTRIBUTION_DEFAULT = 
"false";

http://git-wip-us.apache.org/repos/asf/carbondata/blob/733bb516/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java 
b/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
index e4e4ae2..57a48ec 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
@@ -73,6 +73,7 @@ public final class FileFactory {
   case HDFS:
   case ALLUXIO:
   case VIEWFS:
+  case S3:
 return new DFSFileHolderImpl();
   default:
 return new FileHolderImpl();
@@ -80,12 +81,17 @@ public final class FileFactory {
   }
 
   public static FileType getFileType(String path) {
-if (path.startsWith(CarbonCommonConstants.HDFSURL_PREFIX)) {
+String lowerPath = path.toLowerCase();
+if (lowerPath.startsWith(CarbonCommonConstants.HDFSURL_PREFIX)) {
   return FileType.HDFS;
-} else if (path.startsWith(CarbonCommonConstants.ALLUXIOURL_PREFIX)) {
+} else if (lowerPath.startsWith(CarbonCommonConstants.ALLUXIOURL_PREFIX)) {
   return FileType.ALLUXIO;
-} else if (path.startsWith(CarbonCommonConstants.VIEWFSURL_PREFIX)) {
+} else if (lowerPath.startsWith(CarbonCommonConstants.VIEWFSURL_PREFIX)) {
   return FileType.VIEWFS;
+} else if (lowerPath.startsWith(CarbonCommonConstants.S3N_PREFIX) ||
+lowerPath.startsWith(CarbonCommonConstants.S3A_PREFIX) ||
+lowerPath.startsWith(CarbonCommonConstants.S3_PREFIX)) {
+  return FileType.S3;
 }
 return FileType.LOCAL;
   }
@@ -99,6 +105,7 @@ public final class FileFactory {
   case LOCAL:
 return new LocalCarbonFile(getUpdatedFilePath(path, fileType));
   case HDFS:
+  case S3:
 return new HDFSCarbonFile(path);
   case ALLUXIO:
 return new AlluxioCarbonFile(path);
@@ -134,6 +141,7 @@ public final class 

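The dispatch the patch adds to `FileFactory.getFileType` can be condensed into a self-contained sketch (prefix strings inlined for brevity rather than read from `CarbonCommonConstants`): lower-case the path once, test each known URI scheme, and fall back to `LOCAL` for anything unmatched. All three S3 scheme variants map to the same `S3` file type, which in turn reuses the HDFS-style file abstractions.

```java
public class FileTypeSketch {
    enum FileType { LOCAL, HDFS, ALLUXIO, VIEWFS, S3 }

    // Mirrors the prefix matching in the patch; unmatched paths are LOCAL.
    static FileType getFileType(String path) {
        String lowerPath = path.toLowerCase(java.util.Locale.ROOT);
        if (lowerPath.startsWith("hdfs://")) {
            return FileType.HDFS;
        } else if (lowerPath.startsWith("alluxio://")) {
            return FileType.ALLUXIO;
        } else if (lowerPath.startsWith("viewfs://")) {
            return FileType.VIEWFS;
        } else if (lowerPath.startsWith("s3n://")
                || lowerPath.startsWith("s3a://")
                || lowerPath.startsWith("s3://")) {
            return FileType.S3;
        }
        return FileType.LOCAL;
    }

    public static void main(String[] args) {
        System.out.println(getFileType("s3a://bucket/table")); // S3
        System.out.println(getFileType("/tmp/local/path"));    // LOCAL
    }
}
```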
[09/28] carbondata git commit: [CARBONDATA-1739] Clean up store path interface

2017-11-18 Thread jackylk
http://git-wip-us.apache.org/repos/asf/carbondata/blob/5fc7f06f/integration/spark2/src/main/scala/org/apache/spark/util/AlterTableUtil.scala
--
diff --git 
a/integration/spark2/src/main/scala/org/apache/spark/util/AlterTableUtil.scala 
b/integration/spark2/src/main/scala/org/apache/spark/util/AlterTableUtil.scala
index 153b169..07491d1 100644
--- 
a/integration/spark2/src/main/scala/org/apache/spark/util/AlterTableUtil.scala
+++ 
b/integration/spark2/src/main/scala/org/apache/spark/util/AlterTableUtil.scala
@@ -65,7 +65,7 @@ object AlterTableUtil {
   sys.error(s"Table $dbName.$tableName does not exist")
 }
 // acquire the lock first
-val table = relation.tableMeta.carbonTable
+val table = relation.carbonTable
 val acquiredLocks = ListBuffer[ICarbonLock]()
 try {
   locksToBeAcquired.foreach { lock =>
@@ -133,7 +133,7 @@ object AlterTableUtil {
   thriftTable: TableInfo)(sparkSession: SparkSession,
   sessionState: CarbonSessionState): Unit = {
 val dbName = carbonTable.getDatabaseName
-val tableName = carbonTable.getFactTableName
+val tableName = carbonTable.getTableName
 CarbonEnv.getInstance(sparkSession).carbonMetastore
   .updateTableSchemaForAlter(carbonTable.getCarbonTableIdentifier,
 carbonTable.getCarbonTableIdentifier,
@@ -232,10 +232,7 @@ object AlterTableUtil {
   def revertAddColumnChanges(dbName: String, tableName: String, timeStamp: 
Long)
 (sparkSession: SparkSession): Unit = {
 val metastore = CarbonEnv.getInstance(sparkSession).carbonMetastore
-val carbonTable = metastore
-  .lookupRelation(Some(dbName), 
tableName)(sparkSession).asInstanceOf[CarbonRelation].tableMeta
-  .carbonTable
-
+val carbonTable = CarbonEnv.getCarbonTable(Some(dbName), 
tableName)(sparkSession)
 val carbonTablePath = 
CarbonStorePath.getCarbonTablePath(carbonTable.getTablePath,
   carbonTable.getCarbonTableIdentifier)
 val thriftTable: TableInfo = 
metastore.getThriftTableInfo(carbonTablePath)(sparkSession)
@@ -262,9 +259,7 @@ object AlterTableUtil {
   def revertDropColumnChanges(dbName: String, tableName: String, timeStamp: 
Long)
 (sparkSession: SparkSession): Unit = {
 val metastore = CarbonEnv.getInstance(sparkSession).carbonMetastore
-val carbonTable = metastore
-  .lookupRelation(Some(dbName), 
tableName)(sparkSession).asInstanceOf[CarbonRelation].tableMeta
-  .carbonTable
+val carbonTable = CarbonEnv.getCarbonTable(Some(dbName), 
tableName)(sparkSession)
 val carbonTablePath = 
CarbonStorePath.getCarbonTablePath(carbonTable.getTablePath,
   carbonTable.getCarbonTableIdentifier)
 val thriftTable: TableInfo = 
metastore.getThriftTableInfo(carbonTablePath)(sparkSession)
@@ -297,9 +292,7 @@ object AlterTableUtil {
   def revertDataTypeChanges(dbName: String, tableName: String, timeStamp: Long)
 (sparkSession: SparkSession): Unit = {
 val metastore = CarbonEnv.getInstance(sparkSession).carbonMetastore
-val carbonTable = metastore
-  .lookupRelation(Some(dbName), 
tableName)(sparkSession).asInstanceOf[CarbonRelation].tableMeta
-  .carbonTable
+val carbonTable = CarbonEnv.getCarbonTable(Some(dbName), 
tableName)(sparkSession)
 val carbonTablePath = 
CarbonStorePath.getCarbonTablePath(carbonTable.getTablePath,
   carbonTable.getCarbonTableIdentifier)
 val thriftTable: TableInfo = 
metastore.getThriftTableInfo(carbonTablePath)(sparkSession)
@@ -343,30 +336,27 @@ object AlterTableUtil {
 val locksToBeAcquired = List(LockUsage.METADATA_LOCK, 
LockUsage.COMPACTION_LOCK)
 var locks = List.empty[ICarbonLock]
 var timeStamp = 0L
-var newCols = 
Seq[org.apache.carbondata.core.metadata.schema.table.column.ColumnSchema]()
 var carbonTable: CarbonTable = null
 try {
   locks = AlterTableUtil
 .validateTableAndAcquireLock(dbName, tableName, 
locksToBeAcquired)(sparkSession)
   val metastore = CarbonEnv.getInstance(sparkSession).carbonMetastore
-  carbonTable = metastore
-.lookupRelation(Some(dbName), 
tableName)(sparkSession).asInstanceOf[CarbonRelation]
-.tableMeta.carbonTable
+  carbonTable = CarbonEnv.getCarbonTable(Some(dbName), 
tableName)(sparkSession)
   // get the latest carbon table
   // read the latest schema file
   val carbonTablePath = 
CarbonStorePath.getCarbonTablePath(carbonTable.getTablePath,
 carbonTable.getCarbonTableIdentifier)
   val thriftTableInfo: TableInfo = 
metastore.getThriftTableInfo(carbonTablePath)(sparkSession)
   val schemaConverter = new ThriftWrapperSchemaConverterImpl()
-  val wrapperTableInfo = schemaConverter
-.fromExternalToWrapperTableInfo(thriftTableInfo,
-  dbName,
-  tableName,
-  carbonTable.getTablePath)
+  val wrapperTableInfo = schemaConverter.fromExternalToWrapperTableInfo(
+thriftTableInfo,
+

[22/28] carbondata git commit: [CARBONDATA-1614][Streaming] Show file format for segment

2017-11-18 Thread jackylk
[CARBONDATA-1614][Streaming] Show file format for segment

This closes #1498


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ee71610e
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/ee71610e
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/ee71610e

Branch: refs/heads/fgdatamap
Commit: ee71610e1c7686117f3feebab75fdeb82dc31d54
Parents: 91355ef
Author: Jacky Li 
Authored: Wed Nov 15 00:05:03 2017 +0800
Committer: ravipesala 
Committed: Sat Nov 18 00:28:25 2017 +0530

--
 .../carbondata/core/statusmanager/FileFormat.java  | 17 +++--
 .../core/statusmanager/LoadMetadataDetails.java|  2 +-
 .../apache/carbondata/hadoop/CarbonInputSplit.java |  2 +-
 .../carbondata/hadoop/CarbonMultiBlockSplit.java   |  2 +-
 .../hadoop/api/CarbonTableInputFormat.java |  6 +++---
 .../streaming/CarbonStreamInputFormatTest.java |  2 +-
 .../org/apache/carbondata/api/CarbonStore.scala|  3 ++-
 .../carbondata/spark/rdd/CarbonMergerRDD.scala | 12 ++--
 .../carbondata/spark/rdd/CarbonScanRDD.scala   |  6 +++---
 .../apache/carbondata/spark/util/CommonUtil.scala  |  4 ++--
 .../apache/spark/sql/CarbonCatalystOperators.scala |  1 +
 .../segmentreading/TestSegmentReading.scala|  2 +-
 .../carbondata/TestStreamingTableOperation.scala   |  7 ---
 .../streaming/segment/StreamSegment.java   |  6 +++---
 14 files changed, 40 insertions(+), 32 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/ee71610e/core/src/main/java/org/apache/carbondata/core/statusmanager/FileFormat.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/FileFormat.java 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/FileFormat.java
index 83a4813..c154c5f 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/FileFormat.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/FileFormat.java
@@ -18,23 +18,28 @@
 package org.apache.carbondata.core.statusmanager;
 
 /**
- * the data file format which was supported
+ * The data file format supported in carbondata project
  */
 public enum FileFormat {
-  carbondata, rowformat;
+
+  // carbondata columnar file format, optimized for read
+  COLUMNAR_V3,
+
+  // carbondata row file format, optimized for write
+  ROW_V1;
 
   public static FileFormat getByOrdinal(int ordinal) {
 if (ordinal < 0 || ordinal >= FileFormat.values().length) {
-  return carbondata;
+  return COLUMNAR_V3;
 }
 
 switch (ordinal) {
   case 0:
-return carbondata;
+return COLUMNAR_V3;
   case 1:
-return rowformat;
+return ROW_V1;
 }
 
-return carbondata;
+return COLUMNAR_V3;
   }
 }
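Since `getByOrdinal` already bounds-checks the ordinal, the switch could equivalently index `values()`; a compact sketch that preserves the patch's `COLUMNAR_V3` fallback for out-of-range ordinals:

```java
public class FileFormatDemo {
    enum FileFormat { COLUMNAR_V3, ROW_V1 }

    // Equivalent to the patched getByOrdinal: out-of-range ordinals fall
    // back to the columnar (read-optimized) format.
    static FileFormat getByOrdinal(int ordinal) {
        if (ordinal < 0 || ordinal >= FileFormat.values().length) {
            return FileFormat.COLUMNAR_V3;
        }
        return FileFormat.values()[ordinal];
    }

    public static void main(String[] args) {
        System.out.println(getByOrdinal(1)); // ROW_V1
        System.out.println(getByOrdinal(9)); // COLUMNAR_V3
    }
}
```

Renaming enum constants (`carbondata` to `COLUMNAR_V3`, `rowformat` to `ROW_V1`) is safe here precisely because segments are resolved by ordinal, not by name.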

http://git-wip-us.apache.org/repos/asf/carbondata/blob/ee71610e/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
index b282d53..bb7fc9d 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
@@ -98,7 +98,7 @@ public class LoadMetadataDetails implements Serializable {
   /**
* the file format of this segment
*/
-  private FileFormat fileFormat = FileFormat.carbondata;
+  private FileFormat fileFormat = FileFormat.COLUMNAR_V3;
 
   public String getPartitionCount() {
 return partitionCount;

http://git-wip-us.apache.org/repos/asf/carbondata/blob/ee71610e/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonInputSplit.java
--
diff --git 
a/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonInputSplit.java 
b/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonInputSplit.java
index f7b372f..e89c2d6 100644
--- a/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonInputSplit.java
+++ b/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonInputSplit.java
@@ -82,7 +82,7 @@ public class CarbonInputSplit extends FileSplit
 
   private BlockletDetailInfo detailInfo;
 
-  private FileFormat fileFormat = FileFormat.carbondata;
+  private FileFormat fileFormat = FileFormat.COLUMNAR_V3;
 
   public CarbonInputSplit() {
 segmentId = null;

http://git-wip-us.apache.org/repos/asf/carbondata/blob/ee71610e/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonMultiBlockSplit.java

[05/28] carbondata git commit: [CARBONDATA-1733] While load is in progress, Show segments is throwing NPE

2017-11-18 Thread jackylk
[CARBONDATA-1733] While load is in progress, Show segments is throwing NPE


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/aff3b9e4
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/aff3b9e4
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/aff3b9e4

Branch: refs/heads/fgdatamap
Commit: aff3b9e4c033ac736de4888195a85906ebd61b13
Parents: 808a334
Author: dhatchayani 
Authored: Thu Nov 16 16:15:38 2017 +0530
Committer: kumarvishal 
Committed: Thu Nov 16 20:47:01 2017 +0530

--
 .../carbondata/core/statusmanager/LoadMetadataDetails.java   | 4 
 1 file changed, 4 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/aff3b9e4/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
index f42ca23..d838e2e 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
@@ -202,6 +202,10 @@ public class LoadMetadataDetails implements Serializable {
   LOGGER.error("Cannot convert" + factTimeStamp + " to Time/Long type 
value" + e.getMessage());
   parser = new SimpleDateFormat(CarbonCommonConstants.CARBON_TIMESTAMP);
   try {
+// if the load is in progress, factTimeStamp will be null, so use 
current time
+if (null == factTimeStamp) {
+  return System.currentTimeMillis();
+}
 dateToStr = parser.parse(factTimeStamp);
 return dateToStr.getTime();
   } catch (ParseException e1) {
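The fix above handles the in-progress-load case where `factTimeStamp` is still null. A simplified sketch of the fallback logic (the null guard is hoisted to the front here for clarity, and the date pattern stands in for `CarbonCommonConstants.CARBON_TIMESTAMP`, whose value is not shown in this diff):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class LoadTimeSketch {
    // Sketch of the patched fallback path in LoadMetadataDetails.
    static long getTimeStamp(String factTimeStamp) {
        // a load still in progress has no fact timestamp yet; use "now"
        if (factTimeStamp == null) {
            return System.currentTimeMillis();
        }
        try {
            // normal case: the timestamp is stored as epoch millis
            return Long.parseLong(factTimeStamp);
        } catch (NumberFormatException e) {
            try {
                // legacy case: a formatted date string
                return new SimpleDateFormat("dd-MM-yyyy HH:mm:ss")
                        .parse(factTimeStamp).getTime();
            } catch (ParseException pe) {
                return 0L; // unparseable timestamp
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(getTimeStamp("1510912309000")); // 1510912309000
        System.out.println(getTimeStamp(null) > 0);        // true
    }
}
```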



[25/28] carbondata git commit: [CARBONDATA-1762] Remove existing column level dateformat and support dateformat, timestampformat in the load option

2017-11-18 Thread jackylk
[CARBONDATA-1762] Remove existing column level dateformat and support 
dateformat, timestampformat in the load option

(1) Remove column level dateformat option
(2) Support dateformat and timestampformat in load options(table level)

This closes #1524


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/85dc4fff
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/85dc4fff
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/85dc4fff

Branch: refs/heads/fgdatamap
Commit: 85dc4fff0ecca160654085379310a1c3096731f7
Parents: 2a12938
Author: akashrn5 
Authored: Fri Nov 17 16:55:33 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 16:34:43 2017 +0800

--
 .../constants/CarbonLoadOptionConstants.java| 10 +-
 .../carbondata/core/util/DataTypeUtil.java  |  2 +-
 .../TestLoadDataWithDiffTimestampFormat.scala   | 31 
 .../carbondata/spark/load/ValidateUtil.scala| 38 +---
 .../carbondata/spark/util/DataLoadingUtil.scala | 13 ++-
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |  2 +-
 .../spark/rdd/CarbonDataRDDFactory.scala| 24 +++--
 .../processing/loading/DataField.java   | 10 ++
 .../loading/DataLoadProcessBuilder.java | 10 +++---
 .../DirectDictionaryFieldConverterImpl.java | 10 +-
 .../impl/NonDictionaryFieldConverterImpl.java   |  8 -
 .../loading/model/CarbonLoadModel.java  | 13 +++
 .../util/CarbonDataProcessorUtil.java   | 19 --
 13 files changed, 114 insertions(+), 76 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
index ac278d9..e78d125 100644
--- a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
+++ b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
@@ -46,12 +46,20 @@ public final class CarbonLoadOptionConstants {
  public static final String CARBON_OPTIONS_IS_EMPTY_DATA_BAD_RECORD_DEFAULT = "false";
 
   /**
-   * option to specify the load option
+   * option to specify the dateFormat in load option for all date columns in table
*/
   @CarbonProperty
   public static final String CARBON_OPTIONS_DATEFORMAT =
   "carbon.options.dateformat";
   public static final String CARBON_OPTIONS_DATEFORMAT_DEFAULT = "";
+
+  /**
+   * option to specify the timestampFormat in load option for all timestamp columns in table
+   */
+  @CarbonProperty
+  public static final String CARBON_OPTIONS_TIMESTAMPFORMAT =
+  "carbon.options.timestampformat";
+  public static final String CARBON_OPTIONS_TIMESTAMPFORMAT_DEFAULT = "";
   /**
* option to specify the sort_scope
*/

http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
index 0961a63..3a25988 100644
--- a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
+++ b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
@@ -319,7 +319,7 @@ public final class DataTypeUtil {
   Date dateToStr = null;
   DateFormat dateFormatter = null;
   try {
-if (null != dateFormat) {
+if (null != dateFormat && !dateFormat.trim().isEmpty()) {
   dateFormatter = new SimpleDateFormat(dateFormat);
 } else {
   dateFormatter = timeStampformatter.get();

http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala
index 71d6466..906f05a 100644
--- a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala
+++ b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala

[02/28] carbondata git commit: [CARBONDATA-1720][FILTER] Wrong data displayed for <= filter for timestamp column(dictionary column)

2017-11-18 Thread jackylk
[CARBONDATA-1720][FILTER] Wrong data displayed for <= filter for timestamp column (dictionary column)

Issue:
The <= filter gives wrong results for a timestamp direct-dictionary column.
Solution:
The less-than-or-equal filter was treating surrogate 2 as the default value, but surrogate 1 is reserved for the default (null) value.

This closes #1502
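The off-by-one can be illustrated with a toy generator; this is a simplified stand-in for `DirectDictionaryGenerator`, where surrogate 1 is reserved for null and real values start at 2:

```java
public class SurrogateDemo {
  // Toy stand-in: surrogate 1 is reserved for null/default; real values start at 2.
  static int generateDirectSurrogateKey(String value) {
    if (value == null) {
      return 1; // default (null) surrogate
    }
    return 2; // first real value would get 2, the next 3, and so on
  }

  public static void main(String[] args) {
    // Buggy boundary: null surrogate + 1 collides with the first real value,
    // so a <= filter mis-handles rows at the boundary.
    int buggyKey = generateDirectSurrogateKey(null) + 1;
    int fixedKey = generateDirectSurrogateKey(null);
    System.out.println(buggyKey == generateDirectSurrogateKey("2017-11-18")); // true: collision
    System.out.println(fixedKey); // 1
  }
}
```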


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/17892b17
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/17892b17
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/17892b17

Branch: refs/heads/fgdatamap
Commit: 17892b17b688eaa637b3dd97c25286edb4183eaa
Parents: 09d0205
Author: dhatchayani 
Authored: Wed Nov 15 18:41:00 2017 +0530
Committer: ravipesala 
Committed: Thu Nov 16 16:12:19 2017 +0530

--
 .../executer/RangeValueFilterExecuterImpl.java  |   2 +-
 ...velRangeLessThanEqualFilterExecuterImpl.java |  10 +-
 .../RowLevelRangeLessThanFiterExecuterImpl.java |  14 +-
 .../src/test/resources/timestamp.csv| 301 +++
 .../RangeFilterAllDataTypesTestCases.scala  |   9 +
 5 files changed, 327 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/17892b17/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java
index 0cfa198..ee373c5 100644
--- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java
+++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java
@@ -554,7 +554,7 @@ public class RangeValueFilterExecuterImpl extends ValueBasedFilterExecuterImpl {
     if (dimColEvaluatorInfo.getDimension().hasEncoding(Encoding.DIRECT_DICTIONARY)) {
       DirectDictionaryGenerator directDictionaryGenerator = DirectDictionaryKeyGeneratorFactory
           .getDirectDictionaryGenerator(dimColEvaluatorInfo.getDimension().getDataType());
-      int key = directDictionaryGenerator.generateDirectSurrogateKey(null) + 1;
+      int key = directDictionaryGenerator.generateDirectSurrogateKey(null);
       CarbonDimension currentBlockDimension =
           segmentProperties.getDimensions().get(dimensionBlocksIndex);
       if (currentBlockDimension.isSortColumn()) {

http://git-wip-us.apache.org/repos/asf/carbondata/blob/17892b17/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java
index 5e0bb41..88cf75c 100644
--- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java
+++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java
@@ -267,7 +267,7 @@ public class RowLevelRangeLessThanEqualFilterExecuterImpl extends RowLevelFilter
       DirectDictionaryGenerator directDictionaryGenerator = DirectDictionaryKeyGeneratorFactory
           .getDirectDictionaryGenerator(
               dimColEvaluatorInfoList.get(0).getDimension().getDataType());
-      int key = directDictionaryGenerator.generateDirectSurrogateKey(null) + 1;
+      int key = directDictionaryGenerator.generateDirectSurrogateKey(null);
       CarbonDimension currentBlockDimension =
           segmentProperties.getDimensions().get(dimensionBlocksIndex[0]);
       if (currentBlockDimension.isSortColumn()) {
@@ -324,7 +324,9 @@ public class RowLevelRangeLessThanEqualFilterExecuterImpl extends RowLevelFilter
           return bitSet;
         }
       } else {
-        skip = start;
+        // as start will be last index of null value inclusive
+        // so adding 1 to skip last null value
+        skip = start + 1;
       }
       startIndex = skip;
     }
@@ -392,7 +394,9 @@ public class RowLevelRangeLessThanEqualFilterExecuterImpl extends RowLevelFilter
         return bitSet;
       }
     } else {
-      skip = start;
+      // as start will be last index of null value inclusive
+      // so adding 1 to skip last null value
+      skip = start + 1;
     }
     startIndex = skip;
   }


[28/28] carbondata git commit: [CARBONDATA-1544][Datamap] Datamap FineGrain implementation

2017-11-18 Thread jackylk
[CARBONDATA-1544][Datamap] Datamap FineGrain implementation

Implemented the interfaces for the FG datamap and integrated them with the filter scanner so it uses the pruned bitset from the FG datamap.
The FG query flow is as follows:
1. The user can add an FG datamap to any table and implement its interfaces.
2. Any filter query that hits a table with a datamap calls the prune method of the FG datamap.
3. The prune method of the FG datamap returns a list of FineGrainBlocklet; these blocklets carry block, blocklet, page and row-id information.
4. The pruned blocklets are internally written to a file, and only the block, blocklet and file path information is returned as part of the splits.
5. Based on the splits, the scan RDD schedules the tasks.
6. In the filter scanner, we check the datamap writer path from the split and read the bitset if it exists, passing this bitset as input to the scanner.

This closes #1471
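The flow above can be sketched with simplified types; the names below only loosely mirror the actual FG datamap interfaces and are assumptions for illustration:

```java
import java.util.ArrayList;
import java.util.BitSet;
import java.util.List;

public class FGDataMapFlowDemo {
  // Simplified blocklet: block/blocklet ids plus the row ids kept by the FG datamap.
  static class FineGrainBlocklet {
    final String blockId;
    final int blockletId;
    final BitSet rowBitSet;
    FineGrainBlocklet(String blockId, int blockletId, BitSet rowBitSet) {
      this.blockId = blockId;
      this.blockletId = blockletId;
      this.rowBitSet = rowBitSet;
    }
  }

  // Steps 2-3: prune returns only blocklets whose rows can match the filter.
  static List<FineGrainBlocklet> prune(int filterValue) {
    List<FineGrainBlocklet> result = new ArrayList<>();
    BitSet rows = new BitSet();
    rows.set(3); // pretend only row 3 of blocklet 0 matches the filter
    result.add(new FineGrainBlocklet("part-0", 0, rows));
    return result;
  }

  public static void main(String[] args) {
    // Steps 5-6: the scanner ANDs the FG bitset with its own page-level result.
    List<FineGrainBlocklet> pruned = prune(42);
    BitSet scanResult = new BitSet();
    scanResult.set(0, 8); // rows the scanner would otherwise return
    scanResult.and(pruned.get(0).rowBitSet);
    System.out.println(scanResult.cardinality()); // 1
  }
}
```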


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/36c34cd0
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/36c34cd0
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/36c34cd0

Branch: refs/heads/fgdatamap
Commit: 36c34cd02893bd59018d7cab89778c314c3b119e
Parents: 2943108
Author: ravipesala 
Authored: Wed Nov 15 19:48:40 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 23:25:19 2017 +0800

--
 .../carbondata/core/datamap/DataMapMeta.java|   8 +-
 .../core/datamap/DataMapStoreManager.java   |  30 +-
 .../carbondata/core/datamap/DataMapType.java|  21 +
 .../carbondata/core/datamap/TableDataMap.java   |  31 +-
 .../core/datamap/dev/AbstractDataMapWriter.java | 110 +
 .../core/datamap/dev/BlockletSerializer.java|  57 +++
 .../carbondata/core/datamap/dev/DataMap.java|   4 +-
 .../core/datamap/dev/DataMapFactory.java|  14 +-
 .../core/datamap/dev/DataMapWriter.java |  57 ---
 .../cgdatamap/AbstractCoarseGrainDataMap.java   |  24 +
 .../AbstractCoarseGrainDataMapFactory.java  |  34 ++
 .../dev/fgdatamap/AbstractFineGrainDataMap.java |  24 +
 .../AbstractFineGrainDataMapFactory.java|  38 ++
 .../carbondata/core/datastore/DataRefNode.java  |   7 +
 .../core/datastore/block/TableBlockInfo.java|  10 +
 .../impl/btree/AbstractBTreeLeafNode.java   |   5 +
 .../datastore/impl/btree/BTreeNonLeafNode.java  |   5 +
 .../carbondata/core/indexstore/Blocklet.java|  30 +-
 .../core/indexstore/BlockletDetailsFetcher.java |   8 +
 .../core/indexstore/ExtendedBlocklet.java   |  19 +-
 .../core/indexstore/FineGrainBlocklet.java  | 120 +
 .../blockletindex/BlockletDataMap.java  |  11 +-
 .../blockletindex/BlockletDataMapFactory.java   |  62 ++-
 .../BlockletDataRefNodeWrapper.java |  27 +-
 .../indexstore/blockletindex/IndexWrapper.java  |  18 +
 .../core/indexstore/schema/FilterType.java  |  24 -
 .../executer/ExcludeFilterExecuterImpl.java |   3 +
 .../executer/IncludeFilterExecuterImpl.java |   3 +
 .../core/scan/processor/BlocksChunkHolder.java  |   5 -
 .../core/scan/scanner/impl/FilterScanner.java   |   2 +
 .../apache/carbondata/core/util/CarbonUtil.java |  97 +
 .../datamap/examples/MinMaxDataMap.java |  20 +-
 .../datamap/examples/MinMaxDataMapFactory.java  |  49 ++-
 .../datamap/examples/MinMaxDataWriter.java  |  36 +-
 .../examples/MinMaxIndexBlockDetails.java   |  13 -
 .../carbondata/hadoop/CarbonInputFormat.java|   2 +-
 .../carbondata/hadoop/CarbonInputSplit.java |  20 +-
 .../hadoop/api/CarbonTableInputFormat.java  |  23 +-
 .../testsuite/datamap/CGDataMapTestCase.scala   | 357 +++
 .../testsuite/datamap/DataMapWriterSuite.scala  |  49 ++-
 .../testsuite/datamap/FGDataMapTestCase.scala   | 436 +++
 .../carbondata/spark/rdd/CarbonScanRDD.scala|   6 +-
 .../datamap/DataMapWriterListener.java  |  57 ++-
 .../store/CarbonFactDataHandlerModel.java   |  10 +-
 .../store/writer/AbstractFactDataWriter.java| 126 +-
 45 files changed, 1731 insertions(+), 381 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/36c34cd0/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
index 7746acf..dd15ccb 100644
--- a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
+++ b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
@@ -19,15 +19,15 @@ package org.apache.carbondata.core.datamap;
 
 import java.util.List;
 
-import org.apache.carbondata.core.indexstore.schema.FilterType;
+import 

[08/28] carbondata git commit: [CARBONDATA-1717] Remove spark broadcast for getting hadoop configurations

2017-11-18 Thread jackylk
[CARBONDATA-1717] Remove spark broadcast for getting hadoop configurations

This closes #1500
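The broadcast replacement can be sketched without Hadoop: serialize the configuration to bytes, compress them, and rebuild on the executor side. GZIP and a plain map stand in for Carbon's CompressorFactory and Hadoop's Configuration here:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class ConfRoundTripDemo {
  // Serialize + compress a (serializable) config map, mirroring the RDD change
  // that ships compressed Configuration bytes with the closure instead of a broadcast.
  static byte[] toBytes(HashMap<String, String> conf) throws Exception {
    ByteArrayOutputStream bao = new ByteArrayOutputStream();
    try (ObjectOutputStream oos = new ObjectOutputStream(new GZIPOutputStream(bao))) {
      oos.writeObject(conf);
    }
    return bao.toByteArray();
  }

  @SuppressWarnings("unchecked")
  static HashMap<String, String> fromBytes(byte[] bytes) throws Exception {
    try (ObjectInputStream ois =
        new ObjectInputStream(new GZIPInputStream(new ByteArrayInputStream(bytes)))) {
      return (HashMap<String, String>) ois.readObject();
    }
  }

  public static void main(String[] args) throws Exception {
    HashMap<String, String> conf = new HashMap<>();
    conf.put("fs.defaultFS", "hdfs://localhost:9000");
    HashMap<String, String> restored = fromBytes(toBytes(conf));
    System.out.println(restored.equals(conf)); // true: lossless round trip
  }
}
```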


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/b6777fcc
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/b6777fcc
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/b6777fcc

Branch: refs/heads/fgdatamap
Commit: b6777fcc32df3ce3616ea02f5566ab5bf4ca6e30
Parents: 733bb51
Author: akashrn5 
Authored: Fri Oct 27 18:11:03 2017 +0530
Committer: QiangCai 
Committed: Fri Nov 17 14:33:10 2017 +0800

--
 .../spark/rdd/NewCarbonDataLoadRDD.scala| 26 
 1 file changed, 21 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/b6777fcc/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/NewCarbonDataLoadRDD.scala
--
diff --git a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/NewCarbonDataLoadRDD.scala b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/NewCarbonDataLoadRDD.scala
index 6f44a0d..9ca21bc 100644
--- a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/NewCarbonDataLoadRDD.scala
+++ b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/NewCarbonDataLoadRDD.scala
@@ -17,7 +17,7 @@
 
 package org.apache.carbondata.spark.rdd
 
-import java.io.{File, IOException, ObjectInputStream, ObjectOutputStream}
+import java.io._
 import java.nio.ByteBuffer
 import java.text.SimpleDateFormat
 import java.util.{Date, UUID}
@@ -41,7 +41,9 @@ import org.apache.carbondata.common.CarbonIterator
 import org.apache.carbondata.common.logging.LogServiceFactory
 import org.apache.carbondata.common.logging.impl.StandardLogService
 import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.datastore.compression.CompressorFactory
import org.apache.carbondata.core.statusmanager.{LoadMetadataDetails, SegmentStatus}
import org.apache.carbondata.core.statusmanager.LoadMetadataDetails
import org.apache.carbondata.core.util.{CarbonProperties, CarbonTimeStatisticsFactory, ThreadLocalTaskInfo}
import org.apache.carbondata.processing.loading.{DataLoadExecutor, FailureCauses}
import org.apache.carbondata.processing.loading.csvinput.{BlockDetails, CSVInputFormat, CSVRecordReaderIterator}
@@ -187,9 +189,23 @@ class NewCarbonDataLoadRDD[K, V](
 formatter.format(new Date())
   }
 
-  // A Hadoop Configuration can be about 10 KB, which is pretty big, so broadcast it
-  private val confBroadcast =
-    sc.broadcast(new SerializableConfiguration(sc.hadoopConfiguration))
+  private val confBytes = {
+    val bao = new ByteArrayOutputStream()
+    val oos = new ObjectOutputStream(bao)
+    sc.hadoopConfiguration.write(oos)
+    oos.close()
+    CompressorFactory.getInstance().getCompressor.compressByte(bao.toByteArray)
+  }
+
+  private def getConf = {
+    val configuration = new Configuration(false)
+    val bai = new ByteArrayInputStream(CompressorFactory.getInstance().getCompressor
+      .unCompressByte(confBytes))
+    val ois = new ObjectInputStream(bai)
+    configuration.readFields(ois)
+    ois.close()
+    configuration
+  }
 
   override def getPartitions: Array[Partition] = {
 blocksGroupBy.zipWithIndex.map { b =>
@@ -255,7 +271,7 @@ class NewCarbonDataLoadRDD[K, V](
 
   def getInputIterators: Array[CarbonIterator[Array[AnyRef]]] = {
    val attemptId = new TaskAttemptID(jobTrackerId, id, TaskType.MAP, theSplit.index, 0)
-var configuration: Configuration = confBroadcast.value.value
+var configuration: Configuration = getConf
 if (configuration == null) {
   configuration = new Configuration()
 }



[19/28] carbondata git commit: [CARBONDATA-1626] Add data size and index size in table status file

2017-11-18 Thread jackylk
[CARBONDATA-1626] Add data size and index size in table status file

This closes #1435
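The size accounting can be sketched as a walk over a segment directory; the file suffixes match Carbon's data/index files, but `dataAndIndexSize` is an illustrative helper, not the actual CarbonUtil method:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class SegmentSizeDemo {
  // Sums .carbondata vs .carbonindex file sizes under a segment directory,
  // roughly what the table-status "datasize"/"indexsize" entries record.
  static long[] dataAndIndexSize(Path segmentDir) throws IOException {
    long dataSize = 0;
    long indexSize = 0;
    try (Stream<Path> files = Files.walk(segmentDir)) {
      for (Path p : (Iterable<Path>) files.filter(Files::isRegularFile)::iterator) {
        String name = p.getFileName().toString();
        if (name.endsWith(".carbondata")) {
          dataSize += Files.size(p);
        } else if (name.endsWith(".carbonindex")) {
          indexSize += Files.size(p);
        }
      }
    }
    return new long[] {dataSize, indexSize};
  }

  public static void main(String[] args) throws IOException {
    Path dir = Files.createTempDirectory("segment");
    Files.write(dir.resolve("part-0.carbondata"), new byte[128]);
    Files.write(dir.resolve("part-0.carbonindex"), new byte[16]);
    long[] sizes = dataAndIndexSize(dir);
    System.out.println(sizes[0] + " " + sizes[1]); // 128 16
  }
}
```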


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/589f126d
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/589f126d
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/589f126d

Branch: refs/heads/fgdatamap
Commit: 589f126dea872f54c2096c9572436bf10589b1ca
Parents: f22e614
Author: akashrn5 
Authored: Wed Oct 25 15:27:37 2017 +0530
Committer: ravipesala 
Committed: Fri Nov 17 21:43:15 2017 +0530

--
 .../core/constants/CarbonCommonConstants.java   |  26 +++
 .../core/datastore/impl/FileFactory.java|   2 +-
 .../core/statusmanager/LoadMetadataDetails.java |  18 ++
 .../apache/carbondata/core/util/CarbonUtil.java | 152 
 .../core/util/path/CarbonTablePath.java |   8 +
 .../spark/rdd/CarbonDataRDDFactory.scala|  11 +-
 .../CarbonDescribeFormattedCommand.scala|  10 ++
 .../spark/sql/GetDataSizeAndIndexSizeTest.scala | 172 +++
 .../processing/merger/CarbonDataMergerUtil.java |   7 +-
 9 files changed, 398 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/589f126d/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
index 0a7dfdd..762ef6d 100644
--- a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
+++ b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
@@ -1380,6 +1380,32 @@ public final class CarbonCommonConstants {
 
   public static final String AGGREGATIONDATAMAPSCHEMA = 
"AggregateDataMapHandler";
 
+  /**
+   * The total size of carbon data
+   */
+  public static final String CARBON_TOTAL_DATA_SIZE = "datasize";
+
+  /**
+   * The total size of carbon index
+   */
+  public static final String CARBON_TOTAL_INDEX_SIZE = "indexsize";
+
+  /**
+   * ENABLE_CALCULATE_DATA_INDEX_SIZE
+   */
  @CarbonProperty public static final String ENABLE_CALCULATE_SIZE = "carbon.enable.calculate.size";
+
+  /**
+   * DEFAULT_ENABLE_CALCULATE_DATA_INDEX_SIZE
+   */
  @CarbonProperty public static final String DEFAULT_ENABLE_CALCULATE_SIZE = "true";
+
+  public static final String TABLE_DATA_SIZE = "Table Data Size";
+
+  public static final String TABLE_INDEX_SIZE = "Table Index Size";
+
+  public static final String LAST_UPDATE_TIME = "Last Update Time";
+
   private CarbonCommonConstants() {
   }
 }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/589f126d/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java b/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
index 57a48ec..240253d 100644
--- a/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
+++ b/core/src/main/java/org/apache/carbondata/core/datastore/impl/FileFactory.java
@@ -541,7 +541,7 @@ public final class FileFactory {
* @param fileType
* @return updated file path without url for local
*/
-  private static String getUpdatedFilePath(String filePath, FileType fileType) {
+  public static String getUpdatedFilePath(String filePath, FileType fileType) {
 switch (fileType) {
   case HDFS:
   case ALLUXIO:

http://git-wip-us.apache.org/repos/asf/carbondata/blob/589f126d/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
index d838e2e..b282d53 100644
--- a/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
+++ b/core/src/main/java/org/apache/carbondata/core/statusmanager/LoadMetadataDetails.java
@@ -41,6 +41,24 @@ public class LoadMetadataDetails implements Serializable {
   private String partitionCount;
 
   private String isDeleted = CarbonCommonConstants.KEYWORD_FALSE;
+  private String dataSize;
+  private String indexSize;
+
+  public String getDataSize() {
+return dataSize;
+  }
+
+  public void setDataSize(String dataSize) {
+this.dataSize = dataSize;
+  }
+
+  public String getIndexSize() {
+return indexSize;
+  }
+
+  public void 

[10/28] carbondata git commit: [CARBONDATA-1739] Clean up store path interface

2017-11-18 Thread jackylk
http://git-wip-us.apache.org/repos/asf/carbondata/blob/5fc7f06f/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDataMapShowCommand.scala
--
diff --git a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDataMapShowCommand.scala b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDataMapShowCommand.scala
index 822455c..64a066c 100644
--- a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDataMapShowCommand.scala
+++ b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDataMapShowCommand.scala
@@ -47,9 +47,7 @@ case class CarbonDataMapShowCommand(
 
   override def processData(sparkSession: SparkSession): Seq[Row] = {
 Checker.validateTableExists(databaseNameOp, tableName, sparkSession)
-    val carbonTable = CarbonEnv.getInstance(sparkSession).carbonMetastore.
-      lookupRelation(databaseNameOp, tableName)(sparkSession).asInstanceOf[CarbonRelation].
-      tableMeta.carbonTable
+    val carbonTable = CarbonEnv.getCarbonTable(databaseNameOp, tableName)(sparkSession)
 val schemaList = carbonTable.getTableInfo.getDataMapSchemaList
 if (schemaList != null && schemaList.size() > 0) {
   schemaList.asScala.map { s =>

http://git-wip-us.apache.org/repos/asf/carbondata/blob/5fc7f06f/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala
--
diff --git a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala
index 66f2756..f34afbf 100644
--- a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala
+++ b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala
@@ -34,6 +34,7 @@ import org.apache.carbondata.core.locks.{CarbonLockUtil, ICarbonLock, LockUsage}
 import org.apache.carbondata.core.metadata.{AbsoluteTableIdentifier, CarbonTableIdentifier}
 import org.apache.carbondata.core.metadata.converter.ThriftWrapperSchemaConverterImpl
 import org.apache.carbondata.core.metadata.schema.table.CarbonTable
+import org.apache.carbondata.core.util.CarbonProperties
 import org.apache.carbondata.events._
 
 
@@ -60,12 +61,11 @@ case class CarbonDropDataMapCommand(
     val LOGGER: LogService = LogServiceFactory.getLogService(this.getClass.getCanonicalName)
     val dbName = GetDB.getDatabaseName(databaseNameOp, sparkSession)
     val identifier = TableIdentifier(tableName, Option(dbName))
-    val carbonTableIdentifier = new CarbonTableIdentifier(dbName, tableName, "")
 val locksToBeAcquired = List(LockUsage.METADATA_LOCK)
 val carbonEnv = CarbonEnv.getInstance(sparkSession)
 val catalog = carbonEnv.carbonMetastore
     val databaseLocation = GetDB.getDatabaseLocation(dbName, sparkSession,
-      CarbonEnv.getInstance(sparkSession).storePath)
+      CarbonProperties.getStorePath)
     val tablePath = databaseLocation + CarbonCommonConstants.FILE_SEPARATOR + tableName.toLowerCase
     val tableIdentifier =
       AbsoluteTableIdentifier.from(tablePath, dbName.toLowerCase, tableName.toLowerCase)
@@ -76,20 +76,19 @@ case class CarbonDropDataMapCommand(
         lock => carbonLocks += CarbonLockUtil.getLockObject(tableIdentifier, lock)
       }
       LOGGER.audit(s"Deleting datamap [$dataMapName] under table [$tableName]")
-      val carbonTable: Option[CarbonTable] =
-        catalog.getTableFromMetadataCache(dbName, tableName) match {
-          case Some(tableMeta) => Some(tableMeta.carbonTable)
-          case None => try {
-            Some(catalog.lookupRelation(identifier)(sparkSession)
-              .asInstanceOf[CarbonRelation].metaData.carbonTable)
-          } catch {
-            case ex: NoSuchTableException =>
-              if (!ifExistsSet) {
-                throw ex
-              }
-              None
-          }
-        }
+      var carbonTable: Option[CarbonTable] =
+        catalog.getTableFromMetadataCache(dbName, tableName)
+      if (carbonTable.isEmpty) {
+        try {
+          carbonTable = Some(catalog.lookupRelation(identifier)(sparkSession)
+            .asInstanceOf[CarbonRelation].metaData.carbonTable)
+        } catch {
+          case ex: NoSuchTableException =>
+            if (!ifExistsSet) {
+              throw ex
+            }
+        }
+      }
       if (carbonTable.isDefined && carbonTable.get.getTableInfo.getDataMapSchemaList.size() > 0) {
         val dataMapSchema = carbonTable.get.getTableInfo.getDataMapSchemaList.asScala.zipWithIndex.

[06/28] carbondata git commit: [CARBONDATA-1733] While load is in progress, Show segments is throwing NPE This closes #1505

2017-11-18 Thread jackylk
[CARBONDATA-1733] While load is in progress, Show segments is throwing NPE

This closes #1505


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/6551620b
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/6551620b
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/6551620b

Branch: refs/heads/fgdatamap
Commit: 6551620b2db60667aeb0fb95bd573759d2bd4636
Parents: 808a334 aff3b9e
Author: kumarvishal 
Authored: Thu Nov 16 20:48:01 2017 +0530
Committer: kumarvishal 
Committed: Thu Nov 16 20:48:01 2017 +0530

--
 .../carbondata/core/statusmanager/LoadMetadataDetails.java   | 4 
 1 file changed, 4 insertions(+)
--




[20/28] carbondata git commit: [CARBONDATA-1764] Fix issue of when create table with short data type

2017-11-18 Thread jackylk
[CARBONDATA-1764] Fix issue when creating a table with short data type

Fix issue when creating a table with short data type

This closes #1526


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/d74251fa
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/d74251fa
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/d74251fa

Branch: refs/heads/fgdatamap
Commit: d74251fa35debc5703fc8cb128eef3f58d9ab59f
Parents: 589f126
Author: xubo245 <601450...@qq.com>
Authored: Fri Nov 17 23:23:31 2017 +0800
Committer: Jacky Li 
Committed: Sat Nov 18 01:32:34 2017 +0800

--
 .../aggquery/IntegerDataTypeTestCase.scala  | 21 -
 .../spark/util/DataTypeConverterUtilSuite.scala | 33 
 .../spark/util/DataTypeConverterUtil.scala  |  1 +
 3 files changed, 54 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/d74251fa/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/aggquery/IntegerDataTypeTestCase.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/aggquery/IntegerDataTypeTestCase.scala b/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/aggquery/IntegerDataTypeTestCase.scala
index dc4dc3a..4f9d09d 100644
--- a/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/aggquery/IntegerDataTypeTestCase.scala
+++ b/integration/spark-common-test/src/test/scala/org/apache/carbondata/integration/spark/testsuite/aggquery/IntegerDataTypeTestCase.scala
@@ -32,6 +32,7 @@ class IntegerDataTypeTestCase extends QueryTest with BeforeAndAfterAll {
 
   override def beforeAll {
     sql("DROP TABLE IF EXISTS integertypetableAgg")
+    sql("DROP TABLE IF EXISTS short_table")
     sql("CREATE TABLE integertypetableAgg (empno int, workgroupcategory string, deptno int, projectcode int, attendance int) STORED BY 'org.apache.carbondata.format'")
     sql(s"""LOAD DATA local inpath '$resourcesPath/data.csv' INTO TABLE integertypetableAgg OPTIONS ('DELIMITER'= ',', 'QUOTECHAR'= '\"', 'FILEHEADER'='')""")
   }
@@ -141,7 +142,25 @@ class IntegerDataTypeTestCase extends QueryTest with BeforeAndAfterAll {
         | DROP TABLE short_int_target_table
       """.stripMargin)
   }
-
+
+  test("Create a table that contains short data type") {
+    sql("CREATE TABLE if not exists short_table(col1 short, col2 BOOLEAN) STORED BY 'carbondata'")
+
+    sql("insert into short_table values(1,true)")
+    sql("insert into short_table values(11,false)")
+    sql("insert into short_table values(211,false)")
+    sql("insert into short_table values(3111,true)")
+    sql("insert into short_table values(3,false)")
+    sql("insert into short_table values(41,false)")
+    sql("insert into short_table values(511,true)")
+
+    checkAnswer(
+      sql("select count(*) from short_table"),
+      Row(7)
+    )
+    sql("DROP TABLE IF EXISTS short_table")
+  }
+
   override def afterAll {
 sql("drop table if exists integertypetableAgg")
 CarbonProperties.getInstance().addProperty(

http://git-wip-us.apache.org/repos/asf/carbondata/blob/d74251fa/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/util/DataTypeConverterUtilSuite.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/util/DataTypeConverterUtilSuite.scala b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/util/DataTypeConverterUtilSuite.scala
new file mode 100644
index 000..0dd7b23
--- /dev/null
+++ b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/util/DataTypeConverterUtilSuite.scala
@@ -0,0 +1,33 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package 

[27/28] carbondata git commit: [CARBONDATA-1544][Datamap] Datamap FineGrain implementation

2017-11-18 Thread jackylk
http://git-wip-us.apache.org/repos/asf/carbondata/blob/36c34cd0/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
--
diff --git a/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java b/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
index 78544d3..fe0bbcf 100644
--- a/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
+++ b/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
@@ -19,7 +19,6 @@ package org.apache.carbondata.datamap.examples;
 
 import java.io.BufferedWriter;
 import java.io.DataOutputStream;
-import java.io.File;
 import java.io.IOException;
 import java.io.OutputStreamWriter;
 import java.util.ArrayList;
@@ -29,17 +28,18 @@ import java.util.Map;
 
 import org.apache.carbondata.common.logging.LogService;
 import org.apache.carbondata.common.logging.LogServiceFactory;
-import org.apache.carbondata.core.constants.CarbonCommonConstants;
-import org.apache.carbondata.core.datamap.dev.DataMapWriter;
+import org.apache.carbondata.core.datamap.dev.AbstractDataMapWriter;
 import org.apache.carbondata.core.datastore.impl.FileFactory;
 import org.apache.carbondata.core.datastore.page.ColumnPage;
+import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
 import org.apache.carbondata.core.metadata.schema.table.TableInfo;
 import org.apache.carbondata.core.util.ByteUtil;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.path.CarbonTablePath;
 
 import com.google.gson.Gson;
 
-public class MinMaxDataWriter implements DataMapWriter {
+public class MinMaxDataWriter extends AbstractDataMapWriter {
 
   private static final LogService LOGGER =
   LogServiceFactory.getLogService(TableInfo.class.getName());
@@ -50,17 +50,23 @@ public class MinMaxDataWriter implements DataMapWriter {
 
   private Map blockMinMaxMap;
 
-  private String blockPath;
+  private String dataWritePath;
 
+  public MinMaxDataWriter(AbsoluteTableIdentifier identifier, String segmentId,
+  String dataWritePath) {
+super(identifier, segmentId, dataWritePath);
+this.identifier = identifier;
+this.segmentId = segmentId;
+this.dataWritePath = dataWritePath;
+  }
 
-  @Override public void onBlockStart(String blockId, String blockPath) {
+  @Override public void onBlockStart(String blockId) {
 pageLevelMax = null;
 pageLevelMin = null;
 blockletLevelMax = null;
 blockletLevelMin = null;
 blockMinMaxMap = null;
 blockMinMaxMap = new HashMap();
-this.blockPath = blockPath;
   }
 
   @Override public void onBlockEnd(String blockId) {
@@ -161,7 +167,7 @@ public class MinMaxDataWriter implements DataMapWriter {
 List tempMinMaxIndexBlockDetails = null;
 tempMinMaxIndexBlockDetails = loadBlockDetails();
 try {
-  writeMinMaxIndexFile(tempMinMaxIndexBlockDetails, blockPath, blockId);
+  writeMinMaxIndexFile(tempMinMaxIndexBlockDetails, blockId);
 } catch (IOException ex) {
   LOGGER.info(" Unable to write the file");
 }
@@ -178,7 +184,6 @@ public class MinMaxDataWriter implements DataMapWriter {
   
tmpminMaxIndexBlockDetails.setMinValues(blockMinMaxMap.get(index).getMin());
   
tmpminMaxIndexBlockDetails.setMaxValues(blockMinMaxMap.get(index).getMax());
   tmpminMaxIndexBlockDetails.setBlockletId(index);
-  tmpminMaxIndexBlockDetails.setFilePath(this.blockPath);
   minMaxIndexBlockDetails.add(tmpminMaxIndexBlockDetails);
 }
 return minMaxIndexBlockDetails;
@@ -187,22 +192,19 @@ public class MinMaxDataWriter implements DataMapWriter {
   /**
* Write the data to a file. This is JSON format file.
* @param minMaxIndexBlockDetails
-   * @param blockPath
* @param blockId
* @throws IOException
*/
   public void writeMinMaxIndexFile(List 
minMaxIndexBlockDetails,
-  String blockPath, String blockId) throws IOException {
-String filePath = blockPath.substring(0, 
blockPath.lastIndexOf(File.separator) + 1) + blockId
-+ ".minmaxindex";
+  String blockId) throws IOException {
+String filePath = dataWritePath +"/" + blockId + ".minmaxindex";
 BufferedWriter brWriter = null;
 DataOutputStream dataOutStream = null;
 try {
   FileFactory.createNewFile(filePath, FileFactory.getFileType(filePath));
   dataOutStream = FileFactory.getDataOutputStream(filePath, 
FileFactory.getFileType(filePath));
   Gson gsonObjectToWrite = new Gson();
-  brWriter = new BufferedWriter(new OutputStreamWriter(dataOutStream,
-  CarbonCommonConstants.CARBON_DEFAULT_STREAM_ENCODEFORMAT));
+  brWriter = new BufferedWriter(new 

[11/28] carbondata git commit: [CARBONDATA-1739] Clean up store path interface

2017-11-18 Thread jackylk
[CARBONDATA-1739] Clean up store path interface

This closes #1509


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/5fc7f06f
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/5fc7f06f
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/5fc7f06f

Branch: refs/heads/fgdatamap
Commit: 5fc7f06f23e944719b2735b97176d68fe209ad75
Parents: b6777fc
Author: Jacky Li 
Authored: Thu Nov 16 19:41:19 2017 +0800
Committer: QiangCai 
Committed: Fri Nov 17 14:46:19 2017 +0800

--
 .../dictionary/ManageDictionaryAndBTree.java|   2 +-
 .../core/metadata/CarbonMetadata.java   |   2 +-
 .../core/metadata/schema/table/CarbonTable.java |   4 +-
 .../core/mutate/CarbonUpdateUtil.java   |   8 +-
 .../carbondata/core/scan/model/QueryModel.java  |   4 +-
 .../carbondata/core/util/CarbonProperties.java  |   7 +
 .../core/metadata/CarbonMetadataTest.java   |   9 +-
 .../metadata/schema/table/CarbonTableTest.java  |   3 +-
 .../table/CarbonTableWithComplexTypesTest.java  |   2 +-
 .../carbondata/examples/StreamExample.scala |   4 +-
 .../carbondata/hadoop/CarbonInputFormat.java|   2 +-
 .../hadoop/api/CarbonTableInputFormat.java  |   4 +-
 .../streaming/CarbonStreamRecordReader.java |  10 +-
 .../streaming/CarbonStreamRecordWriter.java |   4 +-
 .../hadoop/util/CarbonInputFormatUtil.java  |   6 +-
 .../hadoop/test/util/StoreCreator.java  |   4 +-
 .../presto/impl/CarbonTableReader.java  |   2 +-
 .../presto/util/CarbonDataStoreCreator.scala|   4 +-
 .../TestPreAggregateTableSelection.scala|   2 +-
 .../partition/TestDDLForPartitionTable.scala|   6 +-
 ...ForPartitionTableWithDefaultProperties.scala |   8 +-
 .../carbondata/spark/load/ValidateUtil.scala|   4 +-
 .../spark/rdd/AlterTableLoadPartitionRDD.scala  |   2 +-
 .../spark/rdd/NewCarbonDataLoadRDD.scala|   2 +-
 .../carbondata/spark/rdd/PartitionDropper.scala |   2 +-
 .../spark/rdd/PartitionSplitter.scala   |   2 +-
 .../carbondata/spark/util/CommonUtil.scala  |  32 +
 .../carbondata/spark/util/DataLoadingUtil.scala |   8 +-
 .../spark/util/GlobalDictionaryUtil.scala   |  12 +-
 .../command/carbonTableSchemaCommon.scala   |   4 +-
 .../spark/rdd/CarbonDataRDDFactory.scala|  12 +-
 .../carbondata/spark/util/CarbonSparkUtil.scala |  18 ++-
 .../spark/sql/CarbonDataFrameWriter.scala   |   3 +-
 .../sql/CarbonDatasourceHadoopRelation.scala|   4 +-
 .../spark/sql/CarbonDictionaryDecoder.scala |  16 +--
 .../scala/org/apache/spark/sql/CarbonEnv.scala  |  36 -
 .../scala/org/apache/spark/sql/CarbonScan.scala |   6 +-
 .../org/apache/spark/sql/CarbonSource.scala |  20 +--
 .../command/CarbonCreateTableCommand.scala  |   4 +-
 .../CarbonDescribeFormattedCommand.scala|  20 +--
 .../command/CarbonDropTableCommand.scala|   9 +-
 .../datamap/CarbonDataMapShowCommand.scala  |   4 +-
 .../datamap/CarbonDropDataMapCommand.scala  |  31 +++--
 .../AlterTableCompactionCommand.scala   |  12 +-
 .../management/CarbonShowLoadsCommand.scala |   4 +-
 .../command/management/CleanFilesCommand.scala  |  10 +-
 .../management/DeleteLoadByIdCommand.scala  |   4 +-
 .../DeleteLoadByLoadDateCommand.scala   |   4 +-
 .../management/LoadTableByInsertCommand.scala   |   2 +-
 .../command/management/LoadTableCommand.scala   |  62 -
 .../command/mutation/DeleteExecution.scala  |  13 +-
 .../command/mutation/HorizontalCompaction.scala |   8 +-
 .../command/mutation/IUDCommonUtil.scala|   2 +-
 .../mutation/ProjectForDeleteCommand.scala  |   7 +-
 .../mutation/ProjectForUpdateCommand.scala  |  11 +-
 .../AlterTableDropCarbonPartitionCommand.scala  |  19 +--
 .../AlterTableSplitCarbonPartitionCommand.scala |  19 +--
 .../partition/ShowCarbonPartitionsCommand.scala |   7 +-
 .../CreatePreAggregateTableCommand.scala|   7 +-
 .../preaaggregate/PreAggregateListeners.scala   |   6 +-
 .../preaaggregate/PreAggregateUtil.scala|  37 +++---
 .../CarbonAlterTableAddColumnCommand.scala  |   4 +-
 .../CarbonAlterTableDataTypeChangeCommand.scala |   4 +-
 .../CarbonAlterTableDropColumnCommand.scala |   4 +-
 .../schema/CarbonAlterTableRenameCommand.scala  |   7 +-
 .../strategy/CarbonLateDecodeStrategy.scala |   4 +-
 .../sql/execution/strategy/DDLStrategy.scala|  11 +-
 .../strategy/StreamingTableStrategy.scala   |   3 +-
 .../spark/sql/hive/CarbonFileMetastore.scala|  61 -
 .../spark/sql/hive/CarbonHiveMetaStore.scala|  13 +-
 .../apache/spark/sql/hive/CarbonMetaStore.scala |   4 +-
 .../sql/hive/CarbonPreAggregateRules.scala  |   2 +-
 .../apache/spark/sql/hive/CarbonRelation.scala  |  26 ++--
 .../spark/sql/hive/CarbonSessionState.scala |  13 

[24/28] carbondata git commit: [CARBONDATA-1751] Change sys.error to AnalysisException for user operations other than IUD, compaction and alter

2017-11-18 Thread jackylk
[CARBONDATA-1751] Change sys.error to AnalysisException for user operations 
other than IUD, compaction and alter

Carbon prints an improper error message in some cases: for example, it reports 
a system error when a user creates a table with a duplicate column name, where 
it should report the related analysis exception instead.

So we change the sys.error calls to AnalysisException for user operations 
other than IUD, compaction and alter

This closes #1515
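The validation pattern this commit moves toward can be sketched outside of CarbonData. The class and method names below are illustrative stand-ins (including the `AnalysisException` stub), not CarbonData's or Spark's actual API:

```java
import java.util.HashSet;
import java.util.Set;

class DuplicateColumnCheck {
    /** Stand-in for Spark's AnalysisException; illustrative only. */
    static class AnalysisException extends RuntimeException {
        AnalysisException(String message) { super(message); }
    }

    /** Throw AnalysisException if two columns share a name, case-insensitively. */
    static void validateColumnNames(String[] columnNames) {
        Set<String> seen = new HashSet<>();
        for (String name : columnNames) {
            if (!seen.add(name.toLowerCase())) {
                // A generic sys.error would hide the cause; an analysis error
                // tells the user exactly what is wrong with the DDL statement.
                throw new AnalysisException("Duplicate column found with name: " + name);
            }
        }
    }
}
```

This mirrors the behavior exercised by the new `CarbonTableSchemaCommonSuite` test below, where `CREATE TABLE carbon_table(BB INT, bb char(10))` is expected to fail with an AnalysisException rather than a system error.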


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/2a12938b
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/2a12938b
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/2a12938b

Branch: refs/heads/fgdatamap
Commit: 2a12938b545cca5e3c09396dd68393ea615038fa
Parents: 1b8d348
Author: xubo245 <601450...@qq.com>
Authored: Fri Nov 17 10:48:31 2017 +0800
Committer: Jacky Li 
Committed: Sat Nov 18 16:24:23 2017 +0800

--
 .../command/CarbonTableSchemaCommonSuite.scala  | 72 
 .../org/apache/carbondata/api/CarbonStore.scala |  4 +-
 .../carbondata/spark/util/CommonUtil.scala  | 17 ++---
 .../spark/util/DataTypeConverterUtil.scala  |  4 +-
 .../catalyst/AbstractCarbonSparkSQLParser.scala |  3 +-
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |  4 +-
 .../command/carbonTableSchemaCommon.scala   | 10 +--
 .../apache/spark/sql/util/CarbonException.scala | 24 +++
 .../sql/parser/CarbonSpark2SqlParser.scala  |  2 +-
 .../apache/spark/util/CarbonCommandSuite.scala  |  2 +-
 10 files changed, 124 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/2a12938b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
--
diff --git 
a/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
 
b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
new file mode 100644
index 000..67dfa8f
--- /dev/null
+++ 
b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
@@ -0,0 +1,72 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.command
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.test.util.QueryTest
+import org.junit.Assert
+import org.scalatest.BeforeAndAfterAll
+
+class CarbonTableSchemaCommonSuite extends QueryTest with BeforeAndAfterAll {
+
+  test("Creating table: Duplicate dimensions found with name, it should throw 
AnalysisException") {
+sql("DROP TABLE IF EXISTS carbon_table")
+try {
+  sql(
+s"""
+   | CREATE TABLE carbon_table(
+   | BB INT, bb char(10)
+   | )
+   | STORED BY 'carbondata'
+   """.stripMargin)
+  Assert.assertTrue(false)
+} catch {
+  case _: AnalysisException => Assert.assertTrue(true)
+  case _: Exception => Assert.assertTrue(false)
+} finally {
+  sql("DROP TABLE IF EXISTS carbon_table")
+}
+  }
+
+  test("Altering table: Duplicate column found with name, it should throw 
RuntimeException") {
+sql("DROP TABLE IF EXISTS carbon_table")
+sql(
+  s"""
+ | CREATE TABLE if not exists carbon_table(
+ | BB INT, cc char(10)
+ | )
+ | STORED BY 'carbondata'
+   """.stripMargin)
+
+try {
+  sql(
+s"""
+   | alter TABLE carbon_table add columns(
+   | bb char(10)
+)
+   """.stripMargin)
+  Assert.assertTrue(false)
+} catch {
+  case _: RuntimeException => Assert.assertTrue(true)
+  case _: Exception => Assert.assertTrue(false)
+} finally {
+  sql("DROP TABLE IF EXISTS carbon_table")
+}
+  }
+
+}


[23/28] carbondata git commit: [CARBONDATA-1753][Streaming]Fix missing 'org.scalatest.tools.Runner' issue when run test with streaming module

2017-11-18 Thread jackylk
[CARBONDATA-1753][Streaming] Fix missing 'org.scalatest.tools.Runner' issue 
when running tests with the streaming module

This closes #1519


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/1b8d348c
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/1b8d348c
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/1b8d348c

Branch: refs/heads/fgdatamap
Commit: 1b8d348c041021038a5a542de1ef7da1855b514a
Parents: ee71610
Author: Zhang Zhichao <441586...@qq.com>
Authored: Fri Nov 17 14:27:23 2017 +0800
Committer: chenliang613 
Committed: Sat Nov 18 10:48:08 2017 +0800

--
 streaming/pom.xml | 7 ++-
 1 file changed, 6 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/1b8d348c/streaming/pom.xml
--
diff --git a/streaming/pom.xml b/streaming/pom.xml
index d9dac75..713bf2d 100644
--- a/streaming/pom.xml
+++ b/streaming/pom.xml
@@ -28,7 +28,12 @@
 
   junit
   junit
-  3.8.1
+  test
+
+
+  org.scalatest
+  scalatest_${scala.binary.version}
+  2.2.1
   test
 
   



[17/28] carbondata git commit: [CARBONDATA-1752] Fix some scalastyle errors in CarbonData

2017-11-18 Thread jackylk
[CARBONDATA-1752] Fix some scalastyle errors in CarbonData

There are some scalastyle errors in CarbonData that should be cleaned up

This closes #1518


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/1d2af629
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/1d2af629
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/1d2af629

Branch: refs/heads/fgdatamap
Commit: 1d2af629367b906f5424819f94e208aa9f30db4d
Parents: dfc7442
Author: xubo245 <601450...@qq.com>
Authored: Fri Nov 17 11:25:14 2017 +0800
Committer: Jacky Li 
Committed: Fri Nov 17 17:23:50 2017 +0800

--
 .../org/apache/carbondata/api/CarbonStore.scala |  2 +-
 .../carbondata/spark/util/CommonUtil.scala  |  5 ++--
 .../spark/util/DataTypeConverterUtil.scala  |  2 +-
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |  2 +-
 .../command/carbonTableSchemaCommon.scala   | 25 +++-
 5 files changed, 19 insertions(+), 17 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/1d2af629/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
--
diff --git 
a/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
 
b/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
index 3d93a65..a2c9c6d 100644
--- 
a/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
+++ 
b/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
@@ -29,7 +29,7 @@ import org.apache.carbondata.common.logging.LogServiceFactory
 import org.apache.carbondata.core.constants.CarbonCommonConstants
 import org.apache.carbondata.core.datastore.impl.FileFactory
 import org.apache.carbondata.core.locks.{CarbonLockUtil, ICarbonLock, 
LockUsage}
-import org.apache.carbondata.core.metadata.{AbsoluteTableIdentifier, 
CarbonTableIdentifier}
+import org.apache.carbondata.core.metadata.{AbsoluteTableIdentifier}
 import org.apache.carbondata.core.metadata.schema.table.CarbonTable
 import org.apache.carbondata.core.mutate.CarbonUpdateUtil
 import org.apache.carbondata.core.statusmanager.SegmentStatusManager

http://git-wip-us.apache.org/repos/asf/carbondata/blob/1d2af629/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
--
diff --git 
a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
 
b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
index a3572ed..6c0e802 100644
--- 
a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
+++ 
b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
@@ -27,10 +27,9 @@ import scala.collection.mutable.Map
 
 import org.apache.commons.lang3.StringUtils
 import org.apache.hadoop.conf.Configuration
-import org.apache.hadoop.fs.Path
 import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
 import org.apache.spark.SparkContext
-import org.apache.spark.sql.{Row, RowFactory, SQLContext}
+import org.apache.spark.sql.{Row, RowFactory}
 import org.apache.spark.sql.catalyst.expressions.{Attribute, 
AttributeReference}
 import org.apache.spark.sql.execution.command.{ColumnProperty, Field, 
PartitionerField}
 import org.apache.spark.sql.types.{MetadataBuilder, StringType}
@@ -41,7 +40,7 @@ import 
org.apache.carbondata.core.constants.CarbonCommonConstants
 import org.apache.carbondata.core.datastore.filesystem.CarbonFile
 import org.apache.carbondata.core.datastore.impl.FileFactory
 import org.apache.carbondata.core.memory.{UnsafeMemoryManager, 
UnsafeSortMemoryManager}
-import org.apache.carbondata.core.metadata.{AbsoluteTableIdentifier, 
CarbonTableIdentifier}
+import org.apache.carbondata.core.metadata.{AbsoluteTableIdentifier}
 import org.apache.carbondata.core.metadata.datatype.{DataType, DataTypes}
 import org.apache.carbondata.core.metadata.schema.PartitionInfo
 import org.apache.carbondata.core.metadata.schema.partition.PartitionType

http://git-wip-us.apache.org/repos/asf/carbondata/blob/1d2af629/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/DataTypeConverterUtil.scala
--
diff --git 
a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/DataTypeConverterUtil.scala
 
b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/DataTypeConverterUtil.scala
index 6cf7298..38657ac 100644
--- 

[01/28] carbondata git commit: [CARBONDATA-1612][CARBONDATA-1615][Streaming] Support delete segment for streaming table [Forced Update!]

2017-11-18 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/fgdatamap 40259b362 -> 36c34cd02 (forced update)


[CARBONDATA-1612][CARBONDATA-1615][Streaming] Support delete segment for 
streaming table

This closes #1497


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/09d02056
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/09d02056
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/09d02056

Branch: refs/heads/fgdatamap
Commit: 09d020561a6a5b7bd90b769e608b4130baa43667
Parents: 1155d4d
Author: Jacky Li 
Authored: Tue Nov 14 20:41:24 2017 +0800
Committer: QiangCai 
Committed: Wed Nov 15 09:35:53 2017 +0800

--
 .../spark/sql/CarbonCatalystOperators.scala |  4 +-
 .../TestStreamingTableOperation.scala   | 55 +++-
 2 files changed, 55 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/09d02056/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonCatalystOperators.scala
--
diff --git 
a/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonCatalystOperators.scala
 
b/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonCatalystOperators.scala
index 48f1a09..62632df 100644
--- 
a/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonCatalystOperators.scala
+++ 
b/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonCatalystOperators.scala
@@ -119,10 +119,10 @@ case class ShowLoadsCommand(
   extends Command {
 
   override def output: Seq[Attribute] = {
-Seq(AttributeReference("SegmentSequenceId", StringType, nullable = 
false)(),
+Seq(AttributeReference("Segment Id", StringType, nullable = false)(),
   AttributeReference("Status", StringType, nullable = false)(),
   AttributeReference("Load Start Time", TimestampType, nullable = false)(),
-  AttributeReference("Load End Time", TimestampType, nullable = false)(),
+  AttributeReference("Load End Time", TimestampType, nullable = true)(),
   AttributeReference("Merged To", StringType, nullable = false)())
   }
 }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/09d02056/integration/spark2/src/test/scala/org/apache/spark/carbondata/TestStreamingTableOperation.scala
--
diff --git 
a/integration/spark2/src/test/scala/org/apache/spark/carbondata/TestStreamingTableOperation.scala
 
b/integration/spark2/src/test/scala/org/apache/spark/carbondata/TestStreamingTableOperation.scala
index 3fb1424..b29cca4 100644
--- 
a/integration/spark2/src/test/scala/org/apache/spark/carbondata/TestStreamingTableOperation.scala
+++ 
b/integration/spark2/src/test/scala/org/apache/spark/carbondata/TestStreamingTableOperation.scala
@@ -19,6 +19,7 @@ package org.apache.spark.carbondata
 
 import java.io.{File, PrintWriter}
 import java.net.ServerSocket
+import java.util.{Calendar, Date}
 import java.util.concurrent.Executors
 
 import scala.collection.mutable
@@ -103,6 +104,9 @@ class TestStreamingTableOperation extends QueryTest with 
BeforeAndAfterAll {
 
 // 10. fault tolerant
 createTable(tableName = "stream_table_tolerant", streaming = true, 
withBatchLoad = true)
+
+// 11. table for delete segment test
+createTable(tableName = "stream_table_delete", streaming = true, 
withBatchLoad = false)
   }
 
   test("validate streaming property") {
@@ -181,6 +185,7 @@ class TestStreamingTableOperation extends QueryTest with 
BeforeAndAfterAll {
 sql("drop table if exists streaming.stream_table_compact")
 sql("drop table if exists streaming.stream_table_new")
 sql("drop table if exists streaming.stream_table_tolerant")
+sql("drop table if exists streaming.stream_table_delete")
   }
 
   // normal table not support streaming ingest
@@ -578,8 +583,6 @@ class TestStreamingTableOperation extends QueryTest with 
BeforeAndAfterAll {
   badRecordAction = "force",
   handoffSize = 1024L * 200
 )
-sql("show segments for table streaming.stream_table_new").show(100, false)
-
 assert(sql("show segments for table streaming.stream_table_new").count() 
== 4)
 
 checkAnswer(
@@ -588,6 +591,51 @@ class TestStreamingTableOperation extends QueryTest with 
BeforeAndAfterAll {
 )
   }
 
+  test("test deleting streaming segment by ID while ingesting") {
+executeStreamingIngest(
+  tableName = "stream_table_delete",
+  batchNums = 6,
+  rowNumsEachBatch = 1,
+  intervalOfSource = 3,
+  intervalOfIngest = 5,
+  continueSeconds = 15,
+  generateBadRecords = false,
+  badRecordAction = "force",
+  handoffSize = 1024L * 200
+)
+val beforeDelete = sql("show segments for table 

[26/28] carbondata git commit: [CARBONDATA-1480]Min Max Index Example for DataMap

2017-11-18 Thread jackylk
[CARBONDATA-1480]Min Max Index Example for DataMap

DataMap example: an implementation of a min/max index through the DataMap 
interface, and use of that index while pruning.

This closes #1359
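The pruning idea behind this example can be sketched independently of the CarbonData API. The classes below are illustrative (loosely analogous to `MinMaxIndexBlockDetails`), not the actual datamap interfaces: record a minimum and maximum per block at write time, then skip any block whose range cannot contain the filter value.

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative min/max pruning sketch; not CarbonData's actual classes. */
class MinMaxPruneSketch {
    /** Per-block statistics, analogous in spirit to MinMaxIndexBlockDetails. */
    static final class BlockStats {
        final String blockId;
        final long min;
        final long max;
        BlockStats(String blockId, long min, long max) {
            this.blockId = blockId;
            this.min = min;
            this.max = max;
        }
    }

    /** Keep only blocks whose [min, max] range may contain the filter value. */
    static List<String> prune(List<BlockStats> blocks, long value) {
        List<String> candidates = new ArrayList<>();
        for (BlockStats b : blocks) {
            if (value >= b.min && value <= b.max) {
                candidates.add(b.blockId);  // block may hold the value: scan it
            }
            // otherwise the block is skipped without reading its data pages
        }
        return candidates;
    }
}
```

In the real example the per-block statistics are serialized to a `.minmaxindex` JSON file by `MinMaxDataWriter` and read back by the datamap during query pruning.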


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/29431084
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/29431084
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/29431084

Branch: refs/heads/fgdatamap
Commit: 2943108433c37b3187d95078f417a59bc66ac841
Parents: 85dc4ff
Author: sounakr 
Authored: Thu Sep 28 16:21:05 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 23:22:26 2017 +0800

--
 .../core/datamap/DataMapStoreManager.java   |  16 +-
 .../carbondata/core/datamap/TableDataMap.java   |  15 +-
 .../carbondata/core/datamap/dev/DataMap.java|   3 +-
 .../core/datamap/dev/DataMapWriter.java |   3 +-
 .../indexstore/SegmentPropertiesFetcher.java|  36 +++
 .../blockletindex/BlockletDataMap.java  |   2 +-
 .../blockletindex/BlockletDataMapFactory.java   |  33 ++-
 datamap/examples/pom.xml| 111 ++
 .../datamap/examples/BlockletMinMax.java|  41 
 .../datamap/examples/MinMaxDataMap.java | 143 
 .../datamap/examples/MinMaxDataMapFactory.java  | 114 ++
 .../datamap/examples/MinMaxDataWriter.java  | 221 +++
 .../examples/MinMaxIndexBlockDetails.java   |  77 +++
 .../MinMaxDataMapExample.scala  |  77 +++
 .../testsuite/datamap/DataMapWriterSuite.scala  |   2 +-
 pom.xml |   2 +
 .../datamap/DataMapWriterListener.java  |   4 +-
 .../store/writer/AbstractFactDataWriter.java|   7 +-
 .../writer/v3/CarbonFactDataWriterImplV3.java   |   3 +
 19 files changed, 892 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/29431084/core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java
--
diff --git 
a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java
 
b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java
index d30483a..90e5fff 100644
--- 
a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java
+++ 
b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java
@@ -26,6 +26,7 @@ import org.apache.carbondata.common.logging.LogService;
 import org.apache.carbondata.common.logging.LogServiceFactory;
 import org.apache.carbondata.core.datamap.dev.DataMapFactory;
 import org.apache.carbondata.core.indexstore.BlockletDetailsFetcher;
+import org.apache.carbondata.core.indexstore.SegmentPropertiesFetcher;
 import org.apache.carbondata.core.indexstore.blockletindex.BlockletDataMap;
 import 
org.apache.carbondata.core.indexstore.blockletindex.BlockletDataMapFactory;
 import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
@@ -103,7 +104,7 @@ public final class DataMapStoreManager {
   tableDataMaps = new ArrayList<>();
 }
 TableDataMap dataMap = getTableDataMap(dataMapName, tableDataMaps);
-if (dataMap != null) {
+if (dataMap != null && 
dataMap.getDataMapName().equalsIgnoreCase(dataMapName)) {
   throw new RuntimeException("Already datamap exists in that path with 
type " + dataMapName);
 }
 
@@ -113,12 +114,15 @@ public final class DataMapStoreManager {
   DataMapFactory dataMapFactory = factoryClass.newInstance();
   dataMapFactory.init(identifier, dataMapName);
   BlockletDetailsFetcher blockletDetailsFetcher;
+  SegmentPropertiesFetcher segmentPropertiesFetcher = null;
   if (dataMapFactory instanceof BlockletDetailsFetcher) {
 blockletDetailsFetcher = (BlockletDetailsFetcher) dataMapFactory;
   } else {
 blockletDetailsFetcher = getBlockletDetailsFetcher(identifier);
   }
-  dataMap = new TableDataMap(identifier, dataMapName, dataMapFactory, 
blockletDetailsFetcher);
+  segmentPropertiesFetcher = (SegmentPropertiesFetcher) 
blockletDetailsFetcher;
+  dataMap = new TableDataMap(identifier, dataMapName, dataMapFactory, 
blockletDetailsFetcher,
+  segmentPropertiesFetcher);
 } catch (Exception e) {
   LOGGER.error(e);
   throw new RuntimeException(e);
@@ -128,11 +132,11 @@ public final class DataMapStoreManager {
 return dataMap;
   }
 
-  private TableDataMap getTableDataMap(String dataMapName,
-  List tableDataMaps) {
+  private TableDataMap getTableDataMap(String dataMapName, List 
tableDataMaps) {
 TableDataMap dataMap = null;
-for (TableDataMap tableDataMap: tableDataMaps) {
-  if 

[21/28] carbondata git commit: [CARBONDATA-1651] [Supported Boolean Type When Saving DataFrame] Provided Support For Boolean Data Type In CarbonDataFrameWriter

2017-11-18 Thread jackylk
[CARBONDATA-1651] [Support Boolean Type When Saving DataFrame] Provide support 
for the Boolean data type in CarbonDataFrameWriter

1. Provide support for the Boolean data type in CarbonDataFrameWriter
2. Add test cases for the same

This closes #1491
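The fix itself is the one-line `case BooleanType => CarbonType.BOOLEAN.getName` added to the type-name mapping in `CarbonDataFrameWriter.scala` below. A simplified, hypothetical sketch of that mapping (string type names and the mapping table here are illustrative, not the writer's real signatures):

```java
/** Illustrative Spark-to-Carbon type-name mapping; simplified, not the real writer. */
class TypeNameMapping {
    /** Map a Spark SQL type name to the Carbon type name used in generated DDL. */
    static String carbonTypeName(String sparkType) {
        switch (sparkType) {
            case "StringType":  return "string";
            case "IntegerType": return "int";
            case "DoubleType":  return "double";
            case "BooleanType": return "boolean";  // the case this commit adds
            default: throw new IllegalArgumentException("unsupported type: " + sparkType);
        }
    }
}
```

Before the fix, a Boolean column fell through to the default branch and aborted the save with an "unsupported type" error.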


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/91355ef7
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/91355ef7
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/91355ef7

Branch: refs/heads/fgdatamap
Commit: 91355ef7cb3147537eacd11c95518495417eab82
Parents: d74251f
Author: anubhav100 
Authored: Mon Nov 13 13:33:15 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 01:37:02 2017 +0800

--
 .../testsuite/dataload/TestLoadDataFrame.scala  | 27 ++--
 .../spark/sql/CarbonDataFrameWriter.scala   |  1 +
 2 files changed, 26 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/91355ef7/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataFrame.scala
--
diff --git 
a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataFrame.scala
 
b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataFrame.scala
index f2ea45e..3399740 100644
--- 
a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataFrame.scala
+++ 
b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataFrame.scala
@@ -20,7 +20,7 @@ package org.apache.carbondata.spark.testsuite.dataload
 import java.math.BigDecimal
 
 import org.apache.spark.sql.test.util.QueryTest
-import org.apache.spark.sql.types.{DecimalType, DoubleType, StringType, 
StructField, StructType}
+import org.apache.spark.sql.types._
 import org.apache.spark.sql.{DataFrame, Row, SaveMode}
 import org.scalatest.BeforeAndAfterAll
 
@@ -28,6 +28,7 @@ class TestLoadDataFrame extends QueryTest with 
BeforeAndAfterAll {
   var df: DataFrame = _
   var dataFrame: DataFrame = _
   var df2: DataFrame = _
+  var booldf:DataFrame = _
 
 
   def buildTestData() = {
@@ -49,6 +50,15 @@ class TestLoadDataFrame extends QueryTest with 
BeforeAndAfterAll {
 df2 = sqlContext.sparkContext.parallelize(1 to 1000)
   .map(x => ("key_" + x, "str_" + x, x, x * 2, x * 3))
   .toDF("c1", "c2", "c3", "c4", "c5")
+
+val boolrdd = sqlContext.sparkContext.parallelize(
+  Row("anubhav",true) ::
+Row("prince",false) :: Nil)
+
+val boolSchema = StructType(
+  StructField("name", StringType, nullable = false) ::
+StructField("isCarbonEmployee",BooleanType,nullable = false)::Nil)
+booldf = sqlContext.createDataFrame(boolrdd,boolSchema)
   }
 
   def dropTable() = {
@@ -61,6 +71,8 @@ class TestLoadDataFrame extends QueryTest with 
BeforeAndAfterAll {
 sql("DROP TABLE IF EXISTS carbon7")
 sql("DROP TABLE IF EXISTS carbon8")
 sql("DROP TABLE IF EXISTS carbon9")
+sql("DROP TABLE IF EXISTS carbon10")
+
   }
 
 
@@ -70,7 +82,18 @@ class TestLoadDataFrame extends QueryTest with 
BeforeAndAfterAll {
 buildTestData
   }
 
-
+test("test the boolean data type"){
+  booldf.write
+.format("carbondata")
+.option("tableName", "carbon10")
+.option("tempCSV", "true")
+.option("compress", "true")
+.mode(SaveMode.Overwrite)
+.save()
+  checkAnswer(
+sql("SELECT * FROM CARBON10"),
+Seq(Row("anubhav", true), Row("prince", false)))
+}
 
   test("test load dataframe with saving compressed csv files") {
 // save dataframe to carbon file

http://git-wip-us.apache.org/repos/asf/carbondata/blob/91355ef7/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonDataFrameWriter.scala
--
diff --git a/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonDataFrameWriter.scala b/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonDataFrameWriter.scala
index b74576d..89b618f 100644
--- a/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonDataFrameWriter.scala
+++ b/integration/spark2/src/main/scala/org/apache/spark/sql/CarbonDataFrameWriter.scala
@@ -153,6 +153,7 @@ class CarbonDataFrameWriter(sqlContext: SQLContext, val dataFrame: DataFrame) {
   case TimestampType => CarbonType.TIMESTAMP.getName
   case DateType => CarbonType.DATE.getName
   case decimal: DecimalType => s"decimal(${decimal.precision}, ${decimal.scale})"
+  case BooleanType => CarbonType.BOOLEAN.getName
   case other => sys.error(s"unsupported type: $other")
 }
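The added case extends the Spark-to-Carbon type mapping so that BooleanType resolves to Carbon's boolean type instead of falling through to the unsupported-type error. A minimal Java sketch of the same dispatch (the string-keyed table here is purely illustrative; the real writer matches on Spark DataType objects and delegates to CarbonType constants):

```java
import java.util.Map;

public class TypeMapping {
    // Illustrative Spark-to-Carbon type-name table; the actual writer
    // pattern-matches Spark DataType instances, not strings.
    static final Map<String, String> SPARK_TO_CARBON = Map.of(
        "TimestampType", "timestamp",
        "DateType", "date",
        "BooleanType", "boolean"   // the case added by this commit
    );

    static String toCarbonType(String sparkType) {
        String carbon = SPARK_TO_CARBON.get(sparkType);
        if (carbon == null) {
            // Mirrors the sys.error fall-through for unsupported types.
            throw new IllegalArgumentException("unsupported type: " + sparkType);
        }
        return carbon;
    }

    public static void main(String[] args) {
        System.out.println(toCarbonType("BooleanType")); // prints boolean
    }
}
```

Without the new entry, a boolean column would hit the unsupported-type branch exactly as other unmapped types do.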
  

[14/28] carbondata git commit: [CARBONDATA-1750] Fix NPE when tablestatus file is empty

2017-11-18 Thread jackylk
[CARBONDATA-1750] Fix NPE when tablestatus file is empty

Fix NPE when tablestatus file is empty

This closes #1517


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/52bf7c81
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/52bf7c81
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/52bf7c81

Branch: refs/heads/fgdatamap
Commit: 52bf7c81c8ecc632bdfaee6225a2e83ca697c475
Parents: c3e326e
Author: QiangCai 
Authored: Fri Nov 17 10:45:13 2017 +0800
Committer: Jacky Li 
Committed: Fri Nov 17 15:54:26 2017 +0800

--
 .../carbondata/core/statusmanager/SegmentStatusManager.java| 6 ++
 1 file changed, 6 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/52bf7c81/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java b/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
index 1944f96..2409219 100644
--- a/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
+++ b/core/src/main/java/org/apache/carbondata/core/statusmanager/SegmentStatusManager.java
@@ -205,11 +205,17 @@ public class SegmentStatusManager {
   listOfLoadFolderDetailsArray =
   gsonObjectToRead.fromJson(buffReader, LoadMetadataDetails[].class);
 } catch (IOException e) {
+  LOG.error(e, "Failed to read metadata of load");
   return new LoadMetadataDetails[0];
 } finally {
   closeStreams(buffReader, inStream, dataInputStream);
 }
 
+// if listOfLoadFolderDetailsArray is null, return empty array
+if (null == listOfLoadFolderDetailsArray) {
+  return new LoadMetadataDetails[0];
+}
+
 return listOfLoadFolderDetailsArray;
   }
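The extra guard matters because Gson's fromJson returns null, rather than throwing, when the input stream is empty, so an empty tablestatus file would otherwise lead the caller to dereference null. A stand-in sketch of the fixed control flow (plain string parsing replaces Gson here; the parse helper is hypothetical):

```java
public class StatusReader {
    // Stand-in for gson.fromJson: returns null for empty/blank input,
    // mirroring Gson's behaviour on an empty tablestatus file.
    static String[] parse(String json) {
        if (json == null || json.trim().isEmpty()) {
            return null;
        }
        return json.split(",");
    }

    static String[] readLoadMetadata(String content) {
        String[] details = parse(content);
        // The guard added by this fix: normalize null to an empty array
        // so callers never hit a NullPointerException.
        if (details == null) {
            return new String[0];
        }
        return details;
    }

    public static void main(String[] args) {
        System.out.println(readLoadMetadata("").length); // prints 0
    }
}
```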
 



[03/28] carbondata git commit: handled review comments

2017-11-18 Thread jackylk
handled review comments

Add column comment support during carbon create table; when the table is described and the comment is not mentioned, the default shown will be null.
Added a test case where the sort column is a boolean column.


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/9c9521b6
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/9c9521b6
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/9c9521b6

Branch: refs/heads/fgdatamap
Commit: 9c9521b683fe19458d243c813dc622d30e06074d
Parents: 17892b1
Author: akashrn5 
Authored: Tue Oct 24 15:56:53 2017 +0530
Committer: kumarvishal 
Committed: Thu Nov 16 20:41:00 2017 +0530

--
 .../TestCreateTableWithColumnComment.scala  | 54 
 .../CarbonDescribeFormattedCommand.scala| 19 ---
 .../sql/parser/CarbonSpark2SqlParser.scala  | 14 +++--
 .../BooleanDataTypesInsertTest.scala| 40 +++
 4 files changed, 115 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/9c9521b6/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableWithColumnComment.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableWithColumnComment.scala b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableWithColumnComment.scala
new file mode 100644
index 000..c291a6f
--- /dev/null
+++ b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableWithColumnComment.scala
@@ -0,0 +1,54 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.spark.testsuite.createTable
+
+import org.apache.spark.sql.test.util.QueryTest
+import org.scalatest.BeforeAndAfterAll
+
+/**
+ * Test functionality of create table with column comment
+ */
+class TestCreateTableWithColumnComment extends QueryTest with BeforeAndAfterAll {
+
+  override def beforeAll {
+sql("use default")
+sql("drop table if exists columnComment")
+sql("drop table if exists defaultComment")
+  }
+
+  test("test create table with column comment") {
+sql(
+  "create table columnComment(id int, name string comment \"This column is called name\") " +
+  "stored by 'carbondata'")
+checkExistence(sql("describe formatted columnComment"), true, "This column is called name")
+  }
+
+  test("test create table with default column comment value") {
+sql(
+  "create table defaultComment(id int, name string) " +
+  "stored by 'carbondata'")
+checkExistence(sql("describe formatted defaultComment"), true, "null")
+  }
+
+  override def afterAll {
+sql("use default")
+sql("drop table if exists columnComment")
+sql("drop table if exists defaultComment")
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/9c9521b6/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/CarbonDescribeFormattedCommand.scala
--
diff --git a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/CarbonDescribeFormattedCommand.scala b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/CarbonDescribeFormattedCommand.scala
index 519fbea..7dcad9a 100644
--- a/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/CarbonDescribeFormattedCommand.scala
+++ b/integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/CarbonDescribeFormattedCommand.scala
@@ -65,6 +65,7 @@ private[sql] case class CarbonDescribeFormattedCommand(
 val dims = relation.metaData.dims.map(x => x.toLowerCase)
 var results: Seq[(String, String, String)] = child.schema.fields.map { field =>
   val fieldName = field.name.toLowerCase
+

[16/28] carbondata git commit: [CARBONDATA-1745] Use default metastore path from Hive

2017-11-18 Thread jackylk
[CARBONDATA-1745] Use default metastore path from Hive

This closes #1513


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/dfc7442a
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/dfc7442a
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/dfc7442a

Branch: refs/heads/fgdatamap
Commit: dfc7442a483d839282edb6e1305c191aa60da65a
Parents: 75ec79e
Author: Jacky Li 
Authored: Fri Nov 17 15:01:43 2017 +0800
Committer: QiangCai 
Committed: Fri Nov 17 16:25:59 2017 +0800

--
 .../core/metadata/AbsoluteTableIdentifier.java  |  1 +
 .../carbondata/examples/AlterTableExample.scala |  3 +-
 .../examples/CarbonDataFrameExample.scala   |  1 -
 .../examples/CarbonPartitionExample.scala   |  1 -
 .../carbondata/examples/ExampleUtils.scala  |  1 -
 .../org/apache/spark/sql/CarbonSession.scala| 38 ++--
 .../spark/sql/hive/cli/CarbonSQLCLIDriver.scala |  3 +-
 7 files changed, 23 insertions(+), 25 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/dfc7442a/core/src/main/java/org/apache/carbondata/core/metadata/AbsoluteTableIdentifier.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/metadata/AbsoluteTableIdentifier.java b/core/src/main/java/org/apache/carbondata/core/metadata/AbsoluteTableIdentifier.java
index d5434d8..603a1c1 100644
--- a/core/src/main/java/org/apache/carbondata/core/metadata/AbsoluteTableIdentifier.java
+++ b/core/src/main/java/org/apache/carbondata/core/metadata/AbsoluteTableIdentifier.java
@@ -62,6 +62,7 @@ public class AbsoluteTableIdentifier implements Serializable {
 CarbonTableIdentifier identifier = new CarbonTableIdentifier(dbName, tableName, "");
 return new AbsoluteTableIdentifier(tablePath, identifier);
   }
+
   public String getTablePath() {
 return tablePath;
   }

http://git-wip-us.apache.org/repos/asf/carbondata/blob/dfc7442a/examples/spark2/src/main/scala/org/apache/carbondata/examples/AlterTableExample.scala
--
diff --git a/examples/spark2/src/main/scala/org/apache/carbondata/examples/AlterTableExample.scala b/examples/spark2/src/main/scala/org/apache/carbondata/examples/AlterTableExample.scala
index dd2a28a..472dc44 100644
--- a/examples/spark2/src/main/scala/org/apache/carbondata/examples/AlterTableExample.scala
+++ b/examples/spark2/src/main/scala/org/apache/carbondata/examples/AlterTableExample.scala
@@ -37,7 +37,6 @@ object AlterTableExample {
 
 val storeLocation = s"$rootPath/examples/spark2/target/store"
 val warehouse = s"$rootPath/examples/spark2/target/warehouse"
-val metastoredb = s"$rootPath/examples/spark2/target/metastore_db"
 
 CarbonProperties.getInstance()
   .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd")
@@ -49,7 +48,7 @@ object AlterTableExample {
   .master("local")
   .appName("AlterTableExample")
   .config("spark.sql.warehouse.dir", warehouse)
-  .getOrCreateCarbonSession(storeLocation, metastoredb)
+  .getOrCreateCarbonSession(storeLocation)
 
 spark.sparkContext.setLogLevel("WARN")
 

http://git-wip-us.apache.org/repos/asf/carbondata/blob/dfc7442a/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonDataFrameExample.scala
--
diff --git a/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonDataFrameExample.scala b/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonDataFrameExample.scala
index ac198d8..2450b49 100644
--- a/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonDataFrameExample.scala
+++ b/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonDataFrameExample.scala
@@ -31,7 +31,6 @@ object CarbonDataFrameExample {
 + "../../../..").getCanonicalPath
 val storeLocation = s"$rootPath/examples/spark2/target/store"
 val warehouse = s"$rootPath/examples/spark2/target/warehouse"
-val metastoredb = s"$rootPath/examples/spark2/target"
 
 CarbonProperties.getInstance()
   .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd")

http://git-wip-us.apache.org/repos/asf/carbondata/blob/dfc7442a/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonPartitionExample.scala
--
diff --git a/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonPartitionExample.scala b/examples/spark2/src/main/scala/org/apache/carbondata/examples/CarbonPartitionExample.scala
index d8aca6b..6837c56 100644
--- 

[13/28] carbondata git commit: [HOTFIX] change to use store path in property in testcase

2017-11-18 Thread jackylk
[HOTFIX] change to use store path in property in testcase

This closes #1522


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/c3e326e0
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/c3e326e0
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/c3e326e0

Branch: refs/heads/fgdatamap
Commit: c3e326e021e4d530b0d30a82c55a09f40a34c2fe
Parents: 0f46ef0
Author: Jacky Li 
Authored: Fri Nov 17 15:45:26 2017 +0800
Committer: QiangCai 
Committed: Fri Nov 17 15:50:26 2017 +0800

--
 .../cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/c3e326e0/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala
--
diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala
index f49e475..93971b0 100644
--- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala
+++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala
@@ -47,7 +47,7 @@ class CarbonV1toV3CompatabilityTestCase extends QueryTest with BeforeAndAfterAll
   .appName("CarbonV1toV3CompatabilityTestCase")
   .config("spark.driver.host", "localhost")
   .getOrCreateCarbonSession(storeLocation, metaLocation).asInstanceOf[CarbonSession]
-println("store path from env : " + CarbonEnv.getInstance(localspark).storePath)
+println("store path : " + CarbonProperties.getStorePath)
 localspark.sparkContext.setLogLevel("WARN")
 localspark.sessionState.asInstanceOf[CarbonSessionState].metadataHive
   .runSqlHive(



[2/2] carbondata git commit: [CARBONDATA-1544][Datamap] Datamap FineGrain implementation

2017-11-18 Thread jackylk
[CARBONDATA-1544][Datamap] Datamap FineGrain implementation

Implemented interfaces for the FG datamap and integrated them into the filter scanner to use the pruned bitset from the FG datamap.
The FG query flow is as follows:
1. The user can add an FG datamap to any table and implement its interfaces.
2. Any filter query that hits a table with a datamap will call the prune method of the FG datamap.
3. The prune method of the FG datamap returns a list of FineGrainBlocklet; these blocklets contain block, blocklet, page and rowid information.
4. The pruned blocklets are internally written to a file, and only the block, blocklet and file path information is returned as part of the splits.
5. Based on the splits, the scan RDD schedules the tasks.
6. In the filter scanner we check the datamap writer path from the split, read the bitset if it exists, and pass this bitset as input to the scan.

This closes #1471
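Step 6 of the flow above hinges on persisting a pruned bitset and reading it back on the scanner side. A simplified sketch of that round trip using java.util.BitSet (the temp-file path and raw byte format are illustrative only, not CarbonData's actual BlockletSerializer):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.BitSet;

public class BitSetRoundTrip {
    // Writer side: persist the rows selected by the FG datamap prune.
    static void writeBitSet(Path path, BitSet rows) throws IOException {
        Files.write(path, rows.toByteArray());
    }

    // Scanner side: if a datamap writer path exists on the split,
    // read the bitset back and use it to skip non-matching rows.
    static BitSet readBitSet(Path path) throws IOException {
        return BitSet.valueOf(Files.readAllBytes(path));
    }

    public static void main(String[] args) throws IOException {
        BitSet pruned = new BitSet();
        pruned.set(3);   // rows that survived fine-grain pruning
        pruned.set(42);
        Path tmp = Files.createTempFile("fg-datamap", ".bits");
        writeBitSet(tmp, pruned);
        BitSet restored = readBitSet(tmp);
        System.out.println(restored.get(42) && !restored.get(7)); // prints true
        Files.delete(tmp);
    }
}
```

Writing the bitset out instead of shipping it inside the split keeps the splits small; only the file path travels with the task.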


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/40259b36
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/40259b36
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/40259b36

Branch: refs/heads/fgdatamap
Commit: 40259b3628803dc8bf801efb867120f0670f8e90
Parents: a21bdbb
Author: ravipesala 
Authored: Wed Nov 15 19:48:40 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 23:16:23 2017 +0800

--
 .../carbondata/core/datamap/DataMapMeta.java|   8 +-
 .../core/datamap/DataMapStoreManager.java   |  30 +-
 .../carbondata/core/datamap/DataMapType.java|  21 +
 .../carbondata/core/datamap/TableDataMap.java   |  31 +-
 .../core/datamap/dev/AbstractDataMapWriter.java | 110 +
 .../core/datamap/dev/BlockletSerializer.java|  57 +++
 .../carbondata/core/datamap/dev/DataMap.java|   4 +-
 .../core/datamap/dev/DataMapFactory.java|  14 +-
 .../core/datamap/dev/DataMapWriter.java |  57 ---
 .../cgdatamap/AbstractCoarseGrainDataMap.java   |  24 +
 .../AbstractCoarseGrainDataMapFactory.java  |  34 ++
 .../dev/fgdatamap/AbstractFineGrainDataMap.java |  24 +
 .../AbstractFineGrainDataMapFactory.java|  38 ++
 .../carbondata/core/datastore/DataRefNode.java  |   7 +
 .../core/datastore/block/TableBlockInfo.java|  10 +
 .../impl/btree/AbstractBTreeLeafNode.java   |   5 +
 .../datastore/impl/btree/BTreeNonLeafNode.java  |   5 +
 .../carbondata/core/indexstore/Blocklet.java|  30 +-
 .../core/indexstore/BlockletDetailsFetcher.java |   8 +
 .../core/indexstore/ExtendedBlocklet.java   |  19 +-
 .../core/indexstore/FineGrainBlocklet.java  | 120 +
 .../blockletindex/BlockletDataMap.java  |  11 +-
 .../blockletindex/BlockletDataMapFactory.java   |  62 ++-
 .../BlockletDataRefNodeWrapper.java |  27 +-
 .../indexstore/blockletindex/IndexWrapper.java  |  18 +
 .../core/indexstore/schema/FilterType.java  |  24 -
 .../executer/ExcludeFilterExecuterImpl.java |   3 +
 .../executer/IncludeFilterExecuterImpl.java |   3 +
 .../core/scan/processor/BlocksChunkHolder.java  |   5 -
 .../core/scan/scanner/impl/FilterScanner.java   |   2 +
 .../apache/carbondata/core/util/CarbonUtil.java |  98 +
 .../datamap/examples/MinMaxDataMap.java |  20 +-
 .../datamap/examples/MinMaxDataMapFactory.java  |  49 ++-
 .../datamap/examples/MinMaxDataWriter.java  |  36 +-
 .../examples/MinMaxIndexBlockDetails.java   |  13 -
 .../carbondata/hadoop/CarbonInputFormat.java|   2 +-
 .../carbondata/hadoop/CarbonInputSplit.java |  20 +-
 .../hadoop/api/CarbonTableInputFormat.java  |  23 +-
 .../testsuite/datamap/CGDataMapTestCase.scala   | 357 +++
 .../testsuite/datamap/DataMapWriterSuite.scala  |  49 ++-
 .../testsuite/datamap/FGDataMapTestCase.scala   | 436 +++
 .../carbondata/spark/rdd/CarbonScanRDD.scala|   6 +-
 .../datamap/DataMapWriterListener.java  |  57 ++-
 .../store/CarbonFactDataHandlerModel.java   |  10 +-
 .../store/writer/AbstractFactDataWriter.java| 126 +-
 45 files changed, 1732 insertions(+), 381 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/40259b36/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
index 7746acf..dd15ccb 100644
--- a/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
+++ b/core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java
@@ -19,15 +19,15 @@ package org.apache.carbondata.core.datamap;
 
 import java.util.List;
 
-import org.apache.carbondata.core.indexstore.schema.FilterType;
+import 

[1/2] carbondata git commit: [CARBONDATA-1544][Datamap] Datamap FineGrain implementation

2017-11-18 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/fgdatamap a21bdbb43 -> 40259b362


http://git-wip-us.apache.org/repos/asf/carbondata/blob/40259b36/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
--
diff --git a/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java b/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
index 78544d3..fe0bbcf 100644
--- a/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
+++ b/datamap/examples/src/minmaxdatamap/main/java/org/apache/carbondata/datamap/examples/MinMaxDataWriter.java
@@ -19,7 +19,6 @@ package org.apache.carbondata.datamap.examples;
 
 import java.io.BufferedWriter;
 import java.io.DataOutputStream;
-import java.io.File;
 import java.io.IOException;
 import java.io.OutputStreamWriter;
 import java.util.ArrayList;
@@ -29,17 +28,18 @@ import java.util.Map;
 
 import org.apache.carbondata.common.logging.LogService;
 import org.apache.carbondata.common.logging.LogServiceFactory;
-import org.apache.carbondata.core.constants.CarbonCommonConstants;
-import org.apache.carbondata.core.datamap.dev.DataMapWriter;
+import org.apache.carbondata.core.datamap.dev.AbstractDataMapWriter;
 import org.apache.carbondata.core.datastore.impl.FileFactory;
 import org.apache.carbondata.core.datastore.page.ColumnPage;
+import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
 import org.apache.carbondata.core.metadata.schema.table.TableInfo;
 import org.apache.carbondata.core.util.ByteUtil;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.path.CarbonTablePath;
 
 import com.google.gson.Gson;
 
-public class MinMaxDataWriter implements DataMapWriter {
+public class MinMaxDataWriter extends AbstractDataMapWriter {
 
   private static final LogService LOGGER =
   LogServiceFactory.getLogService(TableInfo.class.getName());
@@ -50,17 +50,23 @@ public class MinMaxDataWriter implements DataMapWriter {
 
   private Map blockMinMaxMap;
 
-  private String blockPath;
+  private String dataWritePath;
 
+  public MinMaxDataWriter(AbsoluteTableIdentifier identifier, String segmentId,
+  String dataWritePath) {
+super(identifier, segmentId, dataWritePath);
+this.identifier = identifier;
+this.segmentId = segmentId;
+this.dataWritePath = dataWritePath;
+  }
 
-  @Override public void onBlockStart(String blockId, String blockPath) {
+  @Override public void onBlockStart(String blockId) {
 pageLevelMax = null;
 pageLevelMin = null;
 blockletLevelMax = null;
 blockletLevelMin = null;
 blockMinMaxMap = null;
 blockMinMaxMap = new HashMap();
-this.blockPath = blockPath;
   }
 
   @Override public void onBlockEnd(String blockId) {
@@ -161,7 +167,7 @@ public class MinMaxDataWriter implements DataMapWriter {
 List tempMinMaxIndexBlockDetails = null;
 tempMinMaxIndexBlockDetails = loadBlockDetails();
 try {
-  writeMinMaxIndexFile(tempMinMaxIndexBlockDetails, blockPath, blockId);
+  writeMinMaxIndexFile(tempMinMaxIndexBlockDetails, blockId);
 } catch (IOException ex) {
   LOGGER.info(" Unable to write the file");
 }
@@ -178,7 +184,6 @@ public class MinMaxDataWriter implements DataMapWriter {
   
tmpminMaxIndexBlockDetails.setMinValues(blockMinMaxMap.get(index).getMin());
   
tmpminMaxIndexBlockDetails.setMaxValues(blockMinMaxMap.get(index).getMax());
   tmpminMaxIndexBlockDetails.setBlockletId(index);
-  tmpminMaxIndexBlockDetails.setFilePath(this.blockPath);
   minMaxIndexBlockDetails.add(tmpminMaxIndexBlockDetails);
 }
 return minMaxIndexBlockDetails;
@@ -187,22 +192,19 @@ public class MinMaxDataWriter implements DataMapWriter {
   /**
* Write the data to a file. This is JSON format file.
* @param minMaxIndexBlockDetails
-   * @param blockPath
* @param blockId
* @throws IOException
*/
   public void writeMinMaxIndexFile(List<MinMaxIndexBlockDetails> minMaxIndexBlockDetails,
-  String blockPath, String blockId) throws IOException {
-String filePath = blockPath.substring(0, blockPath.lastIndexOf(File.separator) + 1) + blockId
-+ ".minmaxindex";
+  String blockId) throws IOException {
+String filePath = dataWritePath +"/" + blockId + ".minmaxindex";
 BufferedWriter brWriter = null;
 DataOutputStream dataOutStream = null;
 try {
   FileFactory.createNewFile(filePath, FileFactory.getFileType(filePath));
   dataOutStream = FileFactory.getDataOutputStream(filePath, 
FileFactory.getFileType(filePath));
   Gson gsonObjectToWrite = new Gson();
-  brWriter = new BufferedWriter(new OutputStreamWriter(dataOutStream,
-  

Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1626

2017-11-18 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: carbondata-master-spark-2.1 #1626

2017-11-18 Thread Apache Jenkins Server
See 




Jenkins build became unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #1626

2017-11-18 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1625

2017-11-18 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: carbondata-master-spark-2.1 #1625

2017-11-18 Thread Apache Jenkins Server
See 




carbondata git commit: [CARBONDATA-1762] Remove existing column level dateformat and support dateformat, timestampformat in the load option

2017-11-18 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/master 2a12938b5 -> 85dc4fff0


[CARBONDATA-1762] Remove existing column level dateformat and support dateformat, timestampformat in the load option

(1) Remove column level dateformat option
(2) Support dateformat and timestampformat in load options(table level)

This closes #1524


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/85dc4fff
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/85dc4fff
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/85dc4fff

Branch: refs/heads/master
Commit: 85dc4fff0ecca160654085379310a1c3096731f7
Parents: 2a12938
Author: akashrn5 
Authored: Fri Nov 17 16:55:33 2017 +0530
Committer: Jacky Li 
Committed: Sat Nov 18 16:34:43 2017 +0800

--
 .../constants/CarbonLoadOptionConstants.java| 10 +-
 .../carbondata/core/util/DataTypeUtil.java  |  2 +-
 .../TestLoadDataWithDiffTimestampFormat.scala   | 31 
 .../carbondata/spark/load/ValidateUtil.scala| 38 +---
 .../carbondata/spark/util/DataLoadingUtil.scala | 13 ++-
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |  2 +-
 .../spark/rdd/CarbonDataRDDFactory.scala| 24 +++--
 .../processing/loading/DataField.java   | 10 ++
 .../loading/DataLoadProcessBuilder.java | 10 +++---
 .../DirectDictionaryFieldConverterImpl.java | 10 +-
 .../impl/NonDictionaryFieldConverterImpl.java   |  8 -
 .../loading/model/CarbonLoadModel.java  | 13 +++
 .../util/CarbonDataProcessorUtil.java   | 19 --
 13 files changed, 114 insertions(+), 76 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
index ac278d9..e78d125 100644
--- a/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
+++ b/core/src/main/java/org/apache/carbondata/core/constants/CarbonLoadOptionConstants.java
@@ -46,12 +46,20 @@ public final class CarbonLoadOptionConstants {
   public static final String CARBON_OPTIONS_IS_EMPTY_DATA_BAD_RECORD_DEFAULT = "false";
 
   /**
-   * option to specify the load option
+   * option to specify the dateFormat in load option for all date columns in table
*/
   @CarbonProperty
   public static final String CARBON_OPTIONS_DATEFORMAT =
   "carbon.options.dateformat";
   public static final String CARBON_OPTIONS_DATEFORMAT_DEFAULT = "";
+
+  /**
+   * option to specify the timestampFormat in load option for all timestamp columns in table
+   */
+  @CarbonProperty
+  public static final String CARBON_OPTIONS_TIMESTAMPFORMAT =
+  "carbon.options.timestampformat";
+  public static final String CARBON_OPTIONS_TIMESTAMPFORMAT_DEFAULT = "";
   /**
* option to specify the sort_scope
*/

http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
--
diff --git a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
index 0961a63..3a25988 100644
--- a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
+++ b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java
@@ -319,7 +319,7 @@ public final class DataTypeUtil {
   Date dateToStr = null;
   DateFormat dateFormatter = null;
   try {
-if (null != dateFormat) {
+if (null != dateFormat && !dateFormat.trim().isEmpty()) {
   dateFormatter = new SimpleDateFormat(dateFormat);
 } else {
   dateFormatter = timeStampformatter.get();
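The tightened condition treats a blank dateformat load option the same as an absent one, so both fall back to the default formatter instead of constructing a SimpleDateFormat from an empty pattern. A minimal sketch of that fallback (the default pattern chosen here is illustrative; the real default comes from CarbonCommonConstants):

```java
import java.text.SimpleDateFormat;

public class FormatFallback {
    // Illustrative default; CarbonData reads its default pattern
    // from CarbonCommonConstants, not from this literal.
    static final String DEFAULT_PATTERN = "yyyy-MM-dd HH:mm:ss";

    static SimpleDateFormat formatterFor(String dateFormat) {
        // Blank or null load-option values fall back to the default,
        // matching the fixed null/empty check in DataTypeUtil.
        if (dateFormat != null && !dateFormat.trim().isEmpty()) {
            return new SimpleDateFormat(dateFormat);
        }
        return new SimpleDateFormat(DEFAULT_PATTERN);
    }

    public static void main(String[] args) {
        System.out.println(formatterFor("").toPattern());          // falls back to the default
        System.out.println(formatterFor("yyyy/MM/dd").toPattern()); // uses the user option
    }
}
```

Without the isEmpty check, an empty string option would produce a formatter with an empty pattern and silently mis-parse every value.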

http://git-wip-us.apache.org/repos/asf/carbondata/blob/85dc4fff/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala b/integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/dataload/TestLoadDataWithDiffTimestampFormat.scala
index 71d6466..906f05a 100644
--- 

carbondata git commit: [CARBONDATA-1751] Modify sys.err to AnalysisException when users run related operations except IUD, compaction and alter
2017-11-18 Thread jackylk
Repository: carbondata
Updated Branches:
  refs/heads/master 1b8d348c0 -> 2a12938b5


[CARBONDATA-1751] Modify sys.err to AnalysisException when users run related operations except IUD, compaction and alter

Carbon printed an improper error message; for example, it printed a system error when users ran create table with a duplicate column name, when it should print the related exception information.

So we modified sys.error to AnalysisException for related operations except IUD, compaction and alter

This closes #1515
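The duplicate-column check that now raises AnalysisException is essentially a case-insensitive uniqueness test over column names, as the 'BB INT, bb char(10)' test case shows. A hedged Java sketch of that validation (IllegalArgumentException stands in for Spark's AnalysisException; the helper name is hypothetical):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Set;

public class DuplicateColumnCheck {
    // Case-insensitive duplicate detection over declared column names.
    static void validateColumns(List<String> names) {
        Set<String> seen = new HashSet<>();
        for (String name : names) {
            if (!seen.add(name.toLowerCase(Locale.ROOT))) {
                // The real code throws org.apache.spark.sql.AnalysisException;
                // a standard exception stands in here.
                throw new IllegalArgumentException(
                    "Duplicate dimensions found with name: " + name);
            }
        }
    }

    public static void main(String[] args) {
        try {
            validateColumns(Arrays.asList("BB", "bb")); // same name, different case
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints Duplicate dimensions found with name: bb
        }
    }
}
```

Raising an analysis-time exception instead of sys.error gives the user a parse/analysis error with the offending name rather than an opaque system error.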


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/2a12938b
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/2a12938b
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/2a12938b

Branch: refs/heads/master
Commit: 2a12938b545cca5e3c09396dd68393ea615038fa
Parents: 1b8d348
Author: xubo245 <601450...@qq.com>
Authored: Fri Nov 17 10:48:31 2017 +0800
Committer: Jacky Li 
Committed: Sat Nov 18 16:24:23 2017 +0800

--
 .../command/CarbonTableSchemaCommonSuite.scala  | 72 
 .../org/apache/carbondata/api/CarbonStore.scala |  4 +-
 .../carbondata/spark/util/CommonUtil.scala  | 17 ++---
 .../spark/util/DataTypeConverterUtil.scala  |  4 +-
 .../catalyst/AbstractCarbonSparkSQLParser.scala |  3 +-
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |  4 +-
 .../command/carbonTableSchemaCommon.scala   | 10 +--
 .../apache/spark/sql/util/CarbonException.scala | 24 +++
 .../sql/parser/CarbonSpark2SqlParser.scala  |  2 +-
 .../apache/spark/util/CarbonCommandSuite.scala  |  2 +-
 10 files changed, 124 insertions(+), 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/2a12938b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
--
diff --git a/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
new file mode 100644
index 000..67dfa8f
--- /dev/null
+++ b/integration/spark-common-test/src/test/scala/org/apache/spark/sql/execution/command/CarbonTableSchemaCommonSuite.scala
@@ -0,0 +1,72 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.command
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.test.util.QueryTest
+import org.junit.Assert
+import org.scalatest.BeforeAndAfterAll
+
+class CarbonTableSchemaCommonSuite extends QueryTest with BeforeAndAfterAll {
+
+  test("Creating table: Duplicate dimensions found with name, it should throw AnalysisException") {
+sql("DROP TABLE IF EXISTS carbon_table")
+try {
+  sql(
+s"""
+   | CREATE TABLE carbon_table(
+   | BB INT, bb char(10)
+   | )
+   | STORED BY 'carbondata'
+   """.stripMargin)
+  Assert.assertTrue(false)
+} catch {
+  case _: AnalysisException => Assert.assertTrue(true)
+  case _: Exception => Assert.assertTrue(false)
+} finally {
+  sql("DROP TABLE IF EXISTS carbon_table")
+}
+  }
+
+  test("Altering table: Duplicate column found with name, it should throw RuntimeException") {
+sql("DROP TABLE IF EXISTS carbon_table")
+sql(
+  s"""
+ | CREATE TABLE if not exists carbon_table(
+ | BB INT, cc char(10)
+ | )
+ | STORED BY 'carbondata'
+   """.stripMargin)
+
+try {
+  sql(
+s"""
+   | alter TABLE carbon_table add columns(
+   | bb char(10)
+)
+   """.stripMargin)
+  Assert.assertTrue(false)
+} catch {
+  case _: RuntimeException => Assert.assertTrue(true)
+  case _: Exception => Assert.assertTrue(false)
+} finally {
+  sql("DROP TABLE IF EXISTS carbon_table")
+}
+  }
+
+}


Jenkins build became unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark Common Test #1624

2017-11-18 Thread Apache Jenkins Server
See 




Jenkins build became unstable: carbondata-master-spark-2.1 #1624

2017-11-18 Thread Apache Jenkins Server
See