[GitHub] carbondata pull request #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column p...

2017-06-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/1121


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] carbondata issue #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column page for...

2017-06-30 Thread QiangCai
Github user QiangCai commented on the issue:

https://github.com/apache/carbondata/pull/1121
  
LGTM




[GitHub] carbondata pull request #1122: [CARBONDATA-1253] Sort_columns should not sup...

2017-06-30 Thread QiangCai
Github user QiangCai commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1122#discussion_r125154538
  
--- Diff: integration/spark-common/src/main/scala/org/apache/spark/sql/catalyst/CarbonDDLSqlParser.scala ---
@@ -691,6 +691,14 @@ abstract class CarbonDDLSqlParser extends AbstractCarbonSparkSQLParser {
   }
 
   /**
+   * detects whether datatype is part of sort_column
+   */
+  def isDataTypeSupportedForSortColumn(columnDataType: String): Boolean = {
--- End diff --

fixed




[GitHub] carbondata issue #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column page for...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1121
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/257/





[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154345
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/ConditionalFilterResolverImpl.java
 ---
@@ -198,21 +237,31 @@ public AbsoluteTableIdentifier getTableIdentifier() {
*/
   public void getStartKey(SegmentProperties segmentProperties, long[] 
startKey,
   SortedMap setOfStartKeyByteArray, List 
startKeyList) {
-
FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
-segmentProperties, startKey, startKeyList);
-
FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo,
-segmentProperties, setOfStartKeyByteArray);
+if (null != dimColResolvedFilterInfo) {
+  
FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
+  segmentProperties, startKey, startKeyList);
+  
FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, 
segmentProperties,
+  setOfStartKeyByteArray);
+}
+// else {
+//  
FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
+//  segmentProperties, startKey, startKeyList);
+//  
FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, 
segmentProperties,
+//  setOfStartKeyByteArray);
+//}
--- End diff --

remove commented code




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154321
  
--- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
@@ -107,6 +131,12 @@ public static Comparator getComparatorByDataType(DataType dataType) {
 }
   }
 
+  static class DecimalComparator implements Comparator {
--- End diff --

What is the use of this comparator? Please remove it if it is not used.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154292
  
--- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
@@ -76,24 +99,25 @@ public static Comparator getComparatorByDataType(DataType dataType) {
 
   static class DoubleComparator implements Comparator {
 @Override public int compare(Object key1, Object key2) {
-  double result = (double) key1 - (double) key2;
-  if (result < 0) {
+  double key1Double1 = (double)key1;
--- End diff --

Why does this logic need to change? The old logic seems fine, right?




[GitHub] carbondata issue #1017: [CARBONDATA-1105] remove spark.version in submodule

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1017
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2836/





[GitHub] carbondata issue #1017: [CARBONDATA-1105] remove spark.version in submodule

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1017
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/256/





[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154262
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java
 ---
@@ -77,72 +89,188 @@ private void ifDefaultValueMatchesFilter() {
   }
 }
   }
+} else if (!msrColEvalutorInfoList.isEmpty() && 
!isMeasurePresentInCurrentBlock[0]) {
+  CarbonMeasure measure = 
this.msrColEvalutorInfoList.get(0).getMeasure();
+  byte[] defaultValue = measure.getDefaultValue();
+  if (null != defaultValue) {
+for (int k = 0; k < filterRangeValues.length; k++) {
+  int maxCompare =
+  
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
+  if (maxCompare > 0) {
+isDefaultValuePresentInFilter = true;
+break;
+  }
+}
+  }
 }
   }
 
   @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] 
blockMinValue) {
 BitSet bitSet = new BitSet(1);
-boolean isScanRequired =
-isScanRequired(blockMinValue[dimensionBlocksIndex[0]], 
filterRangeValues);
+byte[] minValue = null;
+boolean isScanRequired = false;
+if (isMeasurePresentInCurrentBlock[0] || 
isDimensionPresentInCurrentBlock[0]) {
+  if (isMeasurePresentInCurrentBlock[0]) {
+minValue = blockMinValue[measureBlocksIndex[0] + 
lastDimensionColOrdinal];
+isScanRequired =
+isScanRequired(minValue, filterRangeValues, 
msrColEvalutorInfoList.get(0).getType());
+  } else {
+minValue = blockMinValue[dimensionBlocksIndex[0]];
+isScanRequired = isScanRequired(minValue, filterRangeValues);
+  }
+} else {
+  isScanRequired = isDefaultValuePresentInFilter;
+}
 if (isScanRequired) {
   bitSet.set(0);
 }
 return bitSet;
   }
 
+
   private boolean isScanRequired(byte[] blockMinValue, byte[][] 
filterValues) {
 boolean isScanRequired = false;
-if (isDimensionPresentInCurrentBlock[0]) {
-  for (int k = 0; k < filterValues.length; k++) {
-// and filter-min should be positive
-int minCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
+for (int k = 0; k < filterValues.length; k++) {
+  // and filter-min should be positive
+  int minCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
 
-// if any filter applied is not in range of min and max of block
-// then since its a less than fiter validate whether the block
-// min range is less  than applied filter member
-if (minCompare > 0) {
-  isScanRequired = true;
-  break;
-}
+  // if any filter applied is not in range of min and max of block
+  // then since its a less than equal to fiter validate whether the 
block
+  // min range is less than equal to applied filter member
+  if (minCompare > 0) {
+isScanRequired = true;
+break;
   }
-} else {
-  isScanRequired = isDefaultValuePresentInFilter;
 }
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] minValue, byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || minValue.length == 0) {
+return isScanRequired(minValue, filterValue);
+  }
+  switch (dataType) {
--- End diff --

Use the existing methods of `DataTypeUtil` and a comparator here.
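
A minimal sketch of what this suggests, reusing `DataTypeUtil.getMeasureObjectFromDataType` and `PartitionFilterUtil.getComparatorByDataType` (both quoted elsewhere in this digest) in place of the per-type switch; the method is assumed to live inside the executer class, and everything not quoted in the diffs is illustrative:

    // Hedged sketch, not the PR's actual code: decode the block min and each filter
    // member, then delegate the comparison to the shared data-type comparator.
    private boolean isScanRequired(byte[] minValue, byte[][] filterValues, DataType dataType) {
      Comparator comparator = PartitionFilterUtil.getComparatorByDataType(dataType);
      for (byte[] filterValue : filterValues) {
        if (filterValue.length == 0 || minValue.length == 0) {
          // fall back to the existing byte-wise check for empty representations
          return isScanRequired(minValue, filterValues);
        }
        Object filter = DataTypeUtil.getMeasureObjectFromDataType(filterValue, dataType);
        Object min = DataTypeUtil.getMeasureObjectFromDataType(minValue, dataType);
        // less-than filter: the block can match if any filter member is greater than the block min
        if (comparator.compare(filter, min) > 0) {
          return true;
        }
      }
      return false;
    }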




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154233
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java
 ---
@@ -91,57 +129,147 @@ private void ifDefaultValueMatchesFilter() {
 
   private boolean isScanRequired(byte[] blockMinValue, byte[][] 
filterValues) {
 boolean isScanRequired = false;
-if (isDimensionPresentInCurrentBlock[0]) {
-  for (int k = 0; k < filterValues.length; k++) {
-// and filter-min should be positive
-int minCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
+for (int k = 0; k < filterValues.length; k++) {
+  // and filter-min should be positive
+  int minCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
 
-// if any filter applied is not in range of min and max of block
-// then since its a less than equal to fiter validate whether the 
block
-// min range is less than equal to applied filter member
-if (minCompare >= 0) {
-  isScanRequired = true;
-  break;
-}
+  // if any filter applied is not in range of min and max of block
+  // then since its a less than equal to fiter validate whether the 
block
+  // min range is less than equal to applied filter member
+  if (minCompare >= 0) {
+isScanRequired = true;
+break;
   }
-} else {
-  isScanRequired = isDefaultValuePresentInFilter;
 }
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] minValue, byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || minValue.length == 0) {
+return isScanRequired(minValue, filterValue);
+  }
+  switch (dataType) {
--- End diff --

Use the existing methods of `DataTypeUtil` and a comparator here.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154226
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtrThanEquaToFilterExecuterImpl.java
 ---
@@ -91,67 +131,167 @@ private void ifDefaultValueMatchesFilter() {
 
   private boolean isScanRequired(byte[] blockMaxValue, byte[][] 
filterValues) {
 boolean isScanRequired = false;
-if (isDimensionPresentInCurrentBlock[0]) {
-  for (int k = 0; k < filterValues.length; k++) {
-// filter value should be in range of max and min value i.e
-// max>filtervalue>min
-// so filter-max should be negative
-int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
-// if any filter value is in range than this block needs to be
-// scanned less than equal to max range.
-if (maxCompare <= 0) {
-  isScanRequired = true;
-  break;
-}
+for (int k = 0; k < filterValues.length; k++) {
+  // filter value should be in range of max and min value i.e
+  // max>filtervalue>min
+  // so filter-max should be negative
+  int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
+  // if any filter value is in range than this block needs to be
+  // scanned less than equal to max range.
+  if (maxCompare <= 0) {
+isScanRequired = true;
+break;
   }
-} else {
-  isScanRequired = isDefaultValuePresentInFilter;
 }
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || maxValue.length == 0) {
+return isScanRequired(maxValue, filterValue);
+  }
+  switch (dataType) {
+case DOUBLE:
+  double maxValueDouble = ByteBuffer.wrap(maxValue).getDouble();
+  double filterValueDouble = 
ByteBuffer.wrap(filterValue[i]).getDouble();
+  if (filterValueDouble <= maxValueDouble) {
+return true;
+  }
+  break;
+case INT:
+case SHORT:
+case LONG:
+  long maxValueLong = ByteBuffer.wrap(maxValue).getLong();
+  long filterValueLong = ByteBuffer.wrap(filterValue[i]).getLong();
+  if (filterValueLong <= maxValueLong) {
+return true;
+  }
+  break;
+case DECIMAL:
+  BigDecimal maxDecimal = DataTypeUtil.byteToBigDecimal(maxValue);
+  BigDecimal filterDecimal = 
DataTypeUtil.byteToBigDecimal(filterValue[i]);
+  if (filterDecimal.compareTo(maxDecimal) <= 0) {
+return true;
+  }
+  }
+}
+return false;
+  }
+
   @Override public BitSetGroup applyFilter(BlocksChunkHolder 
blockChunkHolder)
   throws FilterUnsupportedException, IOException {
 // select all rows if dimension does not exists in the current block
-if (!isDimensionPresentInCurrentBlock[0]) {
+if (!isDimensionPresentInCurrentBlock[0] && 
!isMeasurePresentInCurrentBlock[0]) {
   int numberOfRows = blockChunkHolder.getDataBlock().nodeSize();
   return FilterUtil
   
.createBitSetGroupWithDefaultValue(blockChunkHolder.getDataBlock().numberOfPages(),
   numberOfRows, true);
 }
-int blockIndex =
-
segmentProperties.getDimensionOrdinalToBlockMapping().get(dimensionBlocksIndex[0]);
-if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
-  blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = 
blockChunkHolder.getDataBlock()
-  .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
-}
-DimensionRawColumnChunk rawColumnChunk =
-blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
-BitSetGroup bitSetGroup = new 
BitSetGroup(rawColumnChunk.getPagesCount());
-for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
-  if (rawColumnChunk.getMaxValues() != null) {
-if (isScanRequired(rawColumnChunk.getMaxValues()[i], 
this.filterRangeValues)) {
-  int compare = ByteUtil.UnsafeComparer.INSTANCE
-  .compareTo(filterRangeValues[0], 
rawColumnChunk.getMinValues()[i]);
-  if (compare <= 0) {
-BitSet bitSet = new BitSet(rawColumnChunk.getRowCount()[i]);
-bitSet.flip(0, rawColumnChunk.getRowCount()[i]);
-bitSetGroup.setBitSet(bitSet, i);
-  } else {
 

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154219
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtrThanEquaToFilterExecuterImpl.java
 ---
@@ -91,67 +131,167 @@ private void ifDefaultValueMatchesFilter() {
 
   private boolean isScanRequired(byte[] blockMaxValue, byte[][] 
filterValues) {
 boolean isScanRequired = false;
-if (isDimensionPresentInCurrentBlock[0]) {
-  for (int k = 0; k < filterValues.length; k++) {
-// filter value should be in range of max and min value i.e
-// max>filtervalue>min
-// so filter-max should be negative
-int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
-// if any filter value is in range than this block needs to be
-// scanned less than equal to max range.
-if (maxCompare <= 0) {
-  isScanRequired = true;
-  break;
-}
+for (int k = 0; k < filterValues.length; k++) {
+  // filter value should be in range of max and min value i.e
+  // max>filtervalue>min
+  // so filter-max should be negative
+  int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
+  // if any filter value is in range than this block needs to be
+  // scanned less than equal to max range.
+  if (maxCompare <= 0) {
+isScanRequired = true;
+break;
   }
-} else {
-  isScanRequired = isDefaultValuePresentInFilter;
 }
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || maxValue.length == 0) {
+return isScanRequired(maxValue, filterValue);
+  }
+  switch (dataType) {
--- End diff --

Use the existing methods of `DataTypeUtil` and a comparator here.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154119
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java
 ---
@@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
   }
 }
   }
+} else if (!msrColEvalutorInfoList.isEmpty() && 
!isMeasurePresentInCurrentBlock[0]) {
+  CarbonMeasure measure = 
this.msrColEvalutorInfoList.get(0).getMeasure();
+  byte[] defaultValue = measure.getDefaultValue();
+  if (null != defaultValue) {
+for (int k = 0; k < filterRangeValues.length; k++) {
+  int maxCompare =
+  
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
+  if (maxCompare < 0) {
+isDefaultValuePresentInFilter = true;
+break;
+  }
+}
+  }
 }
   }
 
   @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] 
blockMinValue) {
 BitSet bitSet = new BitSet(1);
-boolean isScanRequired =
-isScanRequired(blockMaxValue[dimensionBlocksIndex[0]], 
filterRangeValues);
+boolean isScanRequired = false;
+byte[] maxValue = null;
+if (isMeasurePresentInCurrentBlock[0] || 
isDimensionPresentInCurrentBlock[0]) {
+  if (isMeasurePresentInCurrentBlock[0]) {
+maxValue = blockMaxValue[measureBlocksIndex[0] + 
lastDimensionColOrdinal];
+isScanRequired =
+isScanRequired(maxValue, filterRangeValues, 
msrColEvalutorInfoList.get(0).getType());
+  } else {
+maxValue = blockMaxValue[dimensionBlocksIndex[0]];
+isScanRequired = isScanRequired(maxValue, filterRangeValues);
+  }
+} else {
+  isScanRequired = isDefaultValuePresentInFilter;
+}
+
 if (isScanRequired) {
   bitSet.set(0);
 }
 return bitSet;
   }
 
+
   private boolean isScanRequired(byte[] blockMaxValue, byte[][] 
filterValues) {
 boolean isScanRequired = false;
-if (isDimensionPresentInCurrentBlock[0]) {
-  for (int k = 0; k < filterValues.length; k++) {
-// filter value should be in range of max and min value i.e
-// max>filtervalue>min
-// so filter-max should be negative
-int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
-// if any filter value is in range than this block needs to be
-// scanned means always less than block max range.
-if (maxCompare < 0) {
-  isScanRequired = true;
-  break;
-}
+for (int k = 0; k < filterValues.length; k++) {
+  // filter value should be in range of max and min value i.e
+  // max>filtervalue>min
+  // so filter-max should be negative
+  int maxCompare = 
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
+  // if any filter value is in range than this block needs to be
+  // scanned less than equal to max range.
+  if (maxCompare < 0) {
+isScanRequired = true;
+break;
   }
-} else {
-  isScanRequired = isDefaultValuePresentInFilter;
 }
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || maxValue.length == 0) {
+return isScanRequired(maxValue, filterValue);
+  }
+  switch (dataType) {
--- End diff --

Use the existing `DataTypeUtil` methods and a comparator here to compare.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125154090
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java
 ---
@@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
   }
 }
   }
+} else if (!msrColEvalutorInfoList.isEmpty() && 
!isMeasurePresentInCurrentBlock[0]) {
+  CarbonMeasure measure = 
this.msrColEvalutorInfoList.get(0).getMeasure();
+  byte[] defaultValue = measure.getDefaultValue();
+  if (null != defaultValue) {
+for (int k = 0; k < filterRangeValues.length; k++) {
+  int maxCompare =
+  
ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
--- End diff --

This comparison is wrong in the measure case. Always compare the actual values.
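
A rough sketch of the intent, using the same decode-then-compare approach as the other comments in this review; `msrColEvalutorInfoList.get(0).getType()` and the surrounding fields come from the quoted diff, the comparator lookup is the `PartitionFilterUtil` helper discussed above, and the rest is illustrative:

    // Hedged sketch: decode the measure default value and each filter member,
    // then compare the decoded values instead of their serialized byte arrays.
    DataType msrType = msrColEvalutorInfoList.get(0).getType();
    Comparator comparator = PartitionFilterUtil.getComparatorByDataType(msrType);
    Object defaultVal = DataTypeUtil.getMeasureObjectFromDataType(defaultValue, msrType);
    for (int k = 0; k < filterRangeValues.length; k++) {
      Object filterVal = DataTypeUtil.getMeasureObjectFromDataType(filterRangeValues[k], msrType);
      // greater-than filter: a filter member smaller than the default means the default can match
      if (comparator.compare(filterVal, defaultVal) < 0) {
        isDefaultValuePresentInFilter = true;
        break;
      }
    }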




[GitHub] carbondata issue #1122: [CARBONDATA-1253] Sort_columns should not support fl...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1122
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/255/





[GitHub] carbondata issue #1122: [CARBONDATA-1253] Sort_columns should not support fl...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1122
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2835/





[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153811
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/MeasureColumnExecuterFilterInfo.java
 ---
@@ -0,0 +1,30 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.core.scan.filter.executer;
+
+public class MeasureColumnExecuterFilterInfo {
+
+  byte[][] filterKeys;
--- End diff --

Use Object[]




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153772
  
--- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
@@ -186,12 +314,60 @@ private boolean isScanRequired(byte[] blkMaxVal, byte[] blkMinVal, byte[][] filt
 return isScanRequired;
   }
 
+  private boolean isScanRequired(byte[] maxValue, byte[] minValue, 
byte[][] filterValue,
+  DataType dataType) {
+for (int i = 0; i < filterValue.length; i++) {
+  if (filterValue[i].length == 0 || maxValue.length == 0 || 
minValue.length == 0) {
+return isScanRequired(maxValue, minValue, filterValue);
+  } else {
+switch (dataType) {
--- End diff --

Use the existing `DataTypeUtil` methods for conversions, and use the comparator that is used in the `applyFilter` method here.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153736
  
--- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
@@ -152,12 +261,31 @@ private BitSet setFilterdIndexToBitSet(DimensionColumnDataChunk dimensionColumnD
 
   public BitSet isScanRequired(byte[][] blkMaxVal, byte[][] blkMinVal) {
 BitSet bitSet = new BitSet(1);
-byte[][] filterValues = dimColumnExecuterInfo.getFilterKeys();
-int columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
-int blockIndex = 
segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
+byte[][] filterValues = null;
+int columnIndex = 0;
+int blockIndex = 0;
+boolean isScanRequired = false;
+
+if (isDimensionPresentInCurrentBlock == true) {
+  filterValues = dimColumnExecuterInfo.getFilterKeys();
+  columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
+  blockIndex = 
segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
+  isScanRequired =
+  isScanRequired(blkMaxVal[blockIndex], blkMinVal[blockIndex], 
filterValues);
+
+} else if (isMeasurePresentInCurrentBlock) {
+  filterValues = msrColumnExecutorInfo.getFilterKeys();
+  columnIndex = msrColumnEvaluatorInfo.getColumnIndex();
+  // blockIndex =
+  // 
segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex) + 
segmentProperties
+  // .getLastDimensionColOrdinal();
--- End diff --

remove commented code




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153674
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java
 ---
@@ -17,65 +17,174 @@
 package org.apache.carbondata.core.scan.filter.executer;
 
 import java.io.IOException;
+import java.math.BigDecimal;
+import java.nio.ByteBuffer;
 import java.util.BitSet;
+import java.util.Comparator;
 
 import org.apache.carbondata.core.datastore.block.SegmentProperties;
 import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
+import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
 import 
org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
+import 
org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
+import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.filter.FilterUtil;
+import 
org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
 import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
+import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
 import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
 import org.apache.carbondata.core.util.BitSetGroup;
 import org.apache.carbondata.core.util.ByteUtil;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeUtil;
 
 public class IncludeFilterExecuterImpl implements FilterExecuter {
 
   protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
   protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
+  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
+  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
   protected SegmentProperties segmentProperties;
+  protected boolean isDimensionPresentInCurrentBlock = false;
+  protected boolean isMeasurePresentInCurrentBlock = false;
--- End diff --

Remove these flags and use a null check of `dimColumnExecuterInfo` and `msrColumnExecutorInfo` instead.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153654
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java
 ---
@@ -17,65 +17,174 @@
 package org.apache.carbondata.core.scan.filter.executer;
 
 import java.io.IOException;
+import java.math.BigDecimal;
+import java.nio.ByteBuffer;
 import java.util.BitSet;
+import java.util.Comparator;
 
 import org.apache.carbondata.core.datastore.block.SegmentProperties;
 import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
+import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
 import 
org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
+import 
org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
+import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.filter.FilterUtil;
+import 
org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
 import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
+import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
 import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
 import org.apache.carbondata.core.util.BitSetGroup;
 import org.apache.carbondata.core.util.ByteUtil;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeUtil;
 
 public class IncludeFilterExecuterImpl implements FilterExecuter {
 
   protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
   protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
+  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
+  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
   protected SegmentProperties segmentProperties;
+  protected boolean isDimensionPresentInCurrentBlock = false;
+  protected boolean isMeasurePresentInCurrentBlock = false;
   /**
* is dimension column data is natural sorted
*/
-  private boolean isNaturalSorted;
+  private boolean isNaturalSorted = false;
--- End diff --

The default is already false, so there is no need to set it explicitly.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153644
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java
 ---
@@ -18,56 +18,152 @@
 
 import java.io.IOException;
 import java.util.BitSet;
+import java.util.Comparator;
 
 import org.apache.carbondata.core.datastore.block.SegmentProperties;
 import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
+import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
 import 
org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
+import 
org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
+import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.filter.FilterUtil;
+import 
org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
 import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
+import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
 import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
 import org.apache.carbondata.core.util.BitSetGroup;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeUtil;
 
 public class ExcludeFilterExecuterImpl implements FilterExecuter {
 
   protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
   protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
+  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
+  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
   protected SegmentProperties segmentProperties;
+  protected boolean isDimensionPresentInCurrentBlock = false;
+  protected boolean isMeasurePresentInCurrentBlock = false;
   /**
* is dimension column data is natural sorted
*/
-  private boolean isNaturalSorted;
+  private boolean isNaturalSorted = false;
+
   public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo 
dimColEvaluatorInfo,
-  SegmentProperties segmentProperties) {
-this.dimColEvaluatorInfo = dimColEvaluatorInfo;
-dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
+  MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, 
SegmentProperties segmentProperties,
+  boolean isMeasure) {
 this.segmentProperties = segmentProperties;
-
FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), 
segmentProperties,
-dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
-isNaturalSorted = 
dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
-.getDimension().isSortColumn();
+if (isMeasure == false) {
+  this.dimColEvaluatorInfo = dimColEvaluatorInfo;
+  dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
+
+  
FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), 
segmentProperties,
+  dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo, null, 
null);
+  isDimensionPresentInCurrentBlock = true;
+  isNaturalSorted =
+  dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && 
dimColEvaluatorInfo
+  .getDimension().isSortColumn();
+} else {
+  this.msrColumnEvaluatorInfo = msrColumnEvaluatorInfo;
+  msrColumnExecutorInfo = new MeasureColumnExecuterFilterInfo();
+  FilterUtil
+  
.prepareKeysFromSurrogates(msrColumnEvaluatorInfo.getFilterValues(), 
segmentProperties,
+  null, null, msrColumnEvaluatorInfo.getMeasure(), 
msrColumnExecutorInfo);
+  isMeasurePresentInCurrentBlock = true;
+}
+
   }
 
   @Override public BitSetGroup applyFilter(BlocksChunkHolder 
blockChunkHolder) throws IOException {
-int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping()
-.get(dimColEvaluatorInfo.getColumnIndex());
-if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
-  blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = 
blockChunkHolder.getDataBlock()
-  .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
+if (isDimensionPresentInCurrentBlock == true) {
+  int blockIndex = 
segmentProperties.getDimensionOrdinalToBlockMapping()
+  .get(dimColEvaluatorInfo.getColumnIndex());
+  if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) 
{
+blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = 
blockChunkHolder.getDataBlock()
+.getDimensionChunk(blockChunkHolder.getFileReader(), 
blockIndex);
   

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153317
  
--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
 }
   }
 
+  public static Object getMeasureObjectFromDataType(byte[] data, DataType 
dataType) {
+ByteBuffer bb = ByteBuffer.wrap(data);
+switch (dataType) {
+  case SHORT:
+  case INT:
+  case LONG:
+return bb.getLong();
+  case DECIMAL:
+return byteToBigDecimal(data);
+  default:
+return bb.getDouble();
+}
+  }
+
+  /**
+   * This method will convert a given ByteArray to its specific type
+   *
+   * @param msrValue
+   * @param dataType
+   * @param carbonMeasure
+   * @return
+   */
+  //  public static byte[] getMeasureByteArrayBasedOnDataType(String 
msrValue, DataType dataType,
+  //  CarbonMeasure carbonMeasure) {
+  //switch (dataType) {
+  //  case DECIMAL:
+  //BigDecimal bigDecimal =
+  //new 
BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
+  //   return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  //  case SHORT:
+  //return ByteUtil.toBytes((Short.parseShort(msrValue)));
+  //  case INT:
+  //return ByteUtil.toBytes(Integer.parseInt(msrValue));
+  //  case LONG:
+  //return ByteUtil.toBytes(Long.valueOf(msrValue));
+  //  default:
+  //Double parsedValue = Double.valueOf(msrValue);
+  //if (Double.isInfinite(parsedValue) || 
Double.isNaN(parsedValue)) {
+  //  return null;
+  //}
+  //return ByteUtil.toBytes(parsedValue);
+  //}
+  //  }
+  public static byte[] getMeasureByteArrayBasedOnDataTypes(String 
msrValue, DataType dataType,
+  CarbonMeasure carbonMeasure) {
+ByteBuffer b;
+switch (dataType) {
+  case BYTE:
+  case SHORT:
+  case INT:
+  case LONG:
+b = ByteBuffer.allocate(8);
+b.putLong(Long.valueOf(msrValue));
+b.flip();
+return b.array();
+  case DOUBLE:
+b = ByteBuffer.allocate(8);
+b.putDouble(Double.valueOf(msrValue));
+b.flip();
+return b.array();
+  case DECIMAL:
+BigDecimal bigDecimal =
+new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), 
RoundingMode.HALF_UP);
+return DataTypeUtil
+.bigDecimalToByte(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  default:
+throw new IllegalArgumentException("Invalid data type: " + 
dataType);
+}
+  }
+
+  /**
+   * This method will convert a given ByteArray to its specific type
+   *
+   * @param msrValue
+   * @param dataType
+   * @param carbonMeasure
+   * @return
+   */
+  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage 
measurePage, int index,
+  DataType dataType, CarbonMeasure carbonMeasure) {
+switch (dataType) {
+  case DECIMAL:
+BigDecimal bigDecimal = new 
BigDecimal(measurePage.getDouble(index))
+.setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
+return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  case SHORT:
+return ByteUtil.toBytes(measurePage.getShort(index));
+  case INT:
+return ByteUtil.toBytes(measurePage.getInt(index));
+  case LONG:
+return ByteUtil.toBytes(measurePage.getLong(index));
+  default:
+Double parsedValue = Double.valueOf(measurePage.getDouble(index));
+if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
+  return null;
+}
+return ByteUtil.toBytes(parsedValue);
+}
+  }
+
+  public static Object getMeasureObjectBasedOnDataType(ColumnPage 
measurePage, int index,
+  DataType dataType, CarbonMeasure carbonMeasure) {
+//switch (dataType) {
+//  case DECIMAL:
+//BigDecimal bigDecimal = new 
BigDecimal(measurePage.getDouble(index))
+//.setScale(carbonMeasure.getScale(), 
RoundingMode.HALF_UP);
+//return normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision());
+//  case SHORT:
+//  case INT:
+//  case LONG:
+//return measurePage.getLong(index);
+//  default:
+//Double 

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153305
  
--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
 }
   }
 
+  public static Object getMeasureObjectFromDataType(byte[] data, DataType 
dataType) {
+ByteBuffer bb = ByteBuffer.wrap(data);
+switch (dataType) {
+  case SHORT:
+  case INT:
+  case LONG:
+return bb.getLong();
+  case DECIMAL:
+return byteToBigDecimal(data);
+  default:
+return bb.getDouble();
+}
+  }
+
+  /**
+   * This method will convert a given ByteArray to its specific type
+   *
+   * @param msrValue
+   * @param dataType
+   * @param carbonMeasure
+   * @return
+   */
+  //  public static byte[] getMeasureByteArrayBasedOnDataType(String 
msrValue, DataType dataType,
+  //  CarbonMeasure carbonMeasure) {
+  //switch (dataType) {
+  //  case DECIMAL:
+  //BigDecimal bigDecimal =
+  //new 
BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
+  //   return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  //  case SHORT:
+  //return ByteUtil.toBytes((Short.parseShort(msrValue)));
+  //  case INT:
+  //return ByteUtil.toBytes(Integer.parseInt(msrValue));
+  //  case LONG:
+  //return ByteUtil.toBytes(Long.valueOf(msrValue));
+  //  default:
+  //Double parsedValue = Double.valueOf(msrValue);
+  //if (Double.isInfinite(parsedValue) || 
Double.isNaN(parsedValue)) {
+  //  return null;
+  //}
+  //return ByteUtil.toBytes(parsedValue);
+  //}
+  //  }
+  public static byte[] getMeasureByteArrayBasedOnDataTypes(String 
msrValue, DataType dataType,
+  CarbonMeasure carbonMeasure) {
+ByteBuffer b;
+switch (dataType) {
+  case BYTE:
+  case SHORT:
+  case INT:
+  case LONG:
+b = ByteBuffer.allocate(8);
+b.putLong(Long.valueOf(msrValue));
+b.flip();
+return b.array();
+  case DOUBLE:
+b = ByteBuffer.allocate(8);
+b.putDouble(Double.valueOf(msrValue));
+b.flip();
+return b.array();
+  case DECIMAL:
+BigDecimal bigDecimal =
+new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), 
RoundingMode.HALF_UP);
+return DataTypeUtil
+.bigDecimalToByte(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  default:
+throw new IllegalArgumentException("Invalid data type: " + 
dataType);
+}
+  }
+
+  /**
+   * This method will convert a given ByteArray to its specific type
+   *
+   * @param msrValue
+   * @param dataType
+   * @param carbonMeasure
+   * @return
+   */
+  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage 
measurePage, int index,
--- End diff --

This method is not used, please remove it




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153193
  
--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
 }
   }
 
+  public static Object getMeasureObjectFromDataType(byte[] data, DataType 
dataType) {
+ByteBuffer bb = ByteBuffer.wrap(data);
+switch (dataType) {
+  case SHORT:
+  case INT:
+  case LONG:
+return bb.getLong();
+  case DECIMAL:
+return byteToBigDecimal(data);
+  default:
+return bb.getDouble();
+}
+  }
+
+  /**
+   * This method will convert a given ByteArray to its specific type
+   *
+   * @param msrValue
+   * @param dataType
+   * @param carbonMeasure
+   * @return
+   */
+  //  public static byte[] getMeasureByteArrayBasedOnDataType(String 
msrValue, DataType dataType,
+  //  CarbonMeasure carbonMeasure) {
+  //switch (dataType) {
+  //  case DECIMAL:
+  //BigDecimal bigDecimal =
+  //new 
BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
+  //   return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, 
carbonMeasure.getPrecision()));
+  //  case SHORT:
+  //return ByteUtil.toBytes((Short.parseShort(msrValue)));
+  //  case INT:
+  //return ByteUtil.toBytes(Integer.parseInt(msrValue));
+  //  case LONG:
+  //return ByteUtil.toBytes(Long.valueOf(msrValue));
+  //  default:
+  //Double parsedValue = Double.valueOf(msrValue);
+  //if (Double.isInfinite(parsedValue) || 
Double.isNaN(parsedValue)) {
+  //  return null;
+  //}
+  //return ByteUtil.toBytes(parsedValue);
+  //}
+  //  }
--- End diff --

remove the commented code




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153179
  
--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
 }
   }
 
+  public static Object getMeasureObjectFromDataType(byte[] data, DataType 
dataType) {
+ByteBuffer bb = ByteBuffer.wrap(data);
--- End diff --

This object is unnecessary for `DECIMAL`, so create it only inside the case statements that actually need it.
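
A sketch of the suggested change to the method quoted above; the behaviour is unchanged, only the `ByteBuffer` allocation moves into the branches that actually use it:

    // Hedged sketch: no ByteBuffer is created on the DECIMAL path.
    public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
      switch (dataType) {
        case SHORT:
        case INT:
        case LONG:
          return ByteBuffer.wrap(data).getLong();
        case DECIMAL:
          return byteToBigDecimal(data);
        default:
          return ByteBuffer.wrap(data).getDouble();
      }
    }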




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125153056
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java
 ---
@@ -18,56 +18,152 @@
 
 import java.io.IOException;
 import java.util.BitSet;
+import java.util.Comparator;
 
 import org.apache.carbondata.core.datastore.block.SegmentProperties;
 import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
+import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
 import 
org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
+import 
org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
+import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.filter.FilterUtil;
+import 
org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
 import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
+import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
 import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
 import org.apache.carbondata.core.util.BitSetGroup;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeUtil;
 
 public class ExcludeFilterExecuterImpl implements FilterExecuter {
 
   protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
   protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
+  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
+  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
   protected SegmentProperties segmentProperties;
+  protected boolean isDimensionPresentInCurrentBlock = false;
+  protected boolean isMeasurePresentInCurrentBlock = false;
--- End diff --

I don't think all these flags are required; a `null` check of `dimColumnExecuterInfo` and `msrColumnExecutorInfo` is enough.
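
A minimal sketch of the suggestion; the two executer-info fields are only created on the path that applies, so the branching could look roughly like this (only the guard conditions are shown):

    // Hedged sketch: null checks on the executer infos replace the boolean flags.
    if (dimColumnExecuterInfo != null) {
      // dimension filter path (previously guarded by isDimensionPresentInCurrentBlock)
    } else if (msrColumnExecutorInfo != null) {
      // measure filter path (previously guarded by isMeasurePresentInCurrentBlock)
    }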




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152979
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java
 ---
@@ -18,56 +18,152 @@
 
 import java.io.IOException;
 import java.util.BitSet;
+import java.util.Comparator;
 
 import org.apache.carbondata.core.datastore.block.SegmentProperties;
 import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
+import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
 import 
org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
+import 
org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
+import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.filter.FilterUtil;
+import 
org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
 import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
+import 
org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
 import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
 import org.apache.carbondata.core.util.BitSetGroup;
 import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeUtil;
 
 public class ExcludeFilterExecuterImpl implements FilterExecuter {
 
   protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
   protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
+  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
+  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
   protected SegmentProperties segmentProperties;
+  protected boolean isDimensionPresentInCurrentBlock = false;
+  protected boolean isMeasurePresentInCurrentBlock = false;
   /**
* is dimension column data is natural sorted
*/
-  private boolean isNaturalSorted;
+  private boolean isNaturalSorted = false;
+
   public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo 
dimColEvaluatorInfo,
-  SegmentProperties segmentProperties) {
-this.dimColEvaluatorInfo = dimColEvaluatorInfo;
-dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
+  MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, 
SegmentProperties segmentProperties,
+  boolean isMeasure) {
 this.segmentProperties = segmentProperties;
-
FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), 
segmentProperties,
-dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
-isNaturalSorted = 
dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
-.getDimension().isSortColumn();
+if (isMeasure == false) {
--- End diff --

just use `!isMeasure`




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152956
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -1042,12 +1144,17 @@ public static FilterExecuter getFilterExecuterTree(
* @param dimension
* @param dimColumnExecuterInfo
*/
-  public static void prepareKeysFromSurrogates(DimColumnFilterInfo 
filterValues,
+  public static void prepareKeysFromSurrogates(ColumnFilterInfo 
filterValues,
   SegmentProperties segmentProperties, CarbonDimension dimension,
-  DimColumnExecuterFilterInfo dimColumnExecuterInfo) {
-byte[][] keysBasedOnFilter = getKeyArray(filterValues, dimension, 
segmentProperties);
-dimColumnExecuterInfo.setFilterKeys(keysBasedOnFilter);
-
+  DimColumnExecuterFilterInfo dimColumnExecuterInfo, CarbonMeasure 
measures,
+  MeasureColumnExecuterFilterInfo msrColumnExecuterInfo) {
+if (null != measures) {
--- End diff --

I don't think this `if` check is required; just pass both dimension and measure 
to the `getKeyArray` method.
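
A rough sketch of that shape, with `Object` stand-ins for the real CarbonData 
types; the point is only that the caller stops branching and `getKeyArray` picks 
the non-null column:

```java
// Sketch only: prepareKeysFromSurrogates forwards both references and lets
// getKeyArray decide whether it is building dimension or measure filter keys.
class PrepareKeysSketch {
  static byte[][] getKeyArray(Object filterValues, Object dimension, Object measure) {
    if (measure != null) {
      return new byte[][] { "measure-key".getBytes() };   // stand-in for measure keys
    }
    return new byte[][] { "dimension-key".getBytes() };    // stand-in for dimension keys
  }

  static byte[][] prepareKeysFromSurrogates(Object filterValues, Object dimension,
      Object measure) {
    // no null check needed in the caller any more
    return getKeyArray(filterValues, dimension, measure);
  }
}
```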




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152881
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -395,6 +440,58 @@ public static DimColumnFilterInfo 
getNoDictionaryValKeyMemberForFilter(
   }
 
   /**
+   * This method will get the no dictionary data based on filters and same
+   * will be in ColumnFilterInfo
+   *
+   * @param evaluateResultListFinal
+   * @param isIncludeFilter
+   * @return ColumnFilterInfo
+   */
+  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
+  List evaluateResultListFinal, boolean isIncludeFilter, 
DataType dataType,
+  CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
+List filterValuesList = new ArrayList(20);
+String result = null;
+try {
+  int length = evaluateResultListFinal.size();
+  for (int i = 0; i < length; i++) {
+result = evaluateResultListFinal.get(i);
+if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
+  filterValuesList.add(new byte[0]);
+  continue;
+}
+// TODO have to understand what method to be used for measures.
+// filterValuesList
+//  
.add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, 
dataType));
+
+filterValuesList
+.add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, 
dataType, carbonMeasure));
+
+  }
+} catch (Throwable ex) {
+  throw new FilterUnsupportedException("Unsupported Filter condition: 
" + result, ex);
+}
+
+Comparator filterMeasureComaparator = new Comparator() 
{
+
+  @Override public int compare(byte[] filterMember1, byte[] 
filterMember2) {
+// TODO Auto-generated method stub
+return ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterMember1, 
filterMember2);
--- End diff --

Please compare the actual values before converting them to binary.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152873
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -395,6 +440,58 @@ public static DimColumnFilterInfo 
getNoDictionaryValKeyMemberForFilter(
   }
 
   /**
+   * This method will get the no dictionary data based on filters and same
+   * will be in ColumnFilterInfo
+   *
+   * @param evaluateResultListFinal
+   * @param isIncludeFilter
+   * @return ColumnFilterInfo
+   */
+  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
+  List evaluateResultListFinal, boolean isIncludeFilter, 
DataType dataType,
+  CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
+List filterValuesList = new ArrayList(20);
+String result = null;
+try {
+  int length = evaluateResultListFinal.size();
+  for (int i = 0; i < length; i++) {
+result = evaluateResultListFinal.get(i);
+if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
+  filterValuesList.add(new byte[0]);
+  continue;
+}
+// TODO have to understand what method to be used for measures.
+// filterValuesList
+//  
.add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, 
dataType));
+
+filterValuesList
+.add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, 
dataType, carbonMeasure));
+
+  }
+} catch (Throwable ex) {
+  throw new FilterUnsupportedException("Unsupported Filter condition: 
" + result, ex);
+}
+
+Comparator filterMeasureComaparator = new Comparator() 
{
+
+  @Override public int compare(byte[] filterMember1, byte[] 
filterMember2) {
+// TODO Auto-generated method stub
+return ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterMember1, 
filterMember2);
--- End diff --

This is wrong; we cannot compare `double`, `float`, or `decimal` values through 
their byte representations.
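
A small self-contained sketch of the kind of comparator being asked for here: 
compare the decoded values per data type rather than their byte encodings (the 
enum below is a stand-in, not CarbonData's DataType):

```java
import java.math.BigDecimal;
import java.util.Comparator;

// Sketch only: double/float/decimal do not sort correctly as raw bytes, so
// pick a typed comparator for the measure's data type.
class MeasureFilterComparatorSketch {
  enum MeasureType { DOUBLE, LONG, DECIMAL }   // stand-in enum

  static Comparator<Object> comparatorFor(MeasureType type) {
    switch (type) {
      case DOUBLE:
        return (a, b) -> Double.compare((Double) a, (Double) b);
      case LONG:
        return (a, b) -> Long.compare((Long) a, (Long) b);
      case DECIMAL:
        return (a, b) -> ((BigDecimal) a).compareTo((BigDecimal) b);
      default:
        throw new IllegalArgumentException("unsupported measure type: " + type);
    }
  }
}
```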




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152824
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -395,6 +440,58 @@ public static DimColumnFilterInfo 
getNoDictionaryValKeyMemberForFilter(
   }
 
   /**
+   * This method will get the no dictionary data based on filters and same
+   * will be in ColumnFilterInfo
+   *
+   * @param evaluateResultListFinal
+   * @param isIncludeFilter
+   * @return ColumnFilterInfo
+   */
+  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
+  List evaluateResultListFinal, boolean isIncludeFilter, 
DataType dataType,
+  CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
+List filterValuesList = new ArrayList(20);
+String result = null;
+try {
+  int length = evaluateResultListFinal.size();
+  for (int i = 0; i < length; i++) {
+result = evaluateResultListFinal.get(i);
+if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
+  filterValuesList.add(new byte[0]);
+  continue;
+}
+// TODO have to understand what method to be used for measures.
+// filterValuesList
+//  
.add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, 
dataType));
--- End diff --

Is this commented-out code still required? Please remove it if not.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152709
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -209,9 +233,29 @@ private static FilterExecuter getIncludeFilterExecuter(
* @return
*/
   private static FilterExecuter getExcludeFilterExecuter(
-  DimColumnResolvedFilterInfo dimColResolvedFilterInfo, 
SegmentProperties segmentProperties) {
+  DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
+  MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
+  SegmentProperties segmentProperties) {
 
-if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
+if (null != msrColResolvedFilterInfo && 
msrColResolvedFilterInfo.getMeasure().isColumnar()) {
--- End diff --

Even here the `isColumnar` check is not required.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152663
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
* @return
*/
   private static FilterExecuter getIncludeFilterExecuter(
-  DimColumnResolvedFilterInfo dimColResolvedFilterInfo, 
SegmentProperties segmentProperties) {
-
-if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
+  DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
+  MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
+  SegmentProperties segmentProperties) {
+if (null != msrColResolvedFilterInfo && 
msrColResolvedFilterInfo.getMeasure().isColumnar()) {
+  CarbonMeasure measuresFromCurrentBlock = segmentProperties
+  
.getMeasureFromCurrentBlock(msrColResolvedFilterInfo.getMeasure().getColumnId());
+  if (null != measuresFromCurrentBlock) {
+// update dimension and column index according to the dimension 
position in current block
+MeasureColumnResolvedFilterInfo msrColResolvedFilterInfoCopyObject 
=
+msrColResolvedFilterInfo.getCopyObject();
+
msrColResolvedFilterInfoCopyObject.setMeasure(measuresFromCurrentBlock);
+
msrColResolvedFilterInfoCopyObject.setColumnIndex(measuresFromCurrentBlock.getOrdinal());
+
msrColResolvedFilterInfoCopyObject.setType(measuresFromCurrentBlock.getDataType());
+return new IncludeFilterExecuterImpl(null, 
msrColResolvedFilterInfoCopyObject,
+segmentProperties, true);
+  } else {
+return new 
RestructureIncludeFilterExecutorImpl(dimColResolvedFilterInfo,
+msrColResolvedFilterInfo, segmentProperties, true);
+  }
+}
--- End diff --

Shouldn't there be an `else` here for the `dimColResolvedFilterInfo` check?
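
Something like the following control flow seems to be what is being asked for 
(a sketch with simplified signatures and stub builders, not the real method):

```java
// Sketch only: make the dimension path an explicit `else if`, so exactly one
// of the measure / dimension include-filter executors is built.
class IncludeExecuterDispatchSketch {
  static Object getIncludeFilterExecuter(Object dimInfo, Object msrInfo, Object segmentProperties) {
    if (msrInfo != null) {
      // measure path (IncludeFilterExecuterImpl or the restructure variant)
      return buildMeasureIncludeExecuter(msrInfo, segmentProperties);
    } else if (dimInfo != null) {
      // dimension path, unchanged from the existing code
      return buildDimensionIncludeExecuter(dimInfo, segmentProperties);
    }
    throw new IllegalStateException("neither dimension nor measure filter info supplied");
  }

  private static Object buildMeasureIncludeExecuter(Object msrInfo, Object props) {
    return new Object();   // stand-in
  }

  private static Object buildDimensionIncludeExecuter(Object dimInfo, Object props) {
    return new Object();   // stand-in
  }
}
```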




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152630
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
* @return
*/
   private static FilterExecuter getIncludeFilterExecuter(
-  DimColumnResolvedFilterInfo dimColResolvedFilterInfo, 
SegmentProperties segmentProperties) {
-
-if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
+  DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
+  MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
+  SegmentProperties segmentProperties) {
+if (null != msrColResolvedFilterInfo && 
msrColResolvedFilterInfo.getMeasure().isColumnar()) {
--- End diff --

I don't think `msrColResolvedFilterInfo.getMeasure().isColumnar()` is really 
required; it is only meaningful for dimensions. Please remove it.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152433
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/expression/ColumnExpression.java
 ---
@@ -31,12 +32,16 @@
 
   private boolean isDimension;
 
+  private boolean isMeasure;
--- End diff --

The same information is available inside `measure`, so please remove this flag 
and derive it from `measure`.
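
A tiny sketch of the idea, with a stand-in field instead of the real CarbonMeasure 
reference:

```java
// Sketch only: isMeasure can be answered from the measure reference itself,
// so the extra boolean field is redundant.
class ColumnExpressionSketch {
  private Object measure;   // stand-in for the CarbonMeasure reference

  void setMeasure(Object measure) {
    this.measure = measure;
  }

  boolean isMeasure() {
    return measure != null;
  }
}
```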




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152368
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/metadata/schema/table/CarbonTable.java
 ---
@@ -137,6 +137,8 @@
*/
   private int numberOfNoDictSortColumns;
 
+  private int lastDimensionColumnOrdinal;
--- End diff --

Not used anywhere, please remove.




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152345
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java
 ---
@@ -102,9 +109,15 @@ public void update(Object value) {
 break;
   case DECIMAL:
 BigDecimal decimalValue = DataTypeUtil.byteToBigDecimal((byte[]) 
value);
-decimal = decimalValue.scale();
-BigDecimal val = (BigDecimal) min;
-nonExistValue = (val.subtract(new BigDecimal(1.0)));
+if (isFirst) {
--- End diff --

I don't think this `isFirst` is required; just check `min` or `max` for null here.
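
A sketch of the null-check variant, simplified to plain BigDecimal fields (the 
real class also tracks other data types and a non-exist value):

```java
import java.math.BigDecimal;

// Sketch only: initialise min/max lazily with a null check instead of an
// isFirst flag.
class DecimalStatsSketch {
  private BigDecimal min;
  private BigDecimal max;
  private int decimalScale;

  void update(BigDecimal value) {
    decimalScale = value.scale();
    if (min == null || value.compareTo(min) < 0) {
      min = value;
    }
    if (max == null || value.compareTo(max) > 0) {
      max = value;
    }
  }
}
```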




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

2017-06-30 Thread ravipesala
Github user ravipesala commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1079#discussion_r125152265
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java
 ---
@@ -56,9 +65,7 @@ public ColumnPageStatsVO(DataType dataType) {
 nonExistValue = Double.MIN_VALUE;
 break;
   case DECIMAL:
-max = new BigDecimal(Double.MIN_VALUE);
-min = new BigDecimal(Double.MAX_VALUE);
-nonExistValue = new BigDecimal(Double.MIN_VALUE);
+this.zeroDecimal = new BigDecimal(0);
--- End diff --

use `BigDecimal.ZERO`




[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/254/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2834/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2833/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Failed with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/253/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Failed with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/252/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2832/





[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

2017-06-30 Thread sounakr
Github user sounakr commented on the issue:

https://github.com/apache/carbondata/pull/1079
  
retest this please




[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/251/





[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2831/





[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Build Failed with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/250/





[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2830/





[GitHub] carbondata pull request #1123: [CARBONDATA-1254] Fixed describe formatted fo...

2017-06-30 Thread BJangir
GitHub user BJangir opened a pull request:

https://github.com/apache/carbondata/pull/1123

[CARBONDATA-1254] Fixed describe formatted for sort columns after alter

Analysis: Earlier, if a column was deleted, it was still being listed in 
describe formatted.

Solution: Filter deleted columns out of the sort columns list.
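
A sketch of the idea behind the fix; the `deleted` flag below is hypothetical and 
only stands in for however CarbonData marks a dropped column:

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch only: when building the SORT_COLUMNS line for describe formatted,
// keep only columns that have not been dropped by ALTER TABLE.
class SortColumnsDescribeSketch {
  static class Column {
    final String name;
    final boolean deleted;   // hypothetical "dropped by alter" marker

    Column(String name, boolean deleted) {
      this.name = name;
      this.deleted = deleted;
    }
  }

  static String sortColumnsLine(List<Column> sortColumns) {
    return sortColumns.stream()
        .filter(column -> !column.deleted)
        .map(column -> column.name)
        .collect(Collectors.joining(","));
  }
}
```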

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/BJangir/incubator-carbondata descformat

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/1123.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1123


commit 6c159dca830c71ff3f3157318747247eb05bd3db
Author: Ayush Mantri 
Date:   2017-06-29T09:48:20Z

Fixed described formatted for sort_columns after alter






[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread asfgit
Github user asfgit commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Can one of the admins verify this patch?




[GitHub] carbondata issue #1123: [CARBONDATA-1254] Fixed describe formatted for sort ...

2017-06-30 Thread asfgit
Github user asfgit commented on the issue:

https://github.com/apache/carbondata/pull/1123
  
Can one of the admins verify this patch?




[jira] [Assigned] (CARBONDATA-1254) Post Alter Describe Formatted is listing deleted column in Sort_Colums

2017-06-30 Thread ayushmantri (JIRA)

 [ 
https://issues.apache.org/jira/browse/CARBONDATA-1254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ayushmantri reassigned CARBONDATA-1254:
---

Assignee: ayushmantri

> Post Alter Describe Formatted is listing deleted column in Sort_Colums 
> ---
>
> Key: CARBONDATA-1254
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1254
> Project: CarbonData
>  Issue Type: Bug
>  Components: data-query
>Affects Versions: 1.1.0
> Environment: Single Node setup
>Reporter: ayushmantri
>Assignee: ayushmantri
>Priority: Minor
>
> 0: jdbc:hive2://172.168.100.196:22550/default> create table t3 (id string, country string, population string) stored by 'carbondata';
> +-+--+
> | Result  |
> +-+--+
> +-+--+
> No rows selected (0.474 seconds)
> 0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;
> | col_name                     | data_type                                         | comment                |
> | id                           | string                                            | DICTIONARY, KEY COLUMN |
> | country                      | string                                            | DICTIONARY, KEY COLUMN |
> | population                   | string                                            | DICTIONARY, KEY COLUMN |
> | ##Detailed Table Information |                                                   |                        |
> | Database Name:               | default                                           |                        |
> | Table Name:                  | t3                                                |                        |
> | CARBON Store Path:           | hdfs://hacluster/user/hive/warehouse/carbon.store |                        |
> | Table Block Size :           | 1024 MB                                           |                        |
> | ##Detailed Column property   |                                                   |                        |
> | ADAPTIVE                     |                                                   |                        |
> | SORT_COLUMNS                 | id,country,population                             |                        |
> | ##Column Group Information   |                                                   |                        |
> 0: 

[jira] [Commented] (CARBONDATA-1254) Post Alter Describe Formatted is listing deleted column in Sort_Colums

2017-06-30 Thread ayushmantri (JIRA)

[ 
https://issues.apache.org/jira/browse/CARBONDATA-1254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16070263#comment-16070263
 ] 

ayushmantri commented on CARBONDATA-1254:
-

Please assign this to me

> Post Alter Describe Formatted is listing deleted column in Sort_Colums 
> ---
>
> Key: CARBONDATA-1254
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1254
> Project: CarbonData
>  Issue Type: Bug
>  Components: data-query
>Affects Versions: 1.1.0
> Environment: Single Node setup
>Reporter: ayushmantri
>Priority: Minor
>
> 0: jdbc:hive2://172.168.100.196:22550/default> create table t3 (id string, country string, population string) stored by 'carbondata';
> +-+--+
> | Result  |
> +-+--+
> +-+--+
> No rows selected (0.474 seconds)
> 0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;
> | col_name                     | data_type                                         | comment                |
> | id                           | string                                            | DICTIONARY, KEY COLUMN |
> | country                      | string                                            | DICTIONARY, KEY COLUMN |
> | population                   | string                                            | DICTIONARY, KEY COLUMN |
> | ##Detailed Table Information |                                                   |                        |
> | Database Name:               | default                                           |                        |
> | Table Name:                  | t3                                                |                        |
> | CARBON Store Path:           | hdfs://hacluster/user/hive/warehouse/carbon.store |                        |
> | Table Block Size :           | 1024 MB                                           |                        |
> | ##Detailed Column property   |                                                   |                        |
> | ADAPTIVE                     |                                                   |                        |
> | SORT_COLUMNS                 | id,country,population                             |                        |
> | ##Column Group Information   |                                                   |                        |
> 0: 

[jira] [Updated] (CARBONDATA-1254) Post Alter Describe Formatted is listing deleted column in Sort_Colums

2017-06-30 Thread ayushmantri (JIRA)

 [ 
https://issues.apache.org/jira/browse/CARBONDATA-1254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ayushmantri updated CARBONDATA-1254:

Description: 
0: jdbc:hive2://172.168.100.196:22550/default> create table t3 (id string, country string, population string) stored by 'carbondata';
+-+--+
| Result  |
+-+--+
+-+--+
No rows selected (0.474 seconds)
0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;
| col_name                     | data_type                                         | comment                |
| id                           | string                                            | DICTIONARY, KEY COLUMN |
| country                      | string                                            | DICTIONARY, KEY COLUMN |
| population                   | string                                            | DICTIONARY, KEY COLUMN |
| ##Detailed Table Information |                                                   |                        |
| Database Name:               | default                                           |                        |
| Table Name:                  | t3                                                |                        |
| CARBON Store Path:           | hdfs://hacluster/user/hive/warehouse/carbon.store |                        |
| Table Block Size :           | 1024 MB                                           |                        |
| ##Detailed Column property   |                                                   |                        |
| ADAPTIVE                     |                                                   |                        |
| SORT_COLUMNS                 | id,country,population                             |                        |
| ##Column Group Information   |                                                   |                        |

0: jdbc:hive2://172.168.100.196:22550/default> alter table t3 drop columns (country);
+-+--+
| Result  |
+-+--+
+-+--+
No rows selected (0.652 seconds)
0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;
| col_name                     | data_type                                         | comment                |

[jira] [Created] (CARBONDATA-1254) Post Alter Describe Formatted is listing deleted column in Sort_Colums

2017-06-30 Thread ayushmantri (JIRA)
ayushmantri created CARBONDATA-1254:
---

 Summary: Post Alter Describe Formatted is listing deleted column 
in Sort_Colums 
 Key: CARBONDATA-1254
 URL: https://issues.apache.org/jira/browse/CARBONDATA-1254
 Project: CarbonData
  Issue Type: Bug
  Components: data-query
Affects Versions: 1.1.0
 Environment: Single Node setup
Reporter: ayushmantri
Priority: Minor


0: jdbc:hive2://172.168.100.196:22550/default> create table t3 (id string, country string, population string) stored by 'carbondata';
+-+--+
| Result  |
+-+--+
+-+--+
No rows selected (0.474 seconds)
0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;
| col_name                     | data_type                                         | comment                |
| id                           | string                                            | DICTIONARY, KEY COLUMN |
| country                      | string                                            | DICTIONARY, KEY COLUMN |
| population                   | string                                            | DICTIONARY, KEY COLUMN |
| ##Detailed Table Information |                                                   |                        |
| Database Name:               | default                                           |                        |
| Table Name:                  | t3                                                |                        |
| CARBON Store Path:           | hdfs://hacluster/user/hive/warehouse/carbon.store |                        |
| Table Block Size :           | 1024 MB                                           |                        |
| ##Detailed Column property   |                                                   |                        |
| ADAPTIVE                     |                                                   |                        |
| SORT_COLUMNS                 | id,country,population                             |                        |
| ##Column Group Information   |                                                   |                        |

0: jdbc:hive2://172.168.100.196:22550/default> alter table t3 drop columns (country);
+-+--+
| Result  |
+-+--+
+-+--+
No rows selected (0.652 seconds)
0: jdbc:hive2://172.168.100.196:22550/default> desc formatted t3;

[GitHub] carbondata issue #1109: [CARBONDATA-1241] Single_Pass either should be block...

2017-06-30 Thread rahulforallp
Github user rahulforallp commented on the issue:

https://github.com/apache/carbondata/pull/1109
  
@gvramana please review. I have refactored the code; now it will work for the 
SET property as well.




[GitHub] carbondata issue #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column page for...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1121
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/249/





[GitHub] carbondata issue #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column page for...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1121
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2829/





[jira] [Resolved] (CARBONDATA-1251) Add test cases for IUD feature

2017-06-30 Thread Jacky Li (JIRA)

 [ 
https://issues.apache.org/jira/browse/CARBONDATA-1251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jacky Li resolved CARBONDATA-1251.
--
   Resolution: Fixed
Fix Version/s: 1.2.0

> Add test cases for IUD feature
> --
>
> Key: CARBONDATA-1251
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1251
> Project: CarbonData
>  Issue Type: Bug
>Reporter: chenerlu
>Assignee: chenerlu
>Priority: Minor
> Fix For: 1.2.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>






[GitHub] carbondata pull request #1120: [CARBONDATA-1251] Add test cases for IUD feat...

2017-06-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/1120




[GitHub] carbondata issue #1120: [CARBONDATA-1251] Add test cases for IUD feature

2017-06-30 Thread jackylk
Github user jackylk commented on the issue:

https://github.com/apache/carbondata/pull/1120
  
LGTM




[GitHub] carbondata pull request #1122: [CARBONDATA-1253] Sort_columns should not sup...

2017-06-30 Thread jackylk
Github user jackylk commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1122#discussion_r125026813
  
--- Diff: 
integration/spark-common/src/main/scala/org/apache/spark/sql/catalyst/CarbonDDLSqlParser.scala
 ---
@@ -691,6 +691,14 @@ abstract class CarbonDDLSqlParser extends 
AbstractCarbonSparkSQLParser {
   }
 
   /**
+   * detects whether datatype is part of sort_column
+   */
+  def isDataTypeSupportedForSortColumn(columnDataType: String): Boolean = {
--- End diff --

should be private




[GitHub] carbondata issue #1122: [CARBONDATA-1253] Sort_columns should not support fl...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1122
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2828/





[GitHub] carbondata issue #1122: [CARBONDATA-1253] Sort_columns should not support fl...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1122
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/248/





[GitHub] carbondata pull request #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column p...

2017-06-30 Thread QiangCai
Github user QiangCai commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1121#discussion_r125023223
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/datastore/page/VarLengthColumnPageBase.java
 ---
@@ -106,7 +106,7 @@ static ColumnPage newDecimalColumnPage(byte[] 
lvEncodedBytes) throws MemoryExcep
 
 VarLengthColumnPageBase page;
 if (unsafe) {
-  page = new UnsafeVarLengthColumnPage(DECIMAL, numRows);
+  page = new UnsafeVarLengthColumnPage(DECIMAL, numRows, 
lvEncodedBytes.length);
--- End diff --

How about using an offset variable?




[GitHub] carbondata pull request #1122: [CARBONDATA-1253] Sort_columns should not sup...

2017-06-30 Thread QiangCai
GitHub user QiangCai opened a pull request:

https://github.com/apache/carbondata/pull/1122

[CARBONDATA-1253] Sort_columns should not support float,double,decimal

Sort_columns should not support float,double,decimal.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/QiangCai/carbondata 
sortcolumnnotsupportdatatype

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/1122.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1122


commit 10576375ef24629e1f6d36062ab809fac3cfd8d0
Author: QiangCai 
Date:   2017-06-30T11:51:19Z

sort_columns not support float,double,decimal






[GitHub] carbondata issue #943: [CARBONDATA-1086]Added documentation for BATCH SORT S...

2017-06-30 Thread vandana7
Github user vandana7 commented on the issue:

https://github.com/apache/carbondata/pull/943
  
1) I have confirmed that the "carbon.load.batch.sort.size.inmb" property is 
present in the CarbonCommonConstants class. I have also updated its description.
2) I have also added the "carbon.load.sort.scope" property to the documentation.
Please review.




[GitHub] carbondata issue #1114: [CARBONDATA-1248] change LazyColumnPage parent class

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1114
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2827/





[GitHub] carbondata issue #1114: [CARBONDATA-1248] change LazyColumnPage parent class

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1114
  
Build Success with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/247/





[GitHub] carbondata issue #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column page for...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1121
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2826/





[GitHub] carbondata pull request #1121: [CARBONDATA-1215][BUGFIX] Fix unsafe column p...

2017-06-30 Thread jackylk
GitHub user jackylk opened a pull request:

https://github.com/apache/carbondata/pull/1121

[CARBONDATA-1215][BUGFIX] Fix unsafe column page for decimal query

Decimal loading uses a variable length column page; in the unsafe column 
page implementation there is a bug where the page does not check its capacity.

In this PR, a capacity check is added, and `UnsafeVarLengthColumnPage` now 
grows its size as needed.

A testcase is added to verify this bug fix.
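
The general ensure-capacity pattern the description refers to, shown as a 
self-contained sketch over a plain byte array (the real page writes to unsafe 
off-heap memory, so this is illustrative only):

```java
import java.util.Arrays;

// Sketch only: before writing bytes at an offset, grow the backing storage
// when it is too small, doubling until the write fits.
class GrowableBytePageSketch {
  private byte[] data = new byte[16];

  void putBytes(int offset, byte[] bytes) {
    ensureCapacity(offset + bytes.length);
    System.arraycopy(bytes, 0, data, offset, bytes.length);
  }

  private void ensureCapacity(int required) {
    if (required <= data.length) {
      return;
    }
    int newCapacity = data.length;
    while (newCapacity < required) {
      newCapacity *= 2;
    }
    data = Arrays.copyOf(data, newCapacity);
  }
}
```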


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jackylk/incubator-carbondata unsafe

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/1121.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1121


commit 222376fba93da6ee5e51ecd8d116b7bf4f9d44f3
Author: jackylk 
Date:   2017-06-30T10:27:08Z

fix unsafe column page bug






[GitHub] carbondata issue #1116: [CARBONDATA-1249] Wrong order of columns in redirect...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1116
  
Build Failed with Spark 1.6, Please check CI 
http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/244/





[GitHub] carbondata issue #1116: [CARBONDATA-1249] Wrong order of columns in redirect...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1116
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2824/





[jira] [Commented] (CARBONDATA-1252) Add BAD_RECORD_PATH option in Load options section in the Carbon Help doc

2017-06-30 Thread Gururaj Shetty (JIRA)

[ 
https://issues.apache.org/jira/browse/CARBONDATA-1252?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16069613#comment-16069613
 ] 

Gururaj Shetty commented on CARBONDATA-1252:


We will add the following info in the doc for the Load section:

BAD_RECORD_PATH
-

Specifies the HDFS path where bad records are stored. By default the value is 
Null. This path must be configured by the user if the bad record logger is 
enabled or the bad record action is set to redirect.

> Add BAD_RECORD_PATH option in Load options section in the Carbon Help doc
> -
>
> Key: CARBONDATA-1252
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1252
> Project: CarbonData
>  Issue Type: Sub-task
>Reporter: Mohammad Shahid Khan
>Assignee: Gururaj Shetty
>Priority: Minor
> Fix For: 1.2.0
>
>






[jira] [Updated] (CARBONDATA-1252) Add BAD_RECORD_PATH option in Load options section in the Carbon Help doc

2017-06-30 Thread Mohammad Shahid Khan (JIRA)

 [ 
https://issues.apache.org/jira/browse/CARBONDATA-1252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mohammad Shahid Khan updated CARBONDATA-1252:
-
Summary: Add BAD_RECORD_PATH option in Load options section in the Carbon 
Help doc  (was: Add BAD_RECORD_PATH options in Load options in the Carbon Help 
doc)

> Add BAD_RECORD_PATH option in Load options section in the Carbon Help doc
> -
>
> Key: CARBONDATA-1252
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1252
> Project: CarbonData
>  Issue Type: Sub-task
>Reporter: Mohammad Shahid Khan
>Priority: Minor
> Fix For: 1.2.0
>
>






[jira] [Created] (CARBONDATA-1252) Add BAD_RECORD_PATH options in Load options in the Carbon Help doc

2017-06-30 Thread Mohammad Shahid Khan (JIRA)
Mohammad Shahid Khan created CARBONDATA-1252:


 Summary: Add BAD_RECORD_PATH options in Load options in the Carbon 
Help doc
 Key: CARBONDATA-1252
 URL: https://issues.apache.org/jira/browse/CARBONDATA-1252
 Project: CarbonData
  Issue Type: Sub-task
Reporter: Mohammad Shahid Khan
Priority: Minor








[jira] [Commented] (CARBONDATA-1117) Update SET & RESET command details in online help documentation

2017-06-30 Thread Mohammad Shahid Khan (JIRA)

[ 
https://issues.apache.org/jira/browse/CARBONDATA-1117?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16069604#comment-16069604
 ] 

Mohammad Shahid Khan commented on CARBONDATA-1117:
--

Please refer to PR https://github.com/apache/carbondata/pull/972 for details.
For information, the supported parameters have been updated.
The set -v command displays the full list of session parameters along with 
their default values and usage docs.
Example: set -v;
Supported parameters for dynamic set.
carbon.options.bad.records.logger.enable
carbon.options.bad.records.action
carbon.options.is.empty.data.bad.record
carbon.options.sort.scope
carbon.options.batch.sort.size.inmb
carbon.options.single.pass
carbon.options.bad.record.path
carbon.options.global.sort.partitions
enable.unsafe.sort
carbon.custom.block.distribution

> Update SET & RESET command details in online help documentation
> ---
>
> Key: CARBONDATA-1117
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1117
> Project: CarbonData
>  Issue Type: Sub-task
>  Components: spark-integration
>Reporter: Manohar Vanam
>Assignee: Gururaj Shetty
>
> Update SET & RESET command details in online help documentation
> 1. update syntax
> 2. Examples with use cases





[GitHub] carbondata issue #1082: [CARBONDATA-1218] In case of data-load failure the B...

2017-06-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1082
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2823/


