[GitHub] carbondata issue #2546: [CARBONDATA-2775] Adaptive encoding fails for Unsafe...

2018-07-24 Thread kumarvishal09
Github user kumarvishal09 commented on the issue:

https://github.com/apache/carbondata/pull/2546
  
LGTM


---


[GitHub] carbondata issue #2441: [CARBONDATA-2625] optimize CarbonReader performance

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2441
  
SDV Build Fail , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5989/



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6236/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7481/



---


[jira] [Resolved] (CARBONDATA-2740) flat folder structure is not handled for implicit column and segment file is not getting deleted after load is failed

2018-07-24 Thread Kunal Kapoor (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-2740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kunal Kapoor resolved CARBONDATA-2740.
--
   Resolution: Fixed
Fix Version/s: 1.4.1

> flat folder structure is not handled for implicit column and segment file is 
> not getting deleted after load is failed
> -
>
> Key: CARBONDATA-2740
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2740
> Project: CarbonData
>  Issue Type: Bug
>Reporter: Akash R Nilugal
>Assignee: Akash R Nilugal
>Priority: Minor
> Fix For: 1.4.1
>
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> flat folder structure is not handled for implicit column and segment file is 
> not getting deleted after load is failed



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata pull request #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2535#discussion_r204983701
  
--- Diff: 
integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/CarbonLoadDataCommand.scala
 ---
@@ -77,6 +77,7 @@ import 
org.apache.carbondata.spark.dictionary.provider.SecureDictionaryServicePr
 import org.apache.carbondata.spark.dictionary.server.SecureDictionaryServer
 import org.apache.carbondata.spark.load.{CsvRDDHelper, 
DataLoadProcessorStepOnSpark}
 import org.apache.carbondata.spark.rdd.CarbonDataRDDFactory
+import org.apache.carbondata.spark.rdd.CarbonDataRDDFactory.LOGGER
--- End diff --

no need for this import. CarbonLoadDataCommand already has a LOGGER
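
For context, a minimal sketch of the pattern referred to here (the class name is 
hypothetical; only LogServiceFactory is the real CarbonData logging API): a command 
declares its own class-local LOGGER, so importing another class's LOGGER is unnecessary.

```
import org.apache.carbondata.common.logging.LogServiceFactory

// Hypothetical command class, mirroring the usual CarbonData logging pattern.
class SomeDataCommand {
  // class-local logger, analogous to the LOGGER already present in CarbonLoadDataCommand
  private val LOGGER = LogServiceFactory.getLogService(classOf[SomeDataCommand].getName)

  def run(): Unit = LOGGER.info("running command with its own LOGGER")
}
```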


---


[GitHub] carbondata pull request #2550: [CARBONDATA-2779]Fixed filter query issue in ...

2018-07-24 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2550#discussion_r204981714
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java
 ---
@@ -239,36 +249,34 @@ private void readAndFillBlockletInfo(Map filePathToFileF
   // fill the info only for given blockletId in detailInfo
   BlockletInfo blockletInfo = 
blockletList.get(blockletDetailInfo.getBlockletId());
   fillBlockletInfoToTableBlock(tableBlockInfos, blockInfo, 
blockletDetailInfo, fileFooter,
-  blockletInfo, blockletDetailInfo.getBlockletId());
+  blockletInfo, blockletDetailInfo.getBlockletId(), 
segmentProperties);
 } else {
   short count = 0;
   for (BlockletInfo blockletInfo : blockletList) {
 fillBlockletInfoToTableBlock(tableBlockInfos, blockInfo, 
blockletDetailInfo, fileFooter,
-blockletInfo, count);
+blockletInfo, count, segmentProperties);
 count++;
   }
 }
   }
 
  private void fillBlockletInfoToTableBlock(List<TableBlockInfo> 
tableBlockInfos,
   TableBlockInfo blockInfo, BlockletDetailInfo blockletDetailInfo, 
DataFileFooter fileFooter,
-  BlockletInfo blockletInfo, short blockletId) {
+  BlockletInfo blockletInfo, short blockletId, SegmentProperties 
segmentProperties) {
 TableBlockInfo info = blockInfo.copy();
 BlockletDetailInfo detailInfo = info.getDetailInfo();
 // set column schema details
 detailInfo.setColumnSchemas(fileFooter.getColumnInTable());
 detailInfo.setRowCount(blockletInfo.getNumberOfRows());
-byte[][] maxValues = 
blockletInfo.getBlockletIndex().getMinMaxIndex().getMaxValues();
-byte[][] minValues = 
blockletInfo.getBlockletIndex().getMinMaxIndex().getMinValues();
+byte[][] minValues = 
BlockletDataMapUtil.updateMinValues(segmentProperties,
+blockletInfo.getBlockletIndex().getMinMaxIndex().getMinValues());
+byte[][] maxValues = 
BlockletDataMapUtil.updateMaxValues(segmentProperties,
+blockletInfo.getBlockletIndex().getMinMaxIndex().getMaxValues());
--- End diff --

Move this code inside isLegacyStore check


---


[GitHub] carbondata issue #2548: [CARBONDATA-2778]Fixed bug when select after delete ...

2018-07-24 Thread manishgupta88
Github user manishgupta88 commented on the issue:

https://github.com/apache/carbondata/pull/2548
  
retest sdv please


---


[GitHub] carbondata pull request #2514: [CARBONDATA-2740]segment file is not getting ...

2018-07-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2514


---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7479/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread kunal642
Github user kunal642 commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
LGTM


---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6233/



---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread dhatchayani
Github user dhatchayani commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
retest this please


---


[GitHub] carbondata issue #2551: [HOTFIX] Fix a spelling mistake after PR2511 merged.

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2551
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6232/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread dhatchayani
Github user dhatchayani commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
retest this please


---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7476/



---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7477/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6228/



---


[GitHub] carbondata issue #2548: [CARBONDATA-2778]Fixed bug when select after delete ...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2548
  
SDV Build Fail , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5988/



---


[GitHub] carbondata issue #2537: [CARBONDATA-2768][CarbonStore] Fix error in tests fo...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2537
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7478/



---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread brijoobopanna
Github user brijoobopanna commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
retest this please


---


[GitHub] carbondata issue #2551: [HOTFIX] Fix a spelling mistake after PR2511 merged.

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2551
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7475/



---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread ndwangsen
Github user ndwangsen commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204973033
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -122,6 +122,45 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
--- End diff --

By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns. - Is the data loading 
performance ok?


---


[GitHub] carbondata issue #2546: [CARBONDATA-2775] Adaptive encoding fails for Unsafe...

2018-07-24 Thread ajantha-bhat
Github user ajantha-bhat commented on the issue:

https://github.com/apache/carbondata/pull/2546
  
@ravipesala : please review


---


[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...

2018-07-24 Thread brijoobopanna
Github user brijoobopanna commented on the issue:

https://github.com/apache/carbondata/pull/2533
  
retest sdv please



---


[GitHub] carbondata issue #2547: [CARBONDATA-2777] Fixed: NonTransactional tables, Se...

2018-07-24 Thread ajantha-bhat
Github user ajantha-bhat commented on the issue:

https://github.com/apache/carbondata/pull/2547
  
@ravipesala : please review


---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6231/



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread Indhumathi27
Github user Indhumathi27 commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Retest this please


---


[GitHub] carbondata issue #2550: [CARBONDATA-2779]Fixed filter query issue in case of...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2550
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7473/



---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204970343
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -124,6 +124,41 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
+   
+   Users will be able to pass following properties in create table 
command: 
+   
+   | Properties | Default value | Description |
+   | -- | - | --- |
+   | LOCAL_DICTIONARY_ENABLE | true | By default, local dictionary 
will be enabled for the table | 
+   | LOCAL_DICTIONARY_THRESHOLD | 1 | The maximum cardinality for 
local dictionary generation (range- 1000 to 10) |
--- End diff --

add more description for it, such as `If the cardinality exceeds the 
threshold, this column will not use local dictionary encoding. And in this 
case, the data loading performance will decrease since there is a rollback 
procedure for local dictionary encoding.`


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204970695
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -124,6 +124,41 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
+   
+   Users will be able to pass following properties in create table 
command: 
+   
+   | Properties | Default value | Description |
+   | -- | - | --- |
+   | LOCAL_DICTIONARY_ENABLE | true | By default, local dictionary 
will be enabled for the table | 
+   | LOCAL_DICTIONARY_THRESHOLD | 1 | The maximum cardinality for 
local dictionary generation (range- 1000 to 10) |
--- End diff --

What is the scope of `cardinality` here? Page/Blocklet/Block/Segment/Table 
level?
Previously, for global cardinality, it was table level.


---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7474/



---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204969011
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -122,6 +122,45 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
--- End diff --

+1


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204969813
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -124,6 +124,41 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
+   
+   Users will be able to pass following properties in create table 
command: 
+   
+   | Properties | Default value | Description |
+   | -- | - | --- |
+   | LOCAL_DICTIONARY_ENABLE | true | By default, local dictionary 
will be enabled for the table | 
+   | LOCAL_DICTIONARY_THRESHOLD | 1 | The maximum cardinality for 
local dictionary generation (range- 1000 to 10) |
+   | LOCAL_DICTIONARY_INCLUDE | all no-dictionary string/varchar 
columns | Columns for which Local Dictionary is generated. |
+   | LOCAL_DICTIONARY_EXCLUDE | none | Columns for which Local 
Dictionary is not generated |
+
--- End diff --

What about the limitations? For example, can local dictionary columns work with:
1. sort_columns?
2. dictionary include?
3. complex columns?
4. etc.


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204969382
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -333,6 +373,20 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  ```
  ALTER TABLE test_db.carbon CHANGE a1 a1 DECIMAL(18,2)
  ```
+   - **SET and UNSET for Local Dictionary Properties**
+   
+  When set command is used, all the newly set properties will override 
the corresponding old properties if exists.
+ 
+  Example to SET Local Dictionary Properties:
+   ```
+  ALTER TABLE tablename SET 
TBLPROPERTIES('LOCAL_DICTIONARY_ENABLE'='false',’LOCAL_DICTIONARY_THRESHOLD'='1000','LOCAL_DICTIONARY_INCLUDE'='column1','LOCAL_DICTIONARY_EXCLUDE'='column2')
--- End diff --

`’` before `LOCAL_DICTIONARY_THRESHOLD` is wrong


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204969612
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -333,6 +373,20 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  ```
  ALTER TABLE test_db.carbon CHANGE a1 a1 DECIMAL(18,2)
  ```
+   - **SET and UNSET for Local Dictionary Properties**
+   
+  When set command is used, all the newly set properties will override 
the corresponding old properties if exists.
+ 
+  Example to SET Local Dictionary Properties:
+   ```
+  ALTER TABLE tablename SET 
TBLPROPERTIES('LOCAL_DICTIONARY_ENABLE'='false',’LOCAL_DICTIONARY_THRESHOLD'='1000','LOCAL_DICTIONARY_INCLUDE'='column1','LOCAL_DICTIONARY_EXCLUDE'='column2')
+   ```
+  When Local Dictionary Properties are unset, default of Local 
Dictionary Enable will be changed to true, default of Local Dictionary 
Threshold will be changed to 1, and columns for Local Dictionary Include by 
default will be all no-dictionary String/Varchar datatype columns.
--- End diff --

no need to repeat the default scenario; better to change it to
`When Local Dictionary properties are unset, the corresponding default values 
will be used for those properties.`
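
As a concrete illustration (a sketch only: the table name is a placeholder and 
`spark` is assumed to be a Carbon-enabled SparkSession), unsetting the properties 
could look like:

```
// Sketch: after UNSET, the documented defaults apply again for these properties.
spark.sql(
  """
    |ALTER TABLE tablename UNSET TBLPROPERTIES(
    |  'LOCAL_DICTIONARY_ENABLE',
    |  'LOCAL_DICTIONARY_THRESHOLD',
    |  'LOCAL_DICTIONARY_INCLUDE',
    |  'LOCAL_DICTIONARY_EXCLUDE')
  """.stripMargin)
```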


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204969187
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -124,6 +124,41 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
+   
+   Users will be able to pass following properties in create table 
command: 
+   
+   | Properties | Default value | Description |
+   | -- | - | --- |
+   | LOCAL_DICTIONARY_ENABLE | true | By default, local dictionary 
will be enabled for the table | 
+   | LOCAL_DICTIONARY_THRESHOLD | 1 | The maximum cardinality for 
local dictionary generation (range- 1000 to 10) |
+   | LOCAL_DICTIONARY_INCLUDE | all no-dictionary string/varchar 
columns | Columns for which Local Dictionary is generated. |
+   | LOCAL_DICTIONARY_EXCLUDE | none | Columns for which Local 
Dictionary is not generated |
+
+
+### Example:
+
+  ```
+  CREATE TABLE carbontable(
+
+  column1 string,
+
+  column2 string,
+
+  column3 LONG )
+
+STORED BY 'carbondata'
+
TBLPROPERTIES('LOCAL_DICTIONARY_ENABLE'='true',’LOCAL_DICTIONARY_THRESHOLD'='1000',
--- End diff --

`’` before `LOCAL_DICTIONARY_THRESHOLD` is wrong.
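
For clarity, the quoted example with the smart quote replaced by a plain ASCII 
quote (only the properties visible in the truncated diff are repeated here; 
`spark` is assumed to be a Carbon-enabled SparkSession):

```
spark.sql(
  """
    |CREATE TABLE carbontable(
    |  column1 string,
    |  column2 string,
    |  column3 LONG)
    |STORED BY 'carbondata'
    |TBLPROPERTIES(
    |  'LOCAL_DICTIONARY_ENABLE'='true',
    |  'LOCAL_DICTIONARY_THRESHOLD'='1000')
  """.stripMargin)
```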


---


[GitHub] carbondata pull request #2520: [CARBONDATA-2750] Added Documentation for Loc...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2520#discussion_r204968918
  
--- Diff: docs/data-management-on-carbondata.md ---
@@ -124,6 +124,41 @@ This tutorial is going to introduce all commands and 
data operations on CarbonDa
  TBLPROPERTIES ('streaming'='true')
  ```
 
+  - **Local Dictionary Configuration**
+  
+  Local Dictionary is generated only for no-dictionary string/varchar 
datatype columns. It helps in:
+  1. Getting more compression on dimension columns with less cardinality.
+  2. Filter queries and full scan queries on No-dictionary columns with 
local dictionary will be faster as filter will be done on encoded data.
+  3. Reducing the store size and memory footprint as only unique values 
will be stored as part of local dictionary and corresponding data will be 
stored as encoded data.
+
+   By default, Local Dictionary will be enabled and generated for all 
no-dictionary string/varchar datatype columns.
--- End diff --

The indent of ‘   By default’ is wrong


---


[GitHub] carbondata issue #2528: [CARBONDATA-2767][CarbonStore] Fix task locality iss...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2528
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6229/



---


[GitHub] carbondata pull request #2528: [CARBONDATA-2767][CarbonStore] Fix task local...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2528#discussion_r204968370
  
--- Diff: 
integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala
 ---
@@ -739,9 +741,16 @@ class CarbonScanRDD[T: ClassTag](
* Get the preferred locations where to launch this task.
*/
   override def getPreferredLocations(split: Partition): Seq[String] = {
-val theSplit = split.asInstanceOf[CarbonSparkPartition]
-val firstOptionLocation = theSplit.split.value.getLocations.filter(_ 
!= "localhost")
-firstOptionLocation
+if (isTaskLocality) {
+  split.asInstanceOf[CarbonSparkPartition]
+.split
+.value
+.getLocations
+.filter(_ != "localhost")
--- End diff --

What will happen if I configure TaskLocality and run the job on a local 
machine or in local pseudo-distributed mode?
Besides, if you really want to exclude the local machine and not just 
'localhost', why is the host name of the local machine not considered?
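
To make the question concrete, a standalone sketch (not the PR's actual code; the 
helper name is made up) of filtering that excludes both "localhost" and the local 
machine's own host name when task locality is enabled:

```
import java.net.InetAddress

// Hypothetical helper: drop "localhost" and the local machine's own host name
// when task locality is enabled; report no preferred hosts otherwise.
def preferredHosts(locations: Seq[String], taskLocalityEnabled: Boolean): Seq[String] = {
  if (taskLocalityEnabled) {
    val localHostName = InetAddress.getLocalHost.getHostName
    locations.filter(host => host != "localhost" && host != localHostName)
  } else {
    Seq.empty
  }
}

// e.g. preferredHosts(Seq("datanode1", "localhost"), taskLocalityEnabled = true)
// returns Seq("datanode1")
```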


---


[GitHub] carbondata issue #2537: [CARBONDATA-2768][CarbonStore] Fix error in tests fo...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on the issue:

https://github.com/apache/carbondata/pull/2537
  
retest this please


---


[GitHub] carbondata pull request #2544: [CARBONDATA-2776][CarbonStore] Support ingest...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2544#discussion_r204966885
  
--- Diff: pom.xml ---
@@ -110,7 +110,7 @@
   
 UTF-8
 1.1.2.6
-2.7.2
+2.8.3
--- End diff --

can we directly upgrade the hadoop version to 2.8.3 without any other 
changes?


---


[GitHub] carbondata pull request #2544: [CARBONDATA-2776][CarbonStore] Support ingest...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2544#discussion_r204967361
  
--- Diff: 
store/sql/src/main/java/org/apache/carbondata/dis/DisProducer.java ---
@@ -0,0 +1,151 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.dis;
+
+import java.nio.ByteBuffer;
+import java.nio.charset.Charset;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Date;
+import java.util.List;
+import java.util.Random;
+import java.util.Timer;
+import java.util.TimerTask;
+import java.util.concurrent.ThreadLocalRandom;
+import java.util.concurrent.atomic.AtomicLong;
+
+import org.apache.carbondata.common.logging.LogService;
+import org.apache.carbondata.common.logging.LogServiceFactory;
+
+import com.huaweicloud.dis.DIS;
+import com.huaweicloud.dis.DISClientBuilder;
+import com.huaweicloud.dis.exception.DISClientException;
+import com.huaweicloud.dis.http.exception.ResourceAccessException;
+import com.huaweicloud.dis.iface.data.request.PutRecordsRequest;
+import com.huaweicloud.dis.iface.data.request.PutRecordsRequestEntry;
+import com.huaweicloud.dis.iface.data.response.PutRecordsResult;
+import com.huaweicloud.dis.iface.data.response.PutRecordsResultEntry;
+
+public class DisProducer {
+
+  private static AtomicLong eventId = new AtomicLong(0);
+
+  private static final LogService LOGGER =
+  LogServiceFactory.getLogService(DisProducer.class.getName());
+
+  public static void main(String[] args) {
+if (args.length < 6) {
+  System.err.println(
+  "Usage: DisProducer  
 ");
+  return;
+}
+
+DIS dic = 
DISClientBuilder.standard().withEndpoint(args[1]).withAk(args[3]).withSk(args[4])
+.withProjectId(args[5]).withRegion(args[2]).build();
+
+Sensor sensor = new Sensor(dic, args[0]);
+Timer timer = new Timer();
+timer.schedule(sensor, 0, 5000);
+
+  }
+
+  static class Sensor extends TimerTask {
+private DIS dic;
+
+private String streamName;
+
+private SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd 
HH:mm:ss");
+
+private Random random = new Random();
+
+private int i = 0;
+private int flag = 1;
+
+Sensor(DIS dic, String streamName) {
+  this.dic = dic;
+  this.streamName = streamName;
+}
+
+@Override public void run() {
+  uploadData();
+  //recordSensor();
+}
+
+private void uploadData() {
+  PutRecordsRequest putRecordsRequest = new PutRecordsRequest();
+  putRecordsRequest.setStreamName(streamName);
+  List<PutRecordsRequestEntry> putRecordsRequestEntryList = new 
ArrayList<>();
+  PutRecordsRequestEntry putRecordsRequestEntry = new 
PutRecordsRequestEntry();
+  putRecordsRequestEntry.setData(ByteBuffer.wrap(recordSensor()));
+  putRecordsRequestEntry
+  
.setPartitionKey(String.valueOf(ThreadLocalRandom.current().nextInt(100)));
+  putRecordsRequestEntryList.add(putRecordsRequestEntry);
+  putRecordsRequest.setRecords(putRecordsRequestEntryList);
+
+  LOGGER.info("== BEGIN PUT ");
+
+  PutRecordsResult putRecordsResult = null;
+  try {
+putRecordsResult = dic.putRecords(putRecordsRequest);
+  } catch (DISClientException e) {
+LOGGER.error(e,
+"Failed to get a normal response, please check params and 
retry." + e.getMessage());
+  } catch (ResourceAccessException e) {
+LOGGER.error(e, "Failed to access endpoint. " + e.getMessage());
+  } catch (Exception e) {
+LOGGER.error(e, e.getMessage());
+  }
+
+  if (putRecordsResult != null) {
+LOGGER.info("Put " + putRecordsResult.getRecords().size() + " 
records[" + (
+


---


[GitHub] carbondata pull request #2544: [CARBONDATA-2776][CarbonStore] Support ingest...

2018-07-24 Thread xuchuanyin
Github user xuchuanyin commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2544#discussion_r204966779
  
--- Diff: store/sql/pom.xml ---
@@ -35,6 +36,33 @@
 
   
 
+
+  org.apache.hadoop
+  hadoop-aws
+  ${hadoop.version}
+  
+
+  com.fasterxml.jackson.core
+  *
+
+  
+
+
+  com.amazonaws
+  aws-java-sdk-s3
+  1.10.6
--- End diff --

trim the useless spaces


---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread akashrn5
Github user akashrn5 commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
retest this please 


---


[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2477
  
SDV Build Fail , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5987/



---


[jira] [Commented] (CARBONDATA-2773) Load one file for multiple times in one load command cause wrong query result

2018-07-24 Thread xuchuanyin (JIRA)


[ 
https://issues.apache.org/jira/browse/CARBONDATA-2773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16555036#comment-16555036
 ] 

xuchuanyin commented on CARBONDATA-2773:


In Spark, SparkSQL does not support loading multiple files separated by commas.

However, SparkSQL supports reading/loading multiple files through the DataFrame 
API, and it works as expected:

val paths = Array(path1, path1, path1)
// show tripled content
sparkSession.read.csv(paths: _*).show()
// write tripled content to parquet
sparkSession.read.csv(paths: _*).write.parquet(parquetDir)
// show tripled content read back from parquet
sparkSession.read.parquet(parquetDir).show()

> Load one file for multiple times in one load command cause wrong query result
> -
>
> Key: CARBONDATA-2773
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2773
> Project: CarbonData
>  Issue Type: Bug
>Reporter: xuchuanyin
>Assignee: wangsen
>Priority: Major
>
> CarbonData now supports loading multiple files in one load command. The file 
> paths can be comma separated.
> But when I try to load one file multiple times in one load command, the 
> query result is wrong.
> The load command looks like below:
> ```
> LOAD DATA LOCAL INPATH 'file1,file1,file1' INTO TABLE test_table;
> ```
> The expected result should be triple the file content, but actually 
> the result is the file content, not tripled.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2550: [CARBONDATA-2779]Fixed filter query issue in case of...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2550
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6230/



---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread dhatchayani
Github user dhatchayani commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
Retest this please


---


[GitHub] carbondata issue #2551: [HOTFIX] Fix a spelling mistake after PR2511 merged.

2018-07-24 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2551
  
@kunal642 @jackylk please review, thanks.


---


[GitHub] carbondata pull request #2551: [HOTFIX] Fix a spelling mistake after PR2511 ...

2018-07-24 Thread zzcclp
GitHub user zzcclp opened a pull request:

https://github.com/apache/carbondata/pull/2551

[HOTFIX] Fix a spelling mistake after PR2511 merged.

spelling mistakes: AtomicFileOperationsF
use: AtomicFileOperationFactory.getAtomicFileOperations

Be sure to do all of the following checklist to help us incorporate 
your contribution quickly and easily:

 - [ ] Any interfaces changed?
 
 - [ ] Any backward compatibility impacted?
 
 - [ ] Document update required?

 - [ ] Testing done
Please provide details on 
- Whether new unit test cases have been added or why no new tests 
are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance 
test report.
- Any additional information to help reviewers in testing this 
change.
   
 - [ ] For large changes, please consider breaking it into sub-tasks under 
an umbrella JIRA. 



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/zzcclp/carbondata 
hotfix_AtomicFileOperationFactory

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/2551.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2551


commit f25b03c13196ac792a3fe7bdbcd1c359dbd9eef0
Author: Zhang Zhichao <441586683@...>
Date:   2018-07-25T02:19:50Z

[HOTFIX] Fix an error after PR2511 merged.

spelling mistakes: AtomicFileOperationsF
use: AtomicFileOperationFactory.getAtomicFileOperations




---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6226/



---


[GitHub] carbondata issue #2528: [CARBONDATA-2767][CarbonStore] Fix task locality iss...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2528
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7472/



---


[GitHub] carbondata issue #2542: [CARBONDATA-2772] Size based dictionary fallback is ...

2018-07-24 Thread kumarvishal09
Github user kumarvishal09 commented on the issue:

https://github.com/apache/carbondata/pull/2542
  
retest sdv please 


---


[GitHub] carbondata issue #2550: [CARBONDATA-2779]Fixed filter query issue in case of...

2018-07-24 Thread kumarvishal09
Github user kumarvishal09 commented on the issue:

https://github.com/apache/carbondata/pull/2550
  
retest this please 


---


[jira] [Closed] (CARBONDATA-2780) Load one file for multiple times in one load command cause wrong query result

2018-07-24 Thread xuchuanyin (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-2780?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

xuchuanyin closed CARBONDATA-2780.
--
Resolution: Duplicate

> Load one file for multiple times in one load command cause wrong query result
> -
>
> Key: CARBONDATA-2780
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2780
> Project: CarbonData
>  Issue Type: Bug
>Reporter: xuchuanyin
>Priority: Major
>
> SparkSQL now supports loading multiple files in one load command. The file 
> paths can be comma separated.
> But when I try to load one file multiple times in one load command, the 
> query result is wrong.
> The load command looks like below:
> ```
> LOAD DATA LOCAL INPATH 'file1,file1,file1' INTO TABLE test_table;
> ```
> The expected result should be triple the file content, but actually 
> the result is exactly the file content, not tripled.
> I'm wondering if this is intended or a bug.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (CARBONDATA-2780) Load one file for multiple times in one load command cause wrong query result

2018-07-24 Thread xuchuanyin (JIRA)
xuchuanyin created CARBONDATA-2780:
--

 Summary: Load one file for multiple times in one load command 
cause wrong query result
 Key: CARBONDATA-2780
 URL: https://issues.apache.org/jira/browse/CARBONDATA-2780
 Project: CarbonData
  Issue Type: Bug
Reporter: xuchuanyin


SparkSQL now supports loading multiple files in one load command. The file paths 
can be comma separated.

But when I try to load one file multiple times in one load command, the 
query result is wrong.

The load command looks like below:

```

LOAD DATA LOCAL INPATH 'file1,file1,file1' INTO TABLE test_table;

```

The expected result should be triple the file content, but actually the 
result is exactly the file content, not tripled.

I'm wondering if this is intended or a bug.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread Indhumathi27
Github user Indhumathi27 commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Retest this please


---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6227/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6225/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5986/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6224/



---


[GitHub] carbondata issue #2542: [CARBONDATA-2772] Size based dictionary fallback is ...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2542
  
SDV Build Fail , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5985/



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6223/



---


[GitHub] carbondata issue #2540: [CARBONDATA-2649] Handled executor min/max pruning w...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2540
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6222/



---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6220/



---


[GitHub] carbondata issue #2547: [CARBONDATA-2777] Fixed: NonTransactional tables, Se...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2547
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5984/



---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7468/



---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6221/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7471/



---


[GitHub] carbondata issue #2441: [CARBONDATA-2625] optimize CarbonReader performance

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2441
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6218/



---


[GitHub] carbondata issue #2550: [CARBONDATA-2779]Fixed filter query issue in case of...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2550
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6219/



---


[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2477
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6217/



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7470/



---


[GitHub] carbondata issue #2540: [CARBONDATA-2649] Handled executor min/max pruning w...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2540
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5983/



---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6216/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7469/



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7466/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread dhatchayani
Github user dhatchayani commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Retest this please


---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread Indhumathi27
Github user Indhumathi27 commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Retest this please



---


[GitHub] carbondata issue #2535: [CARBONDATA-2606]Fix Complex array Pushdown

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2535
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6214/



---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7465/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6215/



---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread akashrn5
Github user akashrn5 commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
retest this please 


---


[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2514
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7467/



---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7460/



---


[GitHub] carbondata issue #2549: [CARBONDATA-2606][Complex DataType Enhancements]Fix ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2549
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7464/



---


[GitHub] carbondata issue #2550: [CARBONDATA-2779]Fixed filter query issue in case of...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2550
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7463/



---


[GitHub] carbondata issue #2540: [CARBONDATA-2649] Handled executor min/max pruning w...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2540
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6213/



---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6212/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5981/



---


[GitHub] carbondata issue #2441: [CARBONDATA-2625] optimize CarbonReader performance

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2441
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7462/



---


[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2477
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7461/



---


[GitHub] carbondata issue #2547: [CARBONDATA-2777] Fixed: NonTransactional tables, Se...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2547
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6210/



---


[GitHub] carbondata issue #2542: [CARBONDATA-2772] Size based dictionary fallback is ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2542
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6211/



---


[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2484
  
Build Failed  with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7456/



---


[GitHub] carbondata issue #2530: [CARBONDATA-2753] Fix Compatibility issues

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2530
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7459/



---


[GitHub] carbondata issue #2541: [CARBONDATA-2771]block update and delete on table if...

2018-07-24 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2541
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/5980/



---


[GitHub] carbondata issue #2548: [CARBONDATA-2778]Fixed bug when select after delete ...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2548
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6209/



---


[GitHub] carbondata issue #2540: [CARBONDATA-2649] Handled executor min/max pruning w...

2018-07-24 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2540
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7458/



---

