[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/209/ ---
[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8280/ ---
[GitHub] carbondata pull request #2687: [CARBONDATA-2876]Fix Avro decimal datatype wi...
Github user manishgupta88 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2687#discussion_r214790148
--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/AvroCarbonWriter.java ---
@@ -407,6 +413,19 @@ private Object avroFieldToObjectForUnionType(Schema avroField, Object fieldValue
         out = null;
       }
       break;
+    case BYTES:
+      // DECIMAL type is defined in Avro as a BYTES type with the logicalType property
+      // set to "decimal" and a specified precision and scale
+      // As binary type is not supported yet, value will be null
+      if (logicalType instanceof LogicalTypes.Decimal) {
+        BigDecimal decimalValue = new BigDecimal(new String(((ByteBuffer) fieldValue).array(),
+            CarbonCommonConstants.DEFAULT_CHARSET_CLASS));
+        out = (decimalValue.round(
+            new MathContext(((LogicalTypes.Decimal) avroField.getLogicalType()).getPrecision())))
+            .setScale(((LogicalTypes.Decimal) avroField.getLogicalType()).getScale(),
+                RoundingMode.HALF_UP);
--- End diff --
Replace the BigDecimal conversion with the code below at all places in the code:
`BigDecimal bigDecimal = new BigDecimal(new String(((ByteBuffer) fieldValue).array(), CarbonCommonConstants.DEFAULT_CHARSET_CLASS)).setScale(dimension.getColumnSchema().getScale(), RoundingMode.HALF_UP);`
---
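The suggested conversion can be tried as a standalone snippet. This is a minimal sketch, not CarbonData's code: `DEFAULT_CHARSET` stands in for `CarbonCommonConstants.DEFAULT_CHARSET_CLASS` (assumed to be UTF-8), and the target scale is passed in directly instead of being read from the column schema.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class AvroDecimalSketch {
  // Stand-in for CarbonCommonConstants.DEFAULT_CHARSET_CLASS (assumed UTF-8)
  static final Charset DEFAULT_CHARSET = StandardCharsets.UTF_8;

  // Decode the byte payload of the Avro decimal field (stored as a decimal
  // string, as in the diff above) and fix its scale, rounding half-up.
  static BigDecimal toDecimal(ByteBuffer fieldValue, int scale) {
    String raw = new String(fieldValue.array(), DEFAULT_CHARSET);
    return new BigDecimal(raw).setScale(scale, RoundingMode.HALF_UP);
  }

  public static void main(String[] args) {
    ByteBuffer buf = ByteBuffer.wrap("12.3456".getBytes(DEFAULT_CHARSET));
    System.out.println(toDecimal(buf, 2)); // prints 12.35
  }
}
```

Note that `setScale` with `RoundingMode.HALF_UP` never throws on precision loss, which is why the reviewer prefers it over rounding through `MathContext` first.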
[GitHub] carbondata issue #2661: [CARBONDATA-2888] Support multi level subfolder for ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2661 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8278/ ---
[GitHub] carbondata issue #2661: [CARBONDATA-2888] Support multi level subfolder for ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2661 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/207/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/206/ ---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/208/ ---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8279/ ---
[GitHub] carbondata issue #2688: [CARBONDATA-2911] Remove unused BTree related code
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2688 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/205/ ---
[GitHub] carbondata issue #2688: [CARBONDATA-2911] Remove unused BTree related code
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2688 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8276/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8277/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user xuchuanyin commented on the issue: https://github.com/apache/carbondata/pull/2628 retest this please ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/204/ ---
[GitHub] carbondata pull request #2688: [CARBONDATA-2911] Remove unused BTree related...
GitHub user kevinjmh opened a pull request: https://github.com/apache/carbondata/pull/2688 [CARBONDATA-2911] Remove unused BTree related code
1. BTree related code is only used by a test class called `BTreeBlockFinderTest`.
2. BTreeDataRefNodeFinder in AbstractDetailQueryResultIterator never runs; all dataRefNode instances are actually instances of BlockletDataRefNode.
Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done. Please provide details on: whether new unit test cases have been added or why no new tests are required; how it is tested (please attach the test report); whether it is a performance related change (please attach the performance test report); any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/kevinjmh/carbondata remove_btree_related
Alternatively you can review and apply these changes as the patch at: https://github.com/apache/carbondata/pull/2688.patch
To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2688
commit 47f18af7277acaec8de4e166d6a268b2fa8b2e7e Author: Manhua Date: 2018-09-04T02:20:07Z
remove btree related code
---
[jira] [Created] (CARBONDATA-2911) Remove unused BTree related code
jiangmanhua created CARBONDATA-2911: --- Summary: Remove unused BTree related code Key: CARBONDATA-2911 URL: https://issues.apache.org/jira/browse/CARBONDATA-2911 Project: CarbonData Issue Type: Improvement Reporter: jiangmanhua Assignee: jiangmanhua -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8275/ ---
[GitHub] carbondata pull request #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zs...
GitHub user xuchuanyin reopened a pull request: https://github.com/apache/carbondata/pull/2628 [CARBONDATA-2851][CARBONDATA-2852] Support zstd as column compressor in final store
1. Add a zstd compressor for compressing column data.
2. Add zstd support in thrift.
3. Since zstd does not support zero-copy while compressing, offheap will not take effect for zstd.
4. The column compressor is configured through a system property and can be changed for each load. Before loading, CarbonData gets the compressor and uses it during that load. During querying, CarbonData reads the compressor information from the metadata in the data files.
5. Also support compressing streaming tables using zstd. The compressor info is stored in the FileHeader of the streaming file.
6. This PR was also considered and verified on the legacy store and with compaction.

A simple test with 1.2 GB of raw CSV data shows the size (in MB) of the final store with different compressors:

| local dictionary | snappy | zstd | Size Reduced |
| --- | --- | --- | --- |
| enabled | 335 | 207 | 38.2% |
| disabled | 375 | 225 | 40% |

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
- [x] Any interfaces changed? `Yes, only internally used interfaces are changed`
- [x] Any backward compatibility impacted? `Yes, backward compatibility is handled`
- [x] Document update required? `Yes`
- [x] Testing done. New unit tests: `Added tests` How it is tested: `Tested on a local machine` Performance related change: `The size of the final store has been decreased by 40% compared with default snappy` Additional information: `NA`
- [x] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. `NA`
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/xuchuanyin/carbondata 0810_support_zstd_compressor_final_store
Alternatively you can review and apply these changes as the patch at: https://github.com/apache/carbondata/pull/2628.patch
To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2628
commit c171ee13136785110f6fff8104afebc4b2f222c7 Author: xuchuanyin Date: 2018-08-10T14:02:57Z
Support zstd as column compressor in final store: 1. add zstd compressor for compressing column data 2. add zstd support in thrift 3. legacy store is not considered in this commit 4. since zstd does not support zero-copy while compressing, offheap will not take effect for zstd 5. support lazy load for compressor
commit 6448e6f21da66172775b625730b922fdfa57822d Author: xuchuanyin Date: 2018-08-13T13:45:42Z
Support new compressor on legacy store: in the query procedure we need to decompress the column page. Previously we got the compressor from a system property; now that we support new compressors, we read the compressor information from the metadata in the data files. This commit also solves the compatibility problems on the V1/V2 store, where only snappy is supported.
commit 2815c84f1d5fd99ff37ba6890d98fb2b73a95b00 Author: xuchuanyin Date: 2018-08-14T08:38:00Z
fix comments
commit ac95c25fca1c37f10f9cce0db76062207d0d3cee Author: xuchuanyin Date: 2018-08-23T09:35:23Z
Determine the column compressor before data loading: we get the column compressor before data loading/compaction starts, so that all pages use the same compressor even if the configured compressor is modified concurrently during loading.
commit a672d3baad1c476308c0aec5133e418afeaeacb2 Author: xuchuanyin Date: 2018-08-27T11:18:30Z
set compressor in carbon load model: the column compressor is necessary for the carbon load model, otherwise the load will fail.
commit d05c1cc38e1fa42ef94f70577ee2a715f649ebe3 Author: xuchuanyin Date: 2018-08-30T04:02:33Z
fix error in test
commit fb8cdfb1258b477f7d8c867ee74bd59386725d9c Author: xuchuanyin Date: 2018-09-03T03:58:02Z
fix review comments: optimize parameters for column page, use columnPageEncodeMeta instead of its members
---
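Points 4 and 5 of the description record the compressor in the file metadata so readers no longer depend on the writer's system property. Below is a minimal sketch of that pattern with hypothetical names: a one-byte compressor id is prepended to each block, and `java.util.zip`'s DEFLATE stands in for zstd (real zstd needs a third-party binding, and the actual PR stores the compressor name in the thrift FileHeader rather than a one-byte id).

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class CompressorHeaderSketch {
  // Hypothetical compressor id written into the block header
  static final byte DEFLATE_ID = 0;

  // Compress a column page and prepend the compressor id, so a reader can
  // decode the page without consulting the current system property.
  static byte[] writeBlock(byte[] page) {
    byte[] compressed = deflate(page);
    byte[] block = new byte[compressed.length + 1];
    block[0] = DEFLATE_ID;
    System.arraycopy(compressed, 0, block, 1, compressed.length);
    return block;
  }

  // Dispatch on the stored header byte, not on any writer-side configuration.
  static byte[] readBlock(byte[] block) {
    byte[] payload = Arrays.copyOfRange(block, 1, block.length);
    if (block[0] == DEFLATE_ID) {
      return inflate(payload);
    }
    throw new IllegalArgumentException("unknown compressor id: " + block[0]);
  }

  static byte[] deflate(byte[] data) {
    Deflater deflater = new Deflater();
    deflater.setInput(data);
    deflater.finish();
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[512];
    while (!deflater.finished()) {
      out.write(buf, 0, deflater.deflate(buf));
    }
    deflater.end();
    return out.toByteArray();
  }

  static byte[] inflate(byte[] data) {
    Inflater inflater = new Inflater();
    inflater.setInput(data);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[512];
    try {
      while (!inflater.finished()) {
        out.write(buf, 0, inflater.inflate(buf));
      }
    } catch (DataFormatException e) {
      throw new IllegalStateException("corrupt block", e);
    }
    inflater.end();
    return out.toByteArray();
  }

  public static void main(String[] args) {
    byte[] page = "zstd would go here; DEFLATE stands in".getBytes();
    System.out.println(Arrays.equals(readBlock(writeBlock(page)), page)); // true
  }
}
```

Dispatching on the stored id is what lets each load pick a different compressor, as the PR describes.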
[GitHub] carbondata pull request #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zs...
Github user xuchuanyin closed the pull request at: https://github.com/apache/carbondata/pull/2628 ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user xuchuanyin commented on the issue: https://github.com/apache/carbondata/pull/2628 retest this please ---
[GitHub] carbondata issue #2661: [CARBONDATA-2888] Support multi level subfolder for ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2661 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/202/ ---
[GitHub] carbondata issue #2661: [CARBONDATA-2888] Support multi level subfolder for ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2661 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8273/ ---
[GitHub] carbondata issue #2678: [CARBONDATA-2909] Multi user support for SDK on S3
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2678 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/198/ ---
[GitHub] carbondata issue #2678: [CARBONDATA-2909] Multi user support for SDK on S3
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2678 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8269/ ---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/201/ ---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8272/ ---
[GitHub] carbondata pull request #2661: [CARBONDATA-2888] Support multi level subfold...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2661#discussion_r214738731
--- Diff: integration/spark-datasource/src/main/scala/org/apache/spark/sql/carbondata/execution/datasources/CarbonFileIndexReplaceRule.scala ---
@@ -82,4 +82,23 @@ class CarbonFileIndexReplaceRule extends Rule[LogicalPlan] {
     fileIndex
   }
 }
+
+  /**
+   * Get datafolders recursively
+   */
+  private def getDataFolders(carbonFile: CarbonFile): Seq[CarbonFile] = {
+    val files = carbonFile.listFiles()
+    var folders: Seq[CarbonFile] = Seq()
+    files.foreach { f =>
+      if (f.isDirectory) {
+        val files = f.listFiles()
+        if (files.nonEmpty && !files(0).isDirectory) {
+          folders = Seq(f) ++ folders
--- End diff --
ok
---
[GitHub] carbondata pull request #2661: [CARBONDATA-2888] Support multi level subfold...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2661#discussion_r214738056
--- Diff: integration/spark-datasource/src/main/scala/org/apache/spark/sql/carbondata/execution/datasources/CarbonFileIndexReplaceRule.scala ---
@@ -82,4 +82,23 @@ class CarbonFileIndexReplaceRule extends Rule[LogicalPlan] {
     fileIndex
   }
 }
+
+  /**
+   * Get datafolders recursively
+   */
+  private def getDataFolders(carbonFile: CarbonFile): Seq[CarbonFile] = {
--- End diff --
ok
---
[GitHub] carbondata pull request #2661: [CARBONDATA-2888] Support multi level subfold...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2661#discussion_r214737987
--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/schema/table/CarbonTable.java ---
@@ -268,6 +257,18 @@ public boolean accept(CarbonFile file) {
     return CarbonTable.buildFromTableInfo(tableInfoInfer);
   }
+
+  private static CarbonFile getFirstIndexFile(CarbonFile tablePath) {
+    CarbonFile[] carbonFiles = tablePath.listFiles();
+    for (CarbonFile carbonFile : carbonFiles) {
+      if (carbonFile.isDirectory()) {
+        return getFirstIndexFile(carbonFile);
+      } else if (carbonFile.getName().endsWith(CarbonTablePath.INDEX_FILE_EXT)) {
+        return carbonFile;
+      }
+    }
+    return null;
--- End diff --
It is handled in the caller method
---
[GitHub] carbondata pull request #2661: [CARBONDATA-2888] Support multi level subfold...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2661#discussion_r214737914
--- Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/SegmentIndexFileStore.java ---
@@ -338,6 +338,24 @@ private MergedBlockIndex readMergeBlockIndex(ThriftReader thriftReader) throws I
     });
   }
+
+  /**
+   * List all the index files of the segment.
+   *
+   * @param carbonFile directory
+   */
+  public static void getCarbonIndexFilesRecursively(CarbonFile carbonFile,
--- End diff --
It's a recursive call, and it adds files to the passed list if it finds any.
---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/199/ ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2663 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/195/ ---
[GitHub] carbondata issue #2645: [CARBONDATA-2866] Block schema in external table
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2645 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/196/ ---
[GitHub] carbondata issue #2685: [CARBONDATA-2910] Support backward compatibility in ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2685 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8270/ ---
[jira] [Created] (CARBONDATA-2910) Support backward compatibility in fileformat and support different sort columns per load
Ravindra Pesala created CARBONDATA-2910: --- Summary: Support backward compatibility in fileformat and support different sort columns per load Key: CARBONDATA-2910 URL: https://issues.apache.org/jira/browse/CARBONDATA-2910 Project: CarbonData Issue Type: Bug Reporter: Ravindra Pesala Currently, data loaded by an old version with all dictionary-exclude columns cannot be read by the carbon fileformat. Also, loading through the SDK does not work when different sort columns are given per load. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Resolved] (CARBONDATA-2907) Support setting blocklet size in table property
[ https://issues.apache.org/jira/browse/CARBONDATA-2907?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala resolved CARBONDATA-2907. - Resolution: Fixed Fix Version/s: 1.5.0 > Support setting blocklet size in table property > --- > > Key: CARBONDATA-2907 > URL: https://issues.apache.org/jira/browse/CARBONDATA-2907 > Project: CarbonData > Issue Type: Improvement >Reporter: Jacky Li >Assignee: Jacky Li >Priority: Major > Fix For: 1.5.0 > > Time Spent: 2h 10m > Remaining Estimate: 0h > > When creating a table, setting the blocklet size in MB should be supported. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata pull request #2682: [CARBONDATA-2907] Support setting blocklet si...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2682 ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2663 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8266/ ---
[GitHub] carbondata issue #2682: [CARBONDATA-2907] Support setting blocklet size in t...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2682 LGTM ---
[GitHub] carbondata pull request #2681: [CARBONDATA-2906] Output segment size in SHOW...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2681 ---
[jira] [Resolved] (CARBONDATA-2906) Show segment data size in SHOW SEGMENT command
[ https://issues.apache.org/jira/browse/CARBONDATA-2906?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala resolved CARBONDATA-2906. - Resolution: Fixed Fix Version/s: 1.5.0 > Show segment data size in SHOW SEGMENT command > -- > > Key: CARBONDATA-2906 > URL: https://issues.apache.org/jira/browse/CARBONDATA-2906 > Project: CarbonData > Issue Type: Improvement >Reporter: Jacky Li >Priority: Major > Fix For: 1.5.0 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > In the SHOW SEGMENT command, output the segment data size and index size so that > users can check the size of each segment more easily. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Resolved] (CARBONDATA-2900) Add dynamic configuration support for some system properties
[ https://issues.apache.org/jira/browse/CARBONDATA-2900?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala resolved CARBONDATA-2900. - Resolution: Fixed Fix Version/s: 1.5.0 > Add dynamic configuration support for some system properties > > > Key: CARBONDATA-2900 > URL: https://issues.apache.org/jira/browse/CARBONDATA-2900 > Project: CarbonData > Issue Type: Improvement >Reporter: Jacky Li >Priority: Major > Fix For: 1.5.0 > > Time Spent: 1.5h > Remaining Estimate: 0h > > The following system level properties are added for dynamic configuration. Users > can use the SET command to set them. > carbon.number.of.cores.while.loading > carbon.number.of.cores.while.compacting > carbon.blockletgroup.size.in.mb > carbon.major.compaction.size > carbon.enable.vector.reader > enable.unsafe.in.query.processing -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata pull request #2674: [CARBONDATA-2900] Add dynamic configuration f...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2674 ---
[GitHub] carbondata issue #2645: [CARBONDATA-2866] Block schema in external table
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2645 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8267/ ---
[GitHub] carbondata issue #2674: [CARBONDATA-2900] Add dynamic configuration for syst...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2674 LGTM ---
[GitHub] carbondata issue #2465: [CARBONDATA-2863] Refactored CarbonFile interface
Github user kunal642 commented on the issue: https://github.com/apache/carbondata/pull/2465 @jackylk Yes, I am working on this. Just need to fix the build. Will do it soon. ---
[GitHub] carbondata issue #2681: [CARBONDATA-2906] Output segment size in SHOW SEGMEN...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2681 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/191/ ---
[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/193/ ---
[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8264/ ---
[GitHub] carbondata issue #2682: [CARBONDATA-2907] Support setting blocklet size in t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2682 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/192/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/194/ ---
[GitHub] carbondata issue #2682: [CARBONDATA-2907] Support setting blocklet size in t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2682 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8263/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2628 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8265/ ---
[GitHub] carbondata issue #2681: [CARBONDATA-2906] Output segment size in SHOW SEGMEN...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2681 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8262/ ---
[GitHub] carbondata issue #2681: [CARBONDATA-2906] Output segment size in SHOW SEGMEN...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2681 This PR prints the whole size of the segment folder; even if there are updates/deletes, the output size includes them. ---
[GitHub] carbondata issue #2645: [CARBONDATA-2866] Block schema in external table
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2645 retest this please ---
[GitHub] carbondata issue #2678: [CARBONDATA-2909] Multi user support for SDK on S3
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2678 Please add description for this PR. @kunal642 ---
[jira] [Resolved] (CARBONDATA-2902) Fix showing negative pruning result for explain command
[ https://issues.apache.org/jira/browse/CARBONDATA-2902?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jacky Li resolved CARBONDATA-2902. -- Resolution: Fixed Fix Version/s: 1.5.0
> Fix showing negative pruning result for explain command
> ---
> Key: CARBONDATA-2902
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2902
> Project: CarbonData
> Issue Type: Bug
> Reporter: jiangmanhua
> Assignee: jiangmanhua
> Priority: Major
> Fix For: 1.5.0
> Time Spent: 2h 20m
> Remaining Estimate: 0h
>
> When CACHE_LEVEL is BLOCK or the carbon data file is a legacy store, the pruned
> results of the default datamap are assigned blocklet_id = -1, which means all
> blocklets in the block need to be scanned. But the explain command takes the
> pruned result size as the number of blocklets hit.
> If only 2 of the 3 blocklets in one block are hit, the negative pruning result
> `1 (block) - 2 (blocklets) = -1` occurs.
-- This message was sent by Atlassian JIRA (v7.6.3#76005)
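The arithmetic in the description can be made concrete with a small sketch (hypothetical names, not CarbonData's actual classes): the -1 sentinel must be expanded to the block's blocklet count instead of being counted as a single pruned entry.

```java
public class PruneCountSketch {
  // Count blocklets hit from a pruned result. blockletIds holds one entry per
  // pruned unit; -1 (the sentinel used at block level) means every blocklet in
  // that block is scanned, so it must expand to blockletsPerBlock instead of
  // counting as one entry (the buggy behavior that produced 1 - 2 = -1).
  static int blockletsHit(int[] blockletIds, int blockletsPerBlock) {
    int hit = 0;
    for (int id : blockletIds) {
      hit += (id == -1) ? blockletsPerBlock : 1;
    }
    return hit;
  }

  public static void main(String[] args) {
    // block-level pruning: one entry with the -1 sentinel, 3 blocklets per block
    System.out.println(blockletsHit(new int[]{-1}, 3));   // prints 3, not 1
    // blocklet-level pruning: blocklets 0 and 2 hit
    System.out.println(blockletsHit(new int[]{0, 2}, 3)); // prints 2
  }
}
```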
[GitHub] carbondata pull request #2676: [CARBONDATA-2902][DataMap] Fix showing negati...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2676 ---
[jira] [Resolved] (CARBONDATA-2901) Problem: JVM crash in load scenario when unsafe memory allocation fails.
[ https://issues.apache.org/jira/browse/CARBONDATA-2901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jacky Li resolved CARBONDATA-2901. -- Resolution: Fixed Fix Version/s: 1.5.0
> Problem: JVM crash in load scenario when unsafe memory allocation fails
> ---
> Key: CARBONDATA-2901
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2901
> Project: CarbonData
> Issue Type: Bug
> Reporter: Ajantha Bhat
> Assignee: Ajantha Bhat
> Priority: Critical
> Fix For: 1.5.0
> Time Spent: 3h 10m
> Remaining Estimate: 0h
>
> Problem: JVM crash in the load scenario when unsafe memory allocation fails.
> Scenario:
> a) Use many cores while loading, suggested more than 10 [carbon.number.of.cores.while.loading].
> b) Load huge data (more than 5 GB) with local sort, keeping the default unsafe memory limit of 512 MB.
> c) When the task fails due to insufficient unsafe memory, the JVM crashes with SIGSEGV.
> Root cause:
> While sorting, all iterator threads wait at UnsafeSortDataRows.addRowBatch, as all iterators work on one row page.
> Only one iterator thread tries to allocate memory; before that, it frees the current page in handlePreviousPage().
> When the allocation fails, the row page still holds the old reference, so the next thread uses the same reference and calls handlePreviousPage() again.
> The JVM crashes because freed memory is accessed.
> Solution:
> When allocation fails, set the row page reference to null, so that the next thread will not do any operation.
-- This message was sent by Atlassian JIRA (v7.6.3#76005)
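The described fix, dropping the shared page reference when allocation fails, can be sketched as follows. All names here are hypothetical, not CarbonData's actual classes, and the unsafe memory is modeled by a simple `freed` flag.

```java
public class RowPageSketch {
  // Minimal model of the shared row page: once freed, touching it again
  // corresponds to the use-after-free that crashed the JVM.
  static final class Page {
    boolean freed;
    void write() {
      if (freed) throw new IllegalStateException("use after free");
    }
  }

  Page currentPage = new Page();

  // Free the current page, then try to allocate the next one. On failure the
  // stale reference must be dropped so other producer threads fail fast
  // (NullPointerException) instead of writing into freed unsafe memory.
  void rotatePage(boolean allocationSucceeds) {
    currentPage.freed = true; // handlePreviousPage(): frees the old page
    currentPage = null;       // the fix: clear the reference before allocating
    if (allocationSucceeds) {
      currentPage = new Page();
    }
  }

  public static void main(String[] args) {
    RowPageSketch sketch = new RowPageSketch();
    sketch.rotatePage(false);
    System.out.println(sketch.currentPage); // prints null
  }
}
```

A fail-fast null dereference is recoverable by the task retry mechanism; a SIGSEGV from freed off-heap memory is not.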
[GitHub] carbondata pull request #2663: [CARBONDATA-2894] Add support for complex map...
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2663#discussion_r214717788

    --- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/Field.java ---
    @@ -213,4 +218,58 @@ public String getColumnComment() {
       public void setColumnComment(String columnComment) {
         this.columnComment = columnComment;
       }
    +
    +  private void initComplexTypeChildren() {
    +    if (getDataType().isComplexType()) {
    +      StructField subFields = prepareSubFields(getFieldName(), getDataType());
    +      if (DataTypes.isArrayType(getDataType()) || DataTypes.isMapType(getDataType())) {
    +        children = subFields.getChildren();
    +      } else if (DataTypes.isStructType(getDataType())) {
    +        children = ((StructType) subFields.getDataType()).getFields();
    +      }
    +    }
    +  }
    +
    +  /**
    +   * prepare sub fields for complex types
    +   *
    +   * @param fieldName
    +   * @param dType
    --- End diff --

    ok

---
[GitHub] carbondata issue #2676: [CARBONDATA-2902][DataMap] Fix showing negative prun...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2676 LGTM ---
[GitHub] carbondata pull request #2675: [CARBONDATA-2901] Fixed JVM crash in Load sce...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2675 ---
[GitHub] carbondata issue #2675: [CARBONDATA-2901] Fixed JVM crash in Load scenario w...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2675 LGTM ---
[GitHub] carbondata pull request #2663: [CARBONDATA-2894] Add support for complex map...
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2663#discussion_r214716461

    --- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/Field.java ---
    @@ -213,4 +218,58 @@ public String getColumnComment() {
       public void setColumnComment(String columnComment) {
         this.columnComment = columnComment;
       }
    +
    +  private void initComplexTypeChildren() {
    +    if (getDataType().isComplexType()) {
    +      StructField subFields = prepareSubFields(getFieldName(), getDataType());
    +      if (DataTypes.isArrayType(getDataType()) || DataTypes.isMapType(getDataType())) {
    +        children = subFields.getChildren();
    +      } else if (DataTypes.isStructType(getDataType())) {
    +        children = ((StructType) subFields.getDataType()).getFields();
    +      }
    +    }
    +  }
    +
    +  /**
    +   * prepare sub fields for complex types
    +   *
    +   * @param fieldName
    +   * @param dType
    --- End diff --

    change to dataType and add all comment

---
[GitHub] carbondata issue #2687: [CARBONDATA-2876]Fix Avro decimal datatype with prec...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2687 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/189/ ---
[GitHub] carbondata issue #2465: [CARBONDATA-2863] Refactored CarbonFile interface
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2465 Hi @kunal642, are you still working on this PR? I think this PR makes the CarbonFile interface cleaner. Thanks for your effort. ---
[GitHub] carbondata issue #2681: [CARBONDATA-2906] Output segment size in SHOW SEGMEN...
Github user xuchuanyin commented on the issue: https://github.com/apache/carbondata/pull/2681 @jackylk Will it still be OK if the user performs update/delete operations? ---
[GitHub] carbondata issue #2687: [CARBONDATA-2876]Fix Avro decimal datatype with prec...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2687 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8260/ ---
[GitHub] carbondata issue #2628: [CARBONDATA-2851][CARBONDATA-2852] Support zstd as c...
Github user xuchuanyin commented on the issue: https://github.com/apache/carbondata/pull/2628 retest this please ---
[GitHub] carbondata issue #2598: [CARBONDATA-2811][BloomDataMap] Add query test case ...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2598 LGTM ---
[jira] [Closed] (CARBONDATA-2905) Should allow set stream property on streaming table
[ https://issues.apache.org/jira/browse/CARBONDATA-2905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jacky Li closed CARBONDATA-2905.
--------------------------------
    Resolution: Invalid

> Should allow set stream property on streaming table
> ---------------------------------------------------
>
> Key: CARBONDATA-2905
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2905
> Project: CarbonData
> Issue Type: Improvement
> Reporter: Jacky Li
> Assignee: Jacky Li
> Priority: Major
> Fix For: 1.5.0
>
> Time Spent: 1h
> Remaining Estimate: 0h
>
> For a streaming table with table property "streaming"="true", we should allow
> setting the streaming table property to false.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2680: [CARBONDATA-2905] Set stream property for str...
Github user jackylk closed the pull request at: https://github.com/apache/carbondata/pull/2680 ---
[GitHub] carbondata pull request #2682: [CARBONDATA-2907] Support setting blocklet si...
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2682#discussion_r214710189

    --- Diff: processing/src/main/java/org/apache/carbondata/processing/store/writer/v3/CarbonFactDataWriterImplV3.java ---
    @@ -68,11 +70,14 @@
       public CarbonFactDataWriterImplV3(CarbonFactDataHandlerModel model) {
         super(model);
    -    blockletSizeThreshold = Long.parseLong(CarbonProperties.getInstance()
    -        .getProperty(CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB,
    -            CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB_DEFAULT_VALUE))
    -        * CarbonCommonConstants.BYTE_TO_KB_CONVERSION_FACTOR
    -        * CarbonCommonConstants.BYTE_TO_KB_CONVERSION_FACTOR;
    +    String blockletSize =
    --- End diff --

    I added it in CarbonDDLSqlParser, and one test case is added.

---
[GitHub] carbondata pull request #2682: [CARBONDATA-2907] Support setting blocklet si...
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2682#discussion_r214706037

    --- Diff: core/src/main/java/org/apache/carbondata/core/metadata/schema/table/TableSchemaBuilder.java ---
    @@ -117,7 +116,7 @@ public TableSchema build() {
           property.put(CarbonCommonConstants.TABLE_BLOCKSIZE, String.valueOf(blockSize));
         }
         if (blockletSize > 0) {
    -      property.put(CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB, String.valueOf(blockletSize));
    --- End diff --

    CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB is "carbon.blockletgroup.size.in.mb". This string is a system property, but a normal table property has no dot, like `TABLE_BLOCKSIZE = "table_blocksize"`. So I added a `TABLE_BLOCKLET_SIZE = "table_blocklet_size"`.

---
[GitHub] carbondata pull request #2682: [CARBONDATA-2907] Support setting blocklet si...
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2682#discussion_r214705354

    --- Diff: processing/src/main/java/org/apache/carbondata/processing/store/writer/v3/CarbonFactDataWriterImplV3.java ---
    @@ -68,11 +70,14 @@
       public CarbonFactDataWriterImplV3(CarbonFactDataHandlerModel model) {
         super(model);
    -    blockletSizeThreshold = Long.parseLong(CarbonProperties.getInstance()
    -        .getProperty(CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB,
    -            CarbonV3DataFormatConstants.BLOCKLET_SIZE_IN_MB_DEFAULT_VALUE))
    -        * CarbonCommonConstants.BYTE_TO_KB_CONVERSION_FACTOR
    -        * CarbonCommonConstants.BYTE_TO_KB_CONVERSION_FACTOR;
    +    String blockletSize =
    +        model.getTableSpec().getCarbonTable().getTableInfo().getFactTable().getTableProperties()
    +            .get(TABLE_BLOCKLET_SIZE);
    +    if (blockletSize == null) {
    --- End diff --

    Not a problem. The blocklet size is read from the property for every load.

---
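The lookup-with-fallback discussed in this diff can be sketched as follows. This is a hedged simplification: the constant names mirror the diff, but the 64 MB default and the plain `Map` stand in for CarbonData's real `CarbonProperties`/table-spec plumbing.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: resolve the blocklet size threshold per load, preferring the
// table-level property and falling back to an assumed 64 MB system default.
public class BlockletSizeResolver {
    static final String TABLE_BLOCKLET_SIZE = "table_blocklet_size";
    static final String DEFAULT_BLOCKLET_SIZE_MB = "64";  // assumed default
    static final long BYTES_PER_MB = 1024L * 1024L;

    static long blockletSizeThreshold(Map<String, String> tableProperties) {
        String mb = tableProperties.get(TABLE_BLOCKLET_SIZE);
        if (mb == null) {
            mb = DEFAULT_BLOCKLET_SIZE_MB;  // table did not set it: use the default
        }
        return Long.parseLong(mb) * BYTES_PER_MB;
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        System.out.println(blockletSizeThreshold(props));  // default: 64 MB in bytes
        props.put(TABLE_BLOCKLET_SIZE, "8");
        System.out.println(blockletSizeThreshold(props));  // per-table override: 8 MB
    }
}
```

Because the property is read on every load (as the comment above notes), changing the table property takes effect on the next load without restarting anything.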
[GitHub] carbondata issue #2676: [CARBONDATA-2902][DataMap] Fix showing negative prun...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2676 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/187/ ---
[GitHub] carbondata issue #2676: [CARBONDATA-2902][DataMap] Fix showing negative prun...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2676 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8258/ ---
[jira] [Resolved] (CARBONDATA-2837) Add MV Example in examples module
[ https://issues.apache.org/jira/browse/CARBONDATA-2837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Liang Chen resolved CARBONDATA-2837.
------------------------------------
    Resolution: Fixed
    Assignee: Ravindra Pesala
    Fix Version/s: 1.4.2
                   1.5.0

> Add MV Example in examples module
> ---------------------------------
>
> Key: CARBONDATA-2837
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2837
> Project: CarbonData
> Issue Type: Improvement
> Reporter: Ravindra Pesala
> Assignee: Ravindra Pesala
> Priority: Major
> Fix For: 1.5.0, 1.4.2
>
> Time Spent: 6h
> Remaining Estimate: 0h

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2614: [CARBONDATA-2837] Added MVExample in example ...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2614 ---
[GitHub] carbondata issue #2614: [CARBONDATA-2837] Added MVExample in example module
Github user chenliang613 commented on the issue: https://github.com/apache/carbondata/pull/2614 LGTM ---
[GitHub] carbondata issue #2686: upgrade to scala 2.12.6 and binary 2.11
Github user chenliang613 commented on the issue: https://github.com/apache/carbondata/pull/2686 Same question as @zzcclp. It would be better to raise a discussion first on the mailing list. ---
[GitHub] carbondata issue #2687: [CARBONDATA-2876]Fix Avro decimal datatype with prec...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2687 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8257/ ---
[GitHub] carbondata issue #2687: [CARBONDATA-2876]Fix Avro decimal datatype with prec...
Github user Indhumathi27 commented on the issue: https://github.com/apache/carbondata/pull/2687 Retest this please ---
[GitHub] carbondata issue #2644: [CARBONDATA-2853] Implement file-level min/max index...
Github user QiangCai commented on the issue: https://github.com/apache/carbondata/pull/2644 @ravipesala the blocklet level min/max will be added in another PR. ---
[GitHub] carbondata issue #2642: [CARBONDATA-2532][Integration] Carbon to support spa...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2642 @sujith71955 Please check MV module is failing with compilation issues. http://136.243.101.176:8080/job/ManualApacheCarbonPRBuilder2.1/179/ ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2663 LGTM ---
[GitHub] carbondata pull request #2687: [CARBONDATA-2876]Fix Avro decimal datatype wi...
GitHub user Indhumathi27 opened a pull request:

    https://github.com/apache/carbondata/pull/2687

    [CARBONDATA-2876]Fix Avro decimal datatype with precision and scale

    **What is this PR for?**
    Add precision and scale to the field value for the Avro decimal logical type.

    - [ ] Any interfaces changed?
    - [ ] Any backward compatibility impacted?
    - [ ] Document update required?
    - [ ] Testing done
    - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/Indhumathi27/carbondata decimal_fix

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2687.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2687

commit a6e5149c12a7427a00fdc634cd0d35aca077
Author: Indhumathi27
Date: 2018-09-03T12:05:01Z

    Fix decimal type for Avro

---
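The conversion this PR applies can be sketched in isolation. This is a simplified illustration: the real code reads the scale from the Avro `LogicalTypes.Decimal` and the charset from `CarbonCommonConstants`, while here a plain scale parameter and UTF-8 are assumed; like the PR, it treats the BYTES payload as a decimal string and rescales it with HALF_UP rounding.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Sketch: decode an Avro BYTES field carrying a decimal string and apply
// the declared scale, rounding HALF_UP as in the PR's conversion.
public class AvroDecimalConvert {
    static BigDecimal toDecimal(ByteBuffer fieldValue, int scale) {
        String raw = new String(fieldValue.array(), StandardCharsets.UTF_8);
        return new BigDecimal(raw).setScale(scale, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.wrap("123.4567".getBytes(StandardCharsets.UTF_8));
        System.out.println(toDecimal(buf, 2)); // 123.46
    }
}
```

Note that `setScale(..., RoundingMode.HALF_UP)` is what prevents the `ArithmeticException` that a bare `setScale(scale)` would throw when the value has more fractional digits than the declared scale.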
[GitHub] carbondata issue #2672: [HOTFIX] improve sdk multi-thread performance
Github user kumarvishal09 commented on the issue: https://github.com/apache/carbondata/pull/2672 LGTM ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2663 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8256/ ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2663 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/185/ ---
[GitHub] carbondata issue #2644: [CARBONDATA-2853] Implement file-level min/max index...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2644 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/184/ ---
[GitHub] carbondata issue #2675: [CARBONDATA-2901] Fixed JVM crash in Load scenario w...
Github user kumarvishal09 commented on the issue: https://github.com/apache/carbondata/pull/2675 LGTM ---
[GitHub] carbondata issue #2644: [CARBONDATA-2853] Implement file-level min/max index...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2644 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8255/ ---
[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/183/ ---
[GitHub] carbondata issue #2654: [CARBONDATA-2896] Adaptive Encoding for Primitive da...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2654 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/8254/ ---
[GitHub] carbondata issue #2681: [CARBONDATA-2906] Output segment size in SHOW SEGMEN...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2681 LGTM ---
[GitHub] carbondata issue #2674: [CARBONDATA-2900] Add dynamic configuration for syst...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2674 @jackylk Do these properties need to be updated in the CarbonData documents as well? ---
[GitHub] carbondata issue #2663: [CARBONDATA-2894] Add support for complex map type t...
Github user manishgupta88 commented on the issue: https://github.com/apache/carbondata/pull/2663 @ravipesala ...handled review comments. Kindly review and merge ---