[GitHub] carbondata pull request #2503: [CARBONDATA-2734] Update is not working on th...
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2503#discussion_r204202741

    --- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---
    @@ -3235,4 +3235,17 @@ public boolean accept(CarbonFile file) {
         int version = fileHeader.getVersion();
         return ColumnarFormatVersion.valueOf((short)version);
       }
    +
    +  /**
    +   * Check whether it is a standard table, meaning the table path has the Fact/Part0/Segment_ tail
    +   * with all carbon files under it. In other cases carbon files are present directly under the
    +   * table path or under tablepath/partition folders.
    +   * TODO: Read the segment file and the corresponding index file to get the correct carbondata
    +   * file instead of using this way.
    +   * @param table
    +   * @return
    +   */
    +  public static boolean isStandardCarbonTable(CarbonTable table) {
    +    return !(table.isSupportFlatFolder() || table.isHivePartitionTable());
    +  }
    --- End diff --

    Please add a check for the SDK case also: CarbonTable.isTransactionalTable()
---
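The reviewer is asking that non-transactional (SDK-written) tables also be excluded from the "standard table" check. A minimal sketch of that change follows; the `CarbonTable` here is a hypothetical stand-in reduced to the three flags involved, not the real CarbonData class:

```java
// Hedged sketch of the reviewed method plus the extra check requested in the
// review (CarbonTable.isTransactionalTable() for the SDK case).
// This CarbonTable is an illustrative stand-in, not CarbonData's real class.
public class StandardTableCheck {
    static class CarbonTable {
        private final boolean flatFolder, hivePartition, transactional;
        CarbonTable(boolean flatFolder, boolean hivePartition, boolean transactional) {
            this.flatFolder = flatFolder;
            this.hivePartition = hivePartition;
            this.transactional = transactional;
        }
        boolean isSupportFlatFolder() { return flatFolder; }
        boolean isHivePartitionTable() { return hivePartition; }
        boolean isTransactionalTable() { return transactional; }
    }

    // Standard layout (tablePath/Fact/Part0/Segment_x) applies only to
    // transactional tables that are neither flat-folder nor Hive-partitioned.
    public static boolean isStandardCarbonTable(CarbonTable table) {
        return !(table.isSupportFlatFolder()
            || table.isHivePartitionTable()
            || !table.isTransactionalTable());
    }
}
```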
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2531 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5943/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2533 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5942/ ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2534 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5941/ ---
[GitHub] carbondata issue #2529: [CARBONDATA-2760] Reduce Memory footprint and store ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2529 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5940/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6132/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7371/ ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2534 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6131/ ---
[GitHub] carbondata issue #2503: [CARBONDATA-2734] Update is not working on the table...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2503 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5939/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user akashrn5 commented on the issue: https://github.com/apache/carbondata/pull/2533 retest this please ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2534 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7370/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6130/ ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user dhatchayani commented on the issue: https://github.com/apache/carbondata/pull/2534 Retest this please ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2530 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5938/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7369/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6129/ ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2534 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7368/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user brijoobopanna commented on the issue: https://github.com/apache/carbondata/pull/2531 retest sdv please ---
[GitHub] carbondata issue #2534: [CARBONDATA-2753] Fix Compatibility issues on index ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2534 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6128/ ---
[GitHub] carbondata issue #2529: [CARBONDATA-2760] Reduce Memory footprint and store ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2529 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6127/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7367/ ---
[GitHub] carbondata issue #2529: [CARBONDATA-2760] Reduce Memory footprint and store ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2529 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7366/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6125/ ---
[GitHub] carbondata issue #2533: [CARBONDATA-2765]handle flat folder support for impl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2533 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7364/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2530 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5937/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user kumarvishal09 commented on the issue: https://github.com/apache/carbondata/pull/2531 LGTM ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6123/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user dhatchayani commented on the issue: https://github.com/apache/carbondata/pull/2530 retest this please ---
[GitHub] carbondata pull request #2534: [CARBONDATA-2753] Fix Compatibility issues on...
GitHub user dhatchayani opened a pull request:

    https://github.com/apache/carbondata/pull/2534

    [CARBONDATA-2753] Fix Compatibility issues on index Files

    **Problem:**
    Currently, in the segmentFile we write the index files list in the `files` field only if the index files exist; otherwise it is left empty (in case they have been merged into a merge index file). But in the old store, both the `files` and `mergeFileName` fields were written even if the index files were merged.

    **Solution:**
    While querying, we have to check the physical existence of the index files listed in the `files` field. Only if a file physically exists should it be considered.

     - [ ] Any interfaces changed?
     - [ ] Any backward compatibility impacted?
     - [ ] Document update required?
     - [x] Testing done
           Manual testing with old store
     - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dhatchayani/carbondata CARBONDATA-2753_indexFiles

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2534.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2534

----
commit e257177c182783fb6c78507d4d8ad9823022acb6
Author: dhatchayani
Date:   2018-07-20T14:36:00Z

    [CARBONDATA-2753] Fix Compatibility issues on index Files

---
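The solution paragraph above can be sketched as a small resolution rule: keep only the listed index files that still exist physically, and fall back to the merge index file when none survive. The method name and the `exists` predicate (standing in for a file-system check) are illustrative assumptions, not CarbonData's real API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hedged sketch of the compatibility rule: trust the index files listed in the
// segment file's `files` field only where they still exist physically (the old
// store wrote the list even after merging them); otherwise fall back to the
// merge index file. Names here are illustrative, not CarbonData's real API.
public class IndexFileResolver {
    public static List<String> resolveIndexFiles(List<String> listedFiles,
            String mergeFileName, Predicate<String> exists) {
        List<String> valid = new ArrayList<>();
        for (String name : listedFiles) {
            if (exists.test(name)) {
                valid.add(name);  // index file still present on disk: use it
            }
        }
        if (valid.isEmpty() && mergeFileName != null) {
            valid.add(mergeFileName);  // entries must be read from the merge index
        }
        return valid;
    }
}
```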
[GitHub] carbondata issue #2524: [CARBONDATA-2532][Integration] Carbon to support spa...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2524 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6126/ ---
[GitHub] carbondata issue #2524: [CARBONDATA-2532][Integration] Carbon to support spa...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2524 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7365/ ---
[GitHub] carbondata pull request #2533: [CARBONDATA-2765]handle flat folder support f...
GitHub user akashrn5 opened a pull request:

    https://github.com/apache/carbondata/pull/2533

    [CARBONDATA-2765]handle flat folder support for implicit column

    Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

     - [ ] Any interfaces changed?
     - [ ] Any backward compatibility impacted?
     - [ ] Document update required?
     - [ ] Testing done
           Please provide details on
           - Whether new unit test cases have been added or why no new tests are required?
           - How it is tested? Please attach test report.
           - Is it a performance related change? Please attach the performance test report.
           - Any additional information to help reviewers in testing this change.
     - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/akashrn5/incubator-carbondata impli

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2533.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2533

----
commit b5fc1f061b238be3d77445e4eb6b15d273e26069
Author: akashrn5
Date:   2018-07-20T07:16:32Z

    handle flat folder support for implicit column

---
[jira] [Created] (CARBONDATA-2765) Handle flat folder for implicit column
Akash R Nilugal created CARBONDATA-2765:
-------------------------------------------

             Summary: Handle flat folder for implicit column
                 Key: CARBONDATA-2765
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2765
             Project: CarbonData
          Issue Type: Bug
            Reporter: Akash R Nilugal
            Assignee: Akash R Nilugal

Handle flat folder for implicit column. For the implicit column, the blocklet id was derived wrongly because of the path: with flat folder support, the carbondata files are present directly under the table path.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
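The layout difference behind this bug can be illustrated with a tiny hypothetical helper (not CarbonData code): for a standard table the data file lives under `<tablePath>/Fact/Part0/Segment_x/`, while with flat folder support it sits directly under `<tablePath>`, so any id derivation that assumes the `Fact/Part0/Segment_x` tail computes a wrong relative path. Taking the path relative to the table path itself works for both layouts:

```java
// Illustrative helper only: compute a data file's path relative to the table
// path, which stays correct both for the standard layout
// (<tablePath>/Fact/Part0/Segment_x/...) and the flat-folder layout
// (<tablePath>/... directly).
public class BlockPathDemo {
    public static String relativeToTablePath(String tablePath, String filePath) {
        String rel = filePath.startsWith(tablePath)
            ? filePath.substring(tablePath.length())
            : filePath;
        return rel.startsWith("/") ? rel.substring(1) : rel;
    }
}
```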
[GitHub] carbondata issue #2503: [CARBONDATA-2734] Update is not working on the table...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2503 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6124/ ---
[GitHub] carbondata issue #2503: [CARBONDATA-2734] Update is not working on the table...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2503 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7363/ ---
[GitHub] carbondata issue #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when buildin...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2526 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5936/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7362/ ---
[GitHub] carbondata issue #2503: [CARBONDATA-2734] Update is not working on the table...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2503 @manishgupta88 Ok, done the necessary changes ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7360/ ---
[GitHub] carbondata issue #2532: [CARBONDATA-2759]Add Bad_Records_Options to STMPROPE...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2532 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5935/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2531 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6120/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7358/ ---
[GitHub] carbondata issue #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when buildin...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2526 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6119/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2531 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7357/ ---
[GitHub] carbondata issue #2514: [CARBONDATA-2740]segment file is not getting deleted...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2514 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5934/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6117/ ---
[GitHub] carbondata pull request #2518: [CARBONDATA-2754] fixing testcases if HiveMet...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2518 ---
[GitHub] carbondata issue #2518: [CARBONDATA-2754] fixing testcases if HiveMetastore ...
Github user manishgupta88 commented on the issue: https://github.com/apache/carbondata/pull/2518 LGTM ---
[GitHub] carbondata issue #2525: [CARBONDATA-2756] refactored code to use ZSTD compre...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2525 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7359/ ---
[GitHub] carbondata issue #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when buildin...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2526 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7355/ ---
[GitHub] carbondata pull request #2523: [CARBONDATA-2753] Fix Compatibility issue wit...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2523 ---
[GitHub] carbondata issue #2522: [CARBONDATA-2752][CARBONSTORE] Carbon provide Zeppel...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2522 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6116/ ---
[GitHub] carbondata issue #2523: [CARBONDATA-2753] Fix Compatibility issue with Preag...
Github user kumarvishal09 commented on the issue: https://github.com/apache/carbondata/pull/2523 LGTM ---
[GitHub] carbondata pull request #2531: [HOTFIX] Improved BlockDataMap caching perfor...
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2531#discussion_r204000404

    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/block/SegmentPropertiesAndSchemaHolder.java ---
    @@ -237,6 +241,32 @@ public void invalidate(String segmentId, int segmentPropertiesIndex,
                 .isEmpty()) {
               indexToSegmentPropertiesWrapperMapping.remove(segmentPropertiesIndex);
               segmentPropWrapperToSegmentSetMap.remove(segmentPropertiesWrapper);
    +        } else if (!clearSegmentWrapperFromMap
    +            && segmentIdAndSegmentPropertiesIndexWrapper.segmentIdSet.isEmpty()) {
    +          // min max columns can vary when the cache is modified. So even though the entry
    +          // need not be deleted from the map, clear the column cache so it can be filled again
    +          segmentPropertiesWrapper.clear();
    +          LOGGER.info("cleared min max for segmentProperties at index: " + segmentPropertiesIndex);
    +        }
    +      }
    +    }
    +
    +    /**
    +     * add segmentId at given segmentPropertyIndex
    +     * Note: This method is getting used in extension with other features. Please do not remove
    +     *
    +     * @param segmentPropertiesIndex
    +     * @param segmentId
    +     */
    +    public void addSegmentId(int segmentPropertiesIndex, String segmentId) {
    +      SegmentPropertiesWrapper segmentPropertiesWrapper =
    +          indexToSegmentPropertiesWrapperMapping.get(segmentPropertiesIndex);
    +      if (null != segmentPropertiesWrapper) {
    +        SegmentIdAndSegmentPropertiesIndexWrapper segmentIdAndSegmentPropertiesIndexWrapper =
    +            segmentPropWrapperToSegmentSetMap.get(segmentPropertiesWrapper);
    +        synchronized (segmentPropertiesWrapper.getTableIdentifier().getCarbonTableIdentifier()
    --- End diff --

    ok
---
[GitHub] carbondata pull request #2531: [HOTFIX] Improved BlockDataMap caching perfor...
Github user kumarvishal09 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2531#discussion_r20322

    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/block/SegmentPropertiesAndSchemaHolder.java ---
    @@ -237,6 +241,32 @@ public void invalidate(String segmentId, int segmentPropertiesIndex,
                 .isEmpty()) {
               indexToSegmentPropertiesWrapperMapping.remove(segmentPropertiesIndex);
               segmentPropWrapperToSegmentSetMap.remove(segmentPropertiesWrapper);
    +        } else if (!clearSegmentWrapperFromMap
    +            && segmentIdAndSegmentPropertiesIndexWrapper.segmentIdSet.isEmpty()) {
    +          // min max columns can vary when the cache is modified. So even though the entry
    +          // need not be deleted from the map, clear the column cache so it can be filled again
    +          segmentPropertiesWrapper.clear();
    +          LOGGER.info("cleared min max for segmentProperties at index: " + segmentPropertiesIndex);
    +        }
    +      }
    +    }
    +
    +    /**
    +     * add segmentId at given segmentPropertyIndex
    +     * Note: This method is getting used in extension with other features. Please do not remove
    +     *
    +     * @param segmentPropertiesIndex
    +     * @param segmentId
    +     */
    +    public void addSegmentId(int segmentPropertiesIndex, String segmentId) {
    +      SegmentPropertiesWrapper segmentPropertiesWrapper =
    +          indexToSegmentPropertiesWrapperMapping.get(segmentPropertiesIndex);
    +      if (null != segmentPropertiesWrapper) {
    +        SegmentIdAndSegmentPropertiesIndexWrapper segmentIdAndSegmentPropertiesIndexWrapper =
    +            segmentPropWrapperToSegmentSetMap.get(segmentPropertiesWrapper);
    +        synchronized (segmentPropertiesWrapper.getTableIdentifier().getCarbonTableIdentifier()
    --- End diff --

    Use getOrCreateTableLock
---
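The "Use getOrCreateTableLock" suggestion points at a common pitfall: synchronizing on an object obtained from a lookup (here, the table identifier) only works if every caller gets the identical instance. A dedicated per-table lock object held in a concurrent map guarantees one monitor per table. A generic sketch of that pattern (class and method names are illustrative, not CarbonData's actual helper):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of a getOrCreateTableLock-style helper: one dedicated
// lock object per table name, created atomically on first use, so every
// caller synchronizes on the same monitor for the same table.
public class TableLockRegistry {
    private final Map<String, Object> locks = new ConcurrentHashMap<>();

    public Object getOrCreateTableLock(String tableUniqueName) {
        // computeIfAbsent is atomic: concurrent callers for the same key
        // always receive the same lock Object.
        return locks.computeIfAbsent(tableUniqueName, k -> new Object());
    }
}
```

Callers then write `synchronized (registry.getOrCreateTableLock(name)) { ... }` instead of locking on an identifier instance that may differ between lookups.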
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2531 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7354/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7353/ ---
[jira] [Resolved] (CARBONDATA-2742) [MV] Wrong data displayed after MV creation.
[ https://issues.apache.org/jira/browse/CARBONDATA-2742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala resolved CARBONDATA-2742.
-----------------------------------------
       Resolution: Duplicate
    Fix Version/s: 1.4.1

> [MV] Wrong data displayed after MV creation.
> --------------------------------------------
>
>                 Key: CARBONDATA-2742
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2742
>             Project: CarbonData
>          Issue Type: Bug
>            Reporter: Babulal
>            Assignee: xuchuanyin
>            Priority: Major
>             Fix For: 1.4.1
>
> 0: jdbc:hive2://10.18.16.173:23040/default> create table mytest_48 (rownumber int, name string, m1 int) stored by 'carbondata';
> No rows selected (1.267 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> load data inpath 'hdfs://hacluster/tmp/babu/testdata_1.csv' into table mytest_48;
> 0: jdbc:hive2://10.18.16.173:23040/default> show datamap on table mytest_48;
> +--------------+------------+-------------------+---------------------+
> | DataMapName  | ClassName  | Associated Table  | DataMap Properties  |
> +--------------+------------+-------------------+---------------------+
> +--------------+------------+-------------------+---------------------+
> No rows selected (0.162 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> select * from mytest_48;
> +------------+-------+--------+
> | rownumber  | name  | m1     |
> +------------+-------+--------+
> | 1          | aaa   | 1000   |
> | 2          | aaa   | 65000  |
> | 3          | aaa   | 100    |
> | 1          | ddd   | 1000   |
> | 2          | ddd   | 65000  |
> | 3          | ddd   | 100    |
> +------------+-------+--------+
> 6 rows selected (1.266 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> create datamap map9 using 'mv' as select sum(m1),name from mytest_48 group by name;
> No rows selected (0.82 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> select sum(m1),name from mytest_48 group by name;
> +----------+-------+
> | sum(m1)  | name  |
> +----------+-------+
> +----------+-------+
> No rows selected (2.615 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> explain select sum(m1),name from mytest_48 group by name;
> == CarbonData Profiler ==
> == Physical Plan ==
> *HashAggregate(keys=[mytest_48_name#297], functions=[sum(sum_m1#296L)])
> +- Exchange hashpartitioning(mytest_48_name#297, 200)
>    +- *HashAggregate(keys=[mytest_48_name#297], functions=[partial_sum(sum_m1#296L)])
>       +- *BatchedScan CarbonDatasourceHadoopRelation [ Database name :babu, Table name :map9_table, Schema
[jira] [Commented] (CARBONDATA-2742) [MV] Wrong data displayed after MV creation.
[ https://issues.apache.org/jira/browse/CARBONDATA-2742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16550592#comment-16550592 ]

Ravindra Pesala commented on CARBONDATA-2742:
---------------------------------------------

This issue is duplicated by https://issues.apache.org/jira/browse/CARBONDATA-2530

> [MV] Wrong data displayed after MV creation.
> --------------------------------------------
>
>                 Key: CARBONDATA-2742
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2742
>             Project: CarbonData
>          Issue Type: Bug
>            Reporter: Babulal
>            Assignee: xuchuanyin
>            Priority: Major
[jira] [Resolved] (CARBONDATA-2758) selection on local dictionary fails when column having all null values more than default batch size.
[ https://issues.apache.org/jira/browse/CARBONDATA-2758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

kumar vishal resolved CARBONDATA-2758.
--------------------------------------
    Resolution: Fixed

> selection on local dictionary fails when column having all null values more than default batch size.
> ----------------------------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-2758
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2758
>             Project: CarbonData
>          Issue Type: Bug
>          Components: spark-integration
>    Affects Versions: 1.5.0
>         Environment: 3 node cluster having spark-2.2
>            Reporter: Jatin
>            Assignee: Jatin
>            Priority: Minor
>             Fix For: 1.5.0
>
>          Time Spent: 1h
>  Remaining Estimate: 0h
>
> ArrayIndexOutOfBoundsException is thrown by the following steps:
> 1. create table t1(s1 int, s2 string, s3 string) stored by 'carbondata' TBLPROPERTIES('SORT_SCOPE'='BATCH_SORT')
> 2. load from a csv having all null values in at least 4097 rows, or
>    insert into t1 select cast(null as int), cast(null as string), cast(null as string) 5000 times
> 3. select * from t1;
>
> Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 4 times, most recent failure: Lost task 0.3 in stage 6.0 (TID 207, BLR114267, executor 1):
> java.lang.ArrayIndexOutOfBoundsException: 4096
>     at org.apache.carbondata.spark.vectorreader.ColumnarVectorWrapper.putNull(ColumnarVectorWrapper.java:181)
>     at org.apache.carbondata.core.datastore.chunk.store.impl.LocalDictDimensionDataChunkStore.fillRow(LocalDictDimensionDataChunkStore.java:63)
>     at org.apache.carbondata.core.datastore.chunk.impl.VariableLengthDimensionColumnPage.fillVector(VariableLengthDimensionColumnPage.java:117)
>     at org.apache.carbondata.core.scan.result.BlockletScannedResult.fillColumnarNoDictionaryBatch(BlockletScannedResult.java:260)
>     at org.apache.carbondata.core.scan.collector.impl.DictionaryBasedVectorResultCollector.fillResultToColumnarBatch(DictionaryBasedVectorResultCollector.java:166)
>     at org.apache.carbondata.core.scan.collector.impl.DictionaryBasedVectorResultCollector.collectResultInColumnarBatch(DictionaryBasedVectorResultCollector.java:157)
>     at org.apache.carbondata.core.scan.processor.DataBlockIterator.processNextBatch(DataBlockIterator.java:245)
>     at org.apache.carbondata.core.scan.result.iterator.VectorDetailQueryResultIterator.processNextBatch(VectorDetailQueryResultIterator.java:48)
>     at org.apache.carbondata.spark.vectorreader.VectorizedCarbonRecordReader.nextBatch(VectorizedCarbonRecordReader.java:307)
>     at org.apache.carbondata.spark.vectorreader.VectorizedCarbonRecordReader.nextKeyValue(VectorizedCarbonRecordReader.java:182)
>     at org.apache.carbondata.spark.rdd.CarbonScanRDD$$anon$1.hasNext(CarbonScanRDD.scala:497)
>     at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.scan_nextBatch$(Unknown Source)
>     at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.agg_doAggregateWithKeys$(Unknown Source)
>     at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
>     at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
>     at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:381)
>     at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
>     at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:126)

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2527: [CARBONDATA-2758] Fix for filling data with e...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2527 ---
[GitHub] carbondata issue #2520: [CARBONDATA-2750] Added Documentation for Local Dict...
Github user sgururajshetty commented on the issue: https://github.com/apache/carbondata/pull/2520 LGTM ---
[GitHub] carbondata pull request #2525: [CARBONDATA-2756] refactored code to use ZSTD...
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2525#discussion_r203991119

    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/filesystem/LocalCarbonFile.java ---
    @@ -375,7 +381,15 @@ public DataOutputStream getDataOutputStream(String path, FileFactory.FileType fi
         } else if ("ZSTD".equalsIgnoreCase(compressor)) {
           // compression level 1 is cost-effective for sort temp file
           // which is not used for storage
    -      outputStream = new ZstdOutputStream(new FileOutputStream(path), 1);
    +      FileOutputStream fileOutputStream = new FileOutputStream(path);
    +      try {
    +        outputStream = (OutputStream) Class.forName("com.github.luben.zstd.ZstdOutputStream")
    +            .getConstructor(OutputStream.class, int.class).newInstance(fileOutputStream, 1);
    +      } catch (ReflectiveOperationException e) {
    +        throw new IOException(e);
    +      } finally {
    +        fileOutputStream.close();
    --- End diff --

    Don't close the stream here; it should be closed by the caller after its usage is complete. Close it only in case of failure, in the catch block, and catch Throwable. The same comment applies to the code changes above as well.
---
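The resource-handling rule the reviewer asks for can be sketched as follows. `GZIPOutputStream` stands in here for the reflectively created `ZstdOutputStream`, purely so the example runs without the zstd jar; the helper name is an assumption, not the patch's actual code:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

// Sketch of the pattern: when wrapping a freshly opened stream, close the
// underlying stream only if wrapping fails; on success, ownership passes to
// the caller through the wrapper, which closes the inner stream on close().
public class WrapStream {
    public static OutputStream wrap(OutputStream underlying) throws IOException {
        try {
            return new GZIPOutputStream(underlying);
        } catch (Throwable t) {
            underlying.close();  // release the file handle only on failure
            throw (t instanceof IOException) ? (IOException) t : new IOException(t);
        }
    }
}
```

Closing the raw stream in a `finally` block, as the original patch did, would hand the caller a wrapper over an already-closed stream.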
[jira] [Assigned] (CARBONDATA-2742) [MV] Wrong data displayed after MV creation.
[ https://issues.apache.org/jira/browse/CARBONDATA-2742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

xuchuanyin reassigned CARBONDATA-2742:
--------------------------------------
    Assignee: xuchuanyin

> [MV] Wrong data displayed after MV creation.
> --------------------------------------------
>
>                 Key: CARBONDATA-2742
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2742
>             Project: CarbonData
>          Issue Type: Bug
>            Reporter: Babulal
>            Assignee: xuchuanyin
>            Priority: Major
>
> 0: jdbc:hive2://10.18.16.173:23040/default> create table mytest_48 (rownumber int, name string, m1 int) stored by 'carbondata';
> No rows selected (1.267 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> load data inpath 'hdfs://hacluster/tmp/babu/testdata_1.csv' into table mytest_48;
> 0: jdbc:hive2://10.18.16.173:23040/default> show datamap on table mytest_48;
> +--------------+------------+-------------------+---------------------+
> | DataMapName  | ClassName  | Associated Table  | DataMap Properties  |
> +--------------+------------+-------------------+---------------------+
> +--------------+------------+-------------------+---------------------+
> No rows selected (0.162 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> select * from mytest_48;
> +------------+-------+--------+
> | rownumber  | name  | m1     |
> +------------+-------+--------+
> | 1          | aaa   | 1000   |
> | 2          | aaa   | 65000  |
> | 3          | aaa   | 100    |
> | 1          | ddd   | 1000   |
> | 2          | ddd   | 65000  |
> | 3          | ddd   | 100    |
> +------------+-------+--------+
> 6 rows selected (1.266 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> create datamap map9 using 'mv' as select sum(m1),name from mytest_48 group by name;
> No rows selected (0.82 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> select sum(m1),name from mytest_48 group by name;
> No rows selected (2.615 seconds)
> 0: jdbc:hive2://10.18.16.173:23040/default> explain select sum(m1),name from mytest_48 group by name;
> == CarbonData Profiler ==
> == Physical Plan ==
> *HashAggregate(keys=[mytest_48_name#297], functions=[sum(sum_m1#296L)])
> +- Exchange hashpartitioning(mytest_48_name#297, 200)
>    +- *HashAggregate(keys=[mytest_48_name#297], functions=[partial_sum(sum_m1#296L)])
>       +- *BatchedScan CarbonDatasourceHadoopRelation [ Database name :babu, Table name :map9_table, Schema :Some(StructType(StructField(sum_m1,LongType,true),
[GitHub] carbondata issue #2527: [CARBONDATA-2758] Fix for filling data with enabled ...
Github user kumarvishal09 commented on the issue: https://github.com/apache/carbondata/pull/2527 LGTM ---
[GitHub] carbondata issue #2517: [CARBONDATA-2749][dataload] In HDFS Empty tablestatu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2517 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6113/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6114/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2530 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5933/ ---
[GitHub] carbondata pull request #2532: [CARBONDATA-2759]Add Bad_Records_Options to S...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2532 ---
[GitHub] carbondata pull request #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when ...
Github user kevinjmh commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2526#discussion_r203986320 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/DecimalColumnPage.java --- @@ -106,4 +109,48 @@ public void setDoublePage(double[] doubleData) { throw new UnsupportedOperationException("invalid data type: " + dataType); } + private BigDecimal getDecimalFromRawData(int rowId) { --- End diff -- Fixed ---
[GitHub] carbondata pull request #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when ...
Github user kevinjmh commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2526#discussion_r203986298

--- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/AbstractBloomDataMapWriter.java ---
@@ -129,8 +130,12 @@ protected void addValue2BloomIndex(int indexColIdx, Object value) {
     // convert non-dict dimensions to simple bytes without length
     // convert internal-dict dimensions to simple bytes without any encode
     if (indexColumns.get(indexColIdx).isMeasure()) {
-      if (value == null) {
-        value = DataConvertUtil.getNullValueForMeasure(indexColumns.get(indexColIdx).getDataType());
+      // NULL value of all measures are already processed in `ColumnPage.getData`
+      // or `RawBytesReadSupport.readRow` with actual data type
+
+      // Carbon stores boolean as byte. Here we convert it for `getValueAsBytes`
+      if (value instanceof Boolean) {
--- End diff --

Yes. FYI, a column of boolean type uses a column page with inner data type Byte, and there is some disagreement on DataType between `getData` and `getNull`. I will change that. ---
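The boolean-to-byte normalization under discussion can be illustrated with a small stand-alone sketch. Carbon stores boolean columns as bytes internally, so a measure value surfaced as `java.lang.Boolean` must be turned back into a `Byte` before a byte-level conversion can handle it. The helper name `normalizeMeasureValue` is hypothetical, not the actual `AbstractBloomDataMapWriter` code.

```java
public class BooleanToByteSketch {

    // Hypothetical helper: normalize a measure value read back as Boolean
    // into the Byte representation Carbon stores internally, so that a
    // byte-oriented conversion (like getValueAsBytes) can handle it.
    static Object normalizeMeasureValue(Object value) {
        if (value instanceof Boolean) {
            return (byte) (((Boolean) value) ? 1 : 0);
        }
        return value; // other measure types pass through unchanged
    }

    public static void main(String[] args) {
        System.out.println(normalizeMeasureValue(Boolean.TRUE));   // prints 1
        System.out.println(normalizeMeasureValue(Boolean.FALSE));  // prints 0
    }
}
```

xuchuanyin's follow-up question in this thread is whether the branch should key on the column's DataType rather than `instanceof`; either works as long as the page's inner type for boolean really is Byte.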
[GitHub] carbondata issue #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when buildin...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2526 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6115/ ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2530 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7350/ ---
[GitHub] carbondata issue #2522: [CARBONDATA-2752][CARBONSTORE] Carbon provide Zeppel...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2522 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7352/ ---
[GitHub] carbondata issue #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when buildin...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2526 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7351/ ---
[GitHub] carbondata issue #2517: [CARBONDATA-2749][dataload] In HDFS Empty tablestatu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2517 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7349/ ---
[GitHub] carbondata pull request #491: [CARBONDATA-583] Add replace function support ...
Github user phalodi closed the pull request at: https://github.com/apache/carbondata/pull/491 ---
[GitHub] carbondata issue #2530: [WIP][CARBONDATA-2753] Fix Compatibility issues
Github user dhatchayani commented on the issue: https://github.com/apache/carbondata/pull/2530 retest this please ---
[GitHub] carbondata issue #2525: [CARBONDATA-2756] refactored code to use ZSTD compre...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2525 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6111/ ---
[GitHub] carbondata issue #2532: [CARBONDATA-2759]Add Bad_Records_Options to STMPROPE...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2532 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6110/ ---
[GitHub] carbondata issue #2525: [CARBONDATA-2756] refactored code to use ZSTD compre...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2525 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7347/ ---
[GitHub] carbondata pull request #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2526#discussion_r203962823

--- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/AbstractBloomDataMapWriter.java ---
@@ -129,8 +130,12 @@ protected void addValue2BloomIndex(int indexColIdx, Object value) {
     // convert non-dict dimensions to simple bytes without length
     // convert internal-dict dimensions to simple bytes without any encode
     if (indexColumns.get(indexColIdx).isMeasure()) {
-      if (value == null) {
-        value = DataConvertUtil.getNullValueForMeasure(indexColumns.get(indexColIdx).getDataType());
+      // NULL value of all measures are already processed in `ColumnPage.getData`
+      // or `RawBytesReadSupport.readRow` with actual data type
+
+      // Carbon stores boolean as byte. Here we convert it for `getValueAsBytes`
+      if (value instanceof Boolean) {
--- End diff --

Can we use the datatype as the condition? ---
[GitHub] carbondata issue #2529: [CARBONDATA-2760] Reduce Memory footprint and store ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2529 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5932/ ---
[GitHub] carbondata pull request #2526: [CARBONDATA-2757][BloomDataMap] Fix bug when ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2526#discussion_r203962551 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/DecimalColumnPage.java --- @@ -106,4 +109,48 @@ public void setDoublePage(double[] doubleData) { throw new UnsupportedOperationException("invalid data type: " + dataType); } + private BigDecimal getDecimalFromRawData(int rowId) { --- End diff -- Please add comment to describe when to use this method and the below method ---
[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2484 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7343/ ---
[GitHub] carbondata issue #2484: [HOTFIX] added hadoop conf to thread local
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2484 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6107/ ---
[jira] [Updated] (CARBONDATA-2757) Fix bug when building bloomfilter on measure column
[ https://issues.apache.org/jira/browse/CARBONDATA-2757?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiangmanhua updated CARBONDATA-2757:
------------------------------------
    Summary: Fix bug when building bloomfilter on measure column  (was: Fix bug when building bloomfilter on decimal column)

> Fix bug when building bloomfilter on measure column
> ---------------------------------------------------
>
>                 Key: CARBONDATA-2757
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2757
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: jiangmanhua
>            Assignee: jiangmanhua
>            Priority: Major
>          Time Spent: 1h 10m
>  Remaining Estimate: 0h
[GitHub] carbondata issue #2532: [CARBONDATA-2759]Add Bad_Records_Options to STMPROPE...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2532 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7346/ ---
[GitHub] carbondata pull request #2517: [CARBONDATA-2749][dataload] In HDFS Empty tab...
Github user mohammadshahidkhan commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2517#discussion_r203957912

--- Diff: core/src/main/java/org/apache/carbondata/core/fileoperations/AtomicFileOperationsImpl.java ---
@@ -70,12 +78,20 @@ public AtomicFileOperationsImpl(String filePath, FileType fileType) {
     if (null != dataOutStream) {
       CarbonUtil.closeStream(dataOutStream);
       CarbonFile tempFile = FileFactory.getCarbonFile(tempWriteFilePath, fileType);
-      if (!tempFile.renameForce(filePath)) {
-        throw new IOException("temporary file renaming failed, src="
-            + tempFile.getPath() + ", dest=" + filePath);
+      if (!this.setFailed) {
+        if (!tempFile.renameForce(filePath)) {
+          throw new IOException(
+              "temporary file renaming failed, src=" + tempFile.getPath() + ", dest=" + filePath);
+        }
       }
+    } else {
+      LOGGER.warn("The temporary file renaming skipped due to I/O error, deleting file "
+          + tempWriteFilePath);
     }
   }

+  @Override public void setFailed() {
--- End diff --

Fixed ---
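The pattern in this diff, writing to a temporary file and renaming it over the target only when no write error was recorded, can be sketched with `java.nio`. The class and method names below are hypothetical, not Carbon's `AtomicFileOperations` API; `Files.move` with `REPLACE_EXISTING` plays the role of Carbon's `renameForce`.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class AtomicWriteSketch {
    private final Path target;
    private final Path temp;
    private boolean failed = false;

    AtomicWriteSketch(Path target) {
        this.target = target;
        this.temp = target.resolveSibling(target.getFileName() + ".tmp");
    }

    // Write goes to the temp file; a failure sets the flag instead of
    // leaving a half-written file at the target path (mirrors setFailed()).
    void write(String content) {
        try {
            Files.write(temp, content.getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            failed = true;
        }
    }

    // Rename the temp file over the target only when no write failed;
    // otherwise drop the partial temp file, keeping the old target intact.
    void close() throws IOException {
        if (!failed) {
            Files.move(temp, target, StandardCopyOption.REPLACE_EXISTING);
        } else {
            Files.deleteIfExists(temp);
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("atomic");
        AtomicWriteSketch w = new AtomicWriteSketch(dir.resolve("tablestatus"));
        w.write("SUCCESS");
        w.close();
        System.out.println(new String(Files.readAllBytes(dir.resolve("tablestatus")),
            StandardCharsets.UTF_8));
    }
}
```

The point of the fix in CARBONDATA-2749 is exactly this: a failed write must not rename an empty or partial temp file over a good tablestatus file.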
[jira] [Commented] (CARBONDATA-2762) Long string column displayed as string in describe formatted
[ https://issues.apache.org/jira/browse/CARBONDATA-2762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16550318#comment-16550318 ]

xuchuanyin commented on CARBONDATA-2762:
----------------------------------------

Hi, we will display the long_string_columns property in desc

> Long string column displayed as string in describe formatted
> ------------------------------------------------------------
>
>                 Key: CARBONDATA-2762
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2762
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query
>    Affects Versions: 1.4.1
>            Reporter: Chetan Bhat
>            Priority: Minor
>
> Steps :
> User creates a table with a long string column and executes the describe formatted table command.
> 0: jdbc:hive2://10.18.98.101:22550/default> create table t2(c1 string, c2 string) stored by 'carbondata' tblproperties('long_string_columns' = 'c2');
> No rows selected (3.034 seconds)
> 0: jdbc:hive2://10.18.98.101:22550/default> desc formatted t2;
> Actual Output : The describe formatted displays the c2 column as string instead of long string.
> +-------------------------------+---------------------------------------------------------------+-------------------+
> | col_name                      | data_type                                                     | comment           |
> +-------------------------------+---------------------------------------------------------------+-------------------+
> | c1                            | string                                                        | KEY COLUMN,null   |
> *| c2                           | string                                                        | KEY COLUMN,null  |*
> |                               |                                                               |                   |
> | ##Detailed Table Information  |                                                               |                   |
> | Database Name                 | default                                                       |                   |
> | Table Name                    | t2                                                            |                   |
> | CARBON Store Path             | hdfs://hacluster/user/hive/warehouse/carbon.store/default/t2  |                   |
> | Comment                       |                                                               |                   |
> | Table Block Size              | 1024 MB                                                       |                   |
> | Table Data Size               | 0                                                             |                   |
> | Table Index Size              | 0                                                             |                   |
> | Last Update Time              | 0                                                             |                   |
> | SORT_SCOPE                    | LOCAL_SORT                                                    | LOCAL_SORT        |
> | CACHE_LEVEL                   | BLOCK                                                         |                   |
> | Streaming                     | false                                                         |                   |
> | Local Dictionary Enabled      | true                                                          |                   |
> | Local Dictionary Threshold    | 1                                                             |                   |
> | Local Dictionary Include      | c1,c2                                                         |                   |
> |                               |                                                               |                   |
> | ##Detailed Column property    |                                                               |                   |
> | ADAPTIVE                      |                                                               |                   |
> | SORT_COLUMNS                  | c1                                                            |                   |
> +-------------------------------+---------------------------------------------------------------+-------------------+
> 22 rows selected (2.847 seconds)
>
> Expected Output : The describe formatted should display the c2 column as long string.
[GitHub] carbondata pull request #2522: [CARBONDATA-2752][CARBONSTORE] Carbon provide...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2522#discussion_r203954234

--- Diff: zeppelin/src/main/java/org/apache/carbondata/zeppelin/CarbonInterpreter.java ---
@@ -0,0 +1,184 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.zeppelin;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Arrays;
+import java.util.Objects;
+import java.util.Optional;
+import java.util.Properties;
+import java.util.function.BiFunction;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+import org.apache.carbondata.zeppelin.response.CarbonResponse;
+
+import org.apache.commons.lang.StringUtils;
+import org.apache.http.HttpResponse;
+import org.apache.http.client.HttpClient;
+import org.apache.http.client.methods.HttpPost;
+import org.apache.http.entity.StringEntity;
+import org.apache.http.impl.client.HttpClientBuilder;
+import org.apache.zeppelin.interpreter.Interpreter;
+import org.apache.zeppelin.interpreter.InterpreterContext;
+import org.apache.zeppelin.interpreter.InterpreterException;
+import org.apache.zeppelin.interpreter.InterpreterResult;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Carbon based interpreter for zeppelin
+ */
+public class CarbonInterpreter extends Interpreter {
+
+  public static final Logger logger = LoggerFactory.getLogger(CarbonInterpreter.class);
--- End diff --

Please use Carbon's LogService ---
[GitHub] carbondata pull request #2519: [CARBONDATA-2747][Lucene] Fix Lucene datamap ...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2519 ---
[GitHub] carbondata pull request #2522: [CARBONDATA-2752][CARBONSTORE] Carbon provide...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2522#discussion_r203954062 --- Diff: zeppelin/README.txt --- @@ -0,0 +1,18 @@ +Please follow below steps to integrate with zeppelin --- End diff -- Is this README written for carbon specificly? ---
[GitHub] carbondata pull request #2522: [CARBONDATA-2752][CARBONSTORE] Carbon provide...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2522#discussion_r203954155 --- Diff: zeppelin/assembly/assembly.xml --- @@ -0,0 +1,37 @@ + --- End diff -- please move zeppelin folder to integration folder ---
[GitHub] carbondata issue #2520: [CARBONDATA-2750] Added Documentation for Local Dict...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2520 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6109/ ---
[GitHub] carbondata issue #2503: [CARBONDATA-2734] Update is not working on the table...
Github user manishgupta88 commented on the issue: https://github.com/apache/carbondata/pull/2503

@ravipesala ...I got your point that we should always take the path relative to tablePath. But the problem with using CarbonTablePath.getShortBlockIdForPartitionTable is that even for a normal carbon table, getting the tupleId through a select query produces some 15-20 extra characters, and if we want to store this tuple ID then the space needed will increase. So we need to find some way to shorten this ID. For a normal carbon table, the tupleId value using method **getShortBlockIdForPartitionTable** --> Fact#Part0#Segment_0/0/0-0_batchno0-0-0-1532066017077/0/0/0 versus **getShortBlockId** after correcting **CarbonUtil.getBlockId** --> 0/0/0-0_batchno0-0-0-1532069720380/0/0/0 ---
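The two tupleId formats compared above differ only in the fixed "Fact#Part0#Segment_" prefix that a standard table always carries, which accounts for the 15-20 extra characters mentioned. A hypothetical sketch of that kind of shortening (illustrative only, not the actual CarbonUtil.getBlockId change):

```java
public class ShortBlockIdSketch {

    // Illustrative only: drop the fixed "Fact#Part0#Segment_" prefix that a
    // standard (non-partition) table's tuple ID always carries, so the stored
    // ID is shorter while remaining reconstructible for standard tables.
    static String shorten(String tupleId) {
        final String prefix = "Fact#Part0#Segment_";
        return tupleId.startsWith(prefix) ? tupleId.substring(prefix.length()) : tupleId;
    }

    public static void main(String[] args) {
        System.out.println(shorten("Fact#Part0#Segment_0/0/0-0_batchno0-0-0-1532066017077/0/0/0"));
        // prints 0/0/0-0_batchno0-0-0-1532066017077/0/0/0
    }
}
```

The restriction the thread circles around is that this only works when the prefix really is constant, i.e. for standard tables; flat-folder, Hive-partition, and non-transactional (SDK) tables lay files out differently, which is what the isStandardCarbonTable check distinguishes.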
[GitHub] carbondata issue #2532: [CARBONDATA-2759]Add Bad_Records_Options to STMPROPE...
Github user QiangCai commented on the issue: https://github.com/apache/carbondata/pull/2532 LGTM ---
[GitHub] carbondata issue #2520: [CARBONDATA-2750] Added Documentation for Local Dict...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2520 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7345/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2531 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6108/ ---
[GitHub] carbondata issue #2531: [HOTFIX] Improved BlockDataMap caching performance d...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2531 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5931/ ---