Repository: carbondata
Updated Branches:
refs/heads/datamap 69eb26e68 -> 9390abf6a
[CARBONDATA-2172][Lucene] Add text_columns property for Lucene DataMap
This closes #2019
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Author: ravipesala
Date: Sun Mar 4 16:51:34 2018
New Revision: 25397
Log:
Upload 1.3.1-rc1
Added:
dev/carbondata/1.3.1-rc1/
dev/carbondata/1.3.1-rc1/apache-carbondata-1.3.1-source-release.zip (with props)
dev/carbondata/1.3.1-rc1/apache-carbondata-1.3.1-source-release.zip.asc
[REBASE] resolve conflict after rebasing to master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/6216294c
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/6216294c
Diff:
Support generating assembling JAR for store-sdk module
Support generating assembling JAR for store-sdk module and remove junit
dependency
This closes #1976
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/503e0d96
[CARBONDATA-2091][DataLoad] Support specifying sort column bounds in data
loading
Enhance data loading performance by specifying sort column bounds
1. Add a row range number during the convert-process step
2. Dispatch rows to each sorter by range number
3. Sort/Write process step can be done
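The range-dispatch step above can be sketched as follows. This is a minimal illustration only, not CarbonData's actual implementation: the class and method names are hypothetical, and it assumes the user-specified bounds split the sort-column value space into contiguous ranges, each served by its own sorter.

```java
import java.util.Arrays;

// Hypothetical sketch: dispatch each row to a sorter by comparing its
// sort-column value against user-specified bounds. With N bounds there
// are N+1 ranges, each handled by its own sorter, so the final output
// only needs to concatenate the independently sorted ranges.
public class RangeDispatcher {
    private final long[] bounds; // sorted ascending, e.g. {100, 200, 300}

    public RangeDispatcher(long[] bounds) {
        this.bounds = bounds.clone();
        Arrays.sort(this.bounds);
    }

    // Returns the range (sorter) index for a sort-column value.
    public int rangeOf(long sortColumnValue) {
        int pos = Arrays.binarySearch(bounds, sortColumnValue);
        // binarySearch returns -(insertionPoint) - 1 when not found
        return pos >= 0 ? pos + 1 : -(pos + 1);
    }

    public static void main(String[] args) {
        RangeDispatcher d = new RangeDispatcher(new long[]{100, 200, 300});
        System.out.println(d.rangeOf(50));   // range 0: value < 100
        System.out.println(d.rangeOf(150));  // range 1: 100 <= value < 200
        System.out.println(d.rangeOf(999));  // range 3: value >= 300
    }
}
```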
[REBASE] Solve conflict after merging master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/8104735f
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/8104735f
Diff:
[CARBONDATA-1997] Add CarbonWriter SDK API
Added a new module called store-sdk, and a CarbonWriter API that can be
used to write CarbonData files to a specified folder without Spark and Hadoop
dependencies. Users can use this API in any environment.
This closes #1967
Project:
[CARBONDATA-1968] Add external table support
This PR adds support for creating an external table from existing CarbonData
files, using Hive syntax.
CREATE EXTERNAL TABLE tableName STORED BY 'carbondata' LOCATION 'path'
This closes #1749
Project:
[CARBONDATA-2023][DataLoad] Add size base block allocation in data loading
CarbonData assigns blocks to nodes at the beginning of data loading.
The previous block allocation strategy was based on block count, and it
suffers from skewed data if the sizes of the input files differ a lot.
We introduced a
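The idea behind size-based allocation can be sketched with a simple greedy assignment. This is a concept illustration only, with hypothetical names, not CarbonData's actual allocator:

```java
import java.util.Arrays;

// Hypothetical sketch of size-based block allocation: instead of giving
// each node the same *number* of blocks, greedily assign each block
// (largest first) to the node with the smallest total assigned size,
// so nodes end up with balanced data volume even when file sizes skew.
public class SizeBasedAllocator {
    // Returns, for each node, the total bytes assigned to it.
    public static long[] allocate(long[] blockSizes, int numNodes) {
        long[] assigned = new long[numNodes];
        long[] sorted = blockSizes.clone();
        Arrays.sort(sorted);
        for (int i = sorted.length - 1; i >= 0; i--) {
            int min = 0;
            for (int n = 1; n < numNodes; n++) {
                if (assigned[n] < assigned[min]) min = n;
            }
            assigned[min] += sorted[i];
        }
        return assigned;
    }

    public static void main(String[] args) {
        // One huge file and several small ones: count-based allocation
        // would put the same number of blocks on each node; size-based
        // allocation balances the bytes instead.
        long[] result = allocate(new long[]{1000, 10, 10, 10, 10}, 2);
        System.out.println(Arrays.toString(result)); // [1000, 40]
    }
}
```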
[CARBONDATA-1992] Remove partitionId in CarbonTablePath
In CarbonTablePath there is a deprecated partition id which is always 0; it
should be removed to avoid confusion.
This closes #1765
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-1480]Min Max Index Example for DataMap
DataMap example: an implementation of a Min/Max index through the DataMap
interface, using the index while pruning.
This closes #1359
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Revert "[CARBONDATA-2023][DataLoad] Add size base block allocation in data
loading"
This reverts commit 6dd8b038fc898dbf48ad30adfc870c19eb38e3d0.
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/1d85e916
Tree:
[HotFix][CheckStyle] Fix import related checkstyle
This closes #1952
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/d88d5bb9
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/d88d5bb9
Diff:
[CARBONDATA-2025] Unify all path construction through CarbonTablePath static
method
Refactor CarbonTablePath:
1. Remove CarbonStorePath and use CarbonTablePath only.
2. Make CarbonTablePath a utility without object creation; avoiding object
creation before use makes the code cleaner
Revert "[CARBONDATA-2018][DataLoad] Optimization in reading/writing for sort
temp row"
This reverts commit de92ea9a123b17d903f2d1d4662299315c792954.
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/46031a32
Tree:
[CARBONDATA-1544][Datamap] Datamap FineGrain implementation
Implemented interfaces for the FG datamap and integrated them into the filter
scanner to use the pruned bitset from the FG datamap.
FG query flow is as follows.
1. The user can add an FG datamap to any table and implement its interfaces.
2. Any filter query
[REBASE] Solve conflict after rebasing master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/0bb4aed6
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/0bb4aed6
Diff:
[CARBONDATA-2018][DataLoad] Optimization in reading/writing for sort temp row
Pick up the no-sort fields in the row and pack them as a byte array, skipping
parsing them during merge sort to reduce CPU consumption
This closes #1792
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
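The optimization above can be sketched in miniature. This is a concept illustration with hypothetical names, not the actual sort-temp-row code: the sort key stays parsed while the no-sort columns are packed into opaque bytes that merge sort never touches.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of the idea: keep the sort columns parsed, pack all
// no-sort columns into an opaque byte array, and let merge sort compare
// only the parsed sort key. The packed bytes ride along untouched until
// the final write step, saving repeated parse/serialize work per merge.
public class SortTempRow {
    final long sortKey;      // parsed sort column
    final byte[] noSortPart; // packed, never parsed during merge

    SortTempRow(long sortKey, byte[] noSortPart) {
        this.sortKey = sortKey;
        this.noSortPart = noSortPart;
    }

    static byte[] pack(String... noSortFields) {
        // Field separator is an arbitrary choice for this sketch.
        return String.join("\u0001", noSortFields).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        List<SortTempRow> rows = new ArrayList<>(Arrays.asList(
            new SortTempRow(42, pack("b", "y")),
            new SortTempRow(7,  pack("a", "x"))));
        // Merge sort compares only the sort key; noSortPart stays opaque.
        rows.sort(Comparator.comparingLong(r -> r.sortKey));
        System.out.println(rows.get(0).sortKey); // 7
    }
}
```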
Repository: carbondata
Updated Branches:
refs/heads/carbonstore c738afbc2 -> 8104735fd (forced update)
[REBASE] Solve conflict after rebasing master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/bd40a0d7
Tree:
[CARBONDATA-1114][Tests] Fix bugs in tests in windows env
Fix bugs in tests that cause failures under the Windows environment
This closes #1994
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ce88eb6a
Tree:
[CARBONDATA-1827] S3 Carbon Implementation
1.Provide support for s3 in carbondata.
2.Added S3Example to create carbon table on s3.
3.Added S3CSVExample to load carbon table using csv from s3.
This closes #1805
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2186] Add InterfaceAudience.Internal to annotate internal interface
This closes #1986
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/623a1f93
Tree:
[CARBONDATA-2080] [S3-Implementation] Propagated hadoopConf from driver to
executor for s3 implementation in cluster mode.
Problem: hadoopConf was not getting propagated from the driver to the
executor, which is why data loading failed in a distributed environment.
Solution: Setting the Hadoop conf in
[CARBONDATA-2159] Remove carbon-spark dependency in store-sdk module
To make an assembly JAR of the store-sdk module, it should not depend on the
carbon-spark module
This closes #1970
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2099] Refactor query scan process to improve readability
Unified concepts in scan process flow:
1. QueryModel contains all parameters for scan; it is created by API in
CarbonTable. (In future, CarbonTable will be the entry point for various table
operations)
2.Use term ColumnChunk to
[CARBONDATA-2213][DataMap] Fixed wrong version for module datamap-example
The version of module 'carbondata-datamap-example' should be 1.4.0-SNAPSHOT
instead of 1.3.0-SNAPSHOT, otherwise compilation will fail.
This closes #2011
Project:
[CARBONDATA-1543] Supported DataMap chooser and expression for supporting
multiple datamaps in single query
This PR supports 3 features.
1.Load datamaps from the DataMapSchema which are created through DDL.
2.DataMap Chooser: It chooses the datamap out of available datamaps based on
simple
[HOTFIX] Fix timestamp issue in TestSortColumnsWithUnsafe
Fix timestamp issue in TestSortColumnsWithUnsafe
This closes #2001
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/96ee82b3
Tree:
[CARBONDATA-2216][Test] Fix bugs in sdv tests
This closes #2012
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/62e33e5f
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/62e33e5f
Diff:
[REBASE] Fix style after rebasing master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/f9139930
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/f9139930
Diff:
[CARBONDATA-2206] Fixed lucene datamap evaluation issue in executor
In case of MatchExpression, it should return the same bitset from
RowLevelFilterExecuterImpl
This closes #2010
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2189] Add DataMapProvider developer interface
Add developer interfaces for 2 types of DataMap:
1. IndexDataMap: a DataMap that leverages an index to accelerate filter queries
2. MVDataMap: a DataMap that leverages a Materialized View to accelerate
OLAP-style queries, like SPJG queries (select,
Repository: carbondata
Updated Branches:
refs/heads/datamap-rebase1 [created] f9139930e
[CARBONDATA-2206] support lucene index datamap
This PR is an initial effort to integrate lucene as an index datamap into
carbondata.
A new module called carbondata-lucene is added to support lucene datamap:
1. Add LuceneFineGrainDataMap, implementing the FineGrainDataMap interface.
2.Add
Repository: carbondata
Updated Branches:
refs/heads/branch-1.3 744032d3c -> ce9695633
[maven-release-plugin] prepare for next development iteration
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ce969563
Tree:
Repository: carbondata
Updated Tags: refs/tags/apache-carbondata-1.3.1-rc1 [created] a1f6cc4c5
Repository: carbondata
Updated Branches:
refs/heads/branch-1.3 362513a68 -> 744032d3c
[maven-release-plugin] prepare release apache-carbondata-1.3.1-rc1
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/744032d3
[REBASE] resolve conflict after rebasing to master
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/880bbceb
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/880bbceb
Diff:
[CARBONDATA-2156] Add interface annotation
InterfaceAudience and InterfaceStability annotations should be added for users
and developers
1. InterfaceAudience can be User and Developer
2. InterfaceStability can be Stable, Evolving, Unstable
This closes #1968
Project:
[CARBONDATA-2186] Add InterfaceAudience.Internal to annotate internal interface
This closes #1986
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/8996cd4a
Tree: