[CARBONDATA-2704] Index file size in describe formatted command is not updated
correctly with the segment file
Problem:
The describe formatted command does not show the correct index files size after
the index files are merged.
Solution:
Segment file should be updated with the actual index files size of that
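The intended fix can be sketched as summing the on-disk sizes of the segment's
index files so the segment file records the real post-merge size. This is a
minimal illustrative sketch; the file suffixes and directory layout are
assumptions, not CarbonData's actual API.

```python
import os

def actual_index_size(segment_dir):
    """Sum the on-disk size of every index file in a segment so the
    segment file can record the real size after an index merge.
    The file suffixes here are assumptions for illustration."""
    total = 0
    for name in os.listdir(segment_dir):
        if name.endswith((".carbonindex", ".carbonindexmerge")):
            total += os.path.getsize(os.path.join(segment_dir, name))
    return total
```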
[CARBONDATA-2719] Block update and delete on table having datamaps
Update/delete needs to be blocked on tables that have datamaps.
This closes #2483
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/56e7dad7
[CARBONDATA-2729][file-format] Schema Compatibility problem between version
1.3.0 and 1.4.0
Problem:
In TableSchema the field name schemaEvaluation was changed to schemaEvoluation,
and in DataMapSchema the field name className was changed to providerName.
Due to this, the current Carbon version & Version
[CARBONDATA-2690][CarbonStore] implement RESTful API: create table, load data
and select
This PR adds:
1. basic framework
rewrites the carbon store's Master, Worker and Scheduler code in Java
2. RESTful API
supports creating a table by using a file meta store
supports loading data to a table in a single worker
[CARBONDATA-2734] Fix struct of date issue in create table
problem: Struct of date is currently not supported in the create table flow,
as the date datatype check is missing during parsing.
Hence the child date column was not appended with the parent name, leading to a
StringIndexOutOfBoundsException.
solution: Handle
[CARBONDATA-2648] Fixed NPE issue with legacy store when CACHE_LEVEL is Blocklet
Things done as part of this PR:
Fixed a NullPointerException when the store is of version <= 1.1 and the
DataMap is of type BlockletDataMap.
Added clearing of SegmentProperties cache holder from executor
Problem 1:
Null
[CARBONDATA-2724][DataMap]Unsupported create datamap on table with V1 or V2
format data
Block creating datamaps on carbon tables with V1 or V2 format data.
Currently the version info is read from the carbon data file.
This closes #2488
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
[CARBONDATA-2714] Support merge index files for the segment
Problem :
The first-time query on carbon is very slow because many small carbonindex
files are read and cached in the driver on the first access.
Many carbonindex files are created in the below case:
Loading data in a large cluster
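The idea behind the merge can be illustrated with a toy routine that
concatenates many small index files into one and keeps an offset map, so a
reader opens a single file instead of many. This is purely illustrative; the
layout and naming are not CarbonData's actual merge-index format.

```python
import os

def merge_index_files(index_dir, merged_name="merge.carbonindexmerge"):
    """Concatenate small .carbonindex files into one merged file and
    return a map of name -> (offset, length) inside it, so individual
    indexes can still be located. Toy format for illustration only."""
    entries = {}
    offset = 0
    merged_path = os.path.join(index_dir, merged_name)
    with open(merged_path, "wb") as out:
        for name in sorted(os.listdir(index_dir)):
            if not name.endswith(".carbonindex"):
                continue
            with open(os.path.join(index_dir, name), "rb") as f:
                data = f.read()
            out.write(data)
            entries[name] = (offset, len(data))
            offset += len(data)
    return entries
```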
[REBASE] Rebasing with master branch and Fixing rebase conflict
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/239a6cad
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/239a6cad
Diff:
[CARBONDATA-2528][MV] Fixed order by in mv and aggregation functions inside
projection expressions are fixed
Problem:
Order by queries and queries with functions like sum(a)+sum(b) do not work
in MV.
Please check jira for more details.
Solution:
The queries which have projection
[HOTFIX][CARBONDATA-2716][DataMap] fix bug for loading datamap
In some scenarios, the input parameter carbonTable of
getCarbonFactDataHandlerModel may differ from the one in the load model.
This closes #2497
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2708][BloomDataMap] clear index file in case of data load failure
When data loading fails, clean up the index DataMap files that were generated.
This closes #2463
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2698][CARBONDATA-2700][CARBONDATA-2732][BloomDataMap] block some
operations of bloomfilter datamap
1. Block creating a bloomfilter datamap index on columns whose datatype is a
complex type;
2. Block changing the datatype for a bloomfilter index datamap;
3. Block dropping index columns for
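The first of these checks can be sketched as a simple validation over the
index columns' datatypes. A minimal sketch, assuming a plain mapping of
column name to datatype string; names here are illustrative, not CarbonData's
actual API.

```python
COMPLEX_TYPES = {"array", "struct", "map"}

def check_bloom_index_columns(index_cols, column_types):
    """Reject creating a bloomfilter datamap on a complex-type column.
    column_types maps column name -> datatype string (illustrative)."""
    for col in index_cols:
        if column_types.get(col, "").lower() in COMPLEX_TYPES:
            raise ValueError(
                "BloomFilter datamap on complex column '%s' is not supported"
                % col)
```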
[HotFix] Getting carbon table identifier to datamap events
Passing the table identifier to keep track of the table in the event, in case
of preload and postload datamap events.
This closes #2448
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2720] Remove dead code
For accurate coverage results and easy maintenance
This closes #2354
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/f9114036
Tree:
[CARBONDATA-2717] fixed table id empty problem while taking drop lock
This closes #2472
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/637a9746
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/637a9746
[CARBONDATA-2655][BloomDataMap] BloomFilter datamap support in operator
Now queries with an IN expression on a bloom index column can leverage the
BloomFilter datamap.
This closes #2445
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2712] Added fix for Local Dictionary Exclude for multi level
complex columns
What was the problem?
When Local Dictionary Exclude was defined for multi level complex columns, the
columns were still considered for Local Dictionary Include
What has been changed?
The index value was
[CARBONDATA-2723][DataMap] Fix bugs in recreate datamap on table
When we drop a datamap/table, the executor-side cache for the datamap becomes
stale. So if we recreate the datamap with different index columns, the cache
should be cleaned when we do data loading; otherwise the
DataMapWriterListener
[CARBONDATA-2688][CarbonStore] Support SQL in REST API
Support SQL interface in Horizon service.
Support REST client for SQL
This closes #2481
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/d4a28a25
Tree:
[CARBONDATA-2609] Change RPC implementation to Hadoop RPC framework
This closes #2372
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/d9b40bf9
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/d9b40bf9
[CARBONDATA-2649] Fixed ArrayIndexOutOfBoundsException while loading Blocklet
DataMap after alter add column operation
Things done as part of this PR
Fixed ArrayIndexOutOfBoundsException while loading Blocklet DataMap after an
alter add column operation
Problem:
ArrayIndexOutOfBoundsException
[CARBONDATA-2482] Pass uuid while writing segment file if possible
Pass the uuid (segmentFileName) to the writeSegmentFile method if possible.
Problem:
When the supporting tables depend on the segmentFileName of the main table,
the query fails, as it is expected to have the same name as
[CARBONDATA-2606][Complex DataType Enhancements]Fix Null result if First two
Projection column have same parent and third column has different Parent Struct
Problem:
When multiple columns are there,then the first child elements is only going
to make parent Object Array. For all other cases it
Repository: carbondata
Updated Branches:
refs/heads/carbonstore 4b96ed8ca -> 239a6cadb (forced update)
[CARBONDATA-2613] Support csv based carbon table
1. create csv based carbon table using
CREATE TABLE fact_table (col1 bigint, col2 string, ..., col100 string)
STORED BY 'CarbonData'
TBLPROPERTIES(
'format'='csv',
'csv.delimiter'=',',
'csv.header'='col1,col2,col100')
2. Load data to this
[CARBONDATA-2705][CarbonStore] CarbonStore Java API and Implementation
Support two implementations:
1.LocalCarbonStore for usage in local mode
2.DistributedCarbonStore leveraging multiple server (Master and Workers) via RPC
This closes #2473
Project:
[HOTFIX] Removed BatchedDataSourceScanExec class and extended directly from
FileSourceScanExec
Problem:
Since some of the code in BatchedDataSourceScanExec is copied from Spark, it is
difficult to maintain across Spark version upgrades. Currently we face issues
during the Spark 2.3 upgrade, so
[CARBONDATA-2637][DataMap] Fix bugs in rebuild datamap
In cluster mode, readCommitScope is null while rebuilding the datamap for
segments, which causes an NPE. Here we use the original segment object,
whose readCommitScope is not null and will work fine.
This closes #2493
Project:
[CARBONDATA-2727][BloomDataMap] Support create bloom datamap on newly added
column
Add a result collector with rowId information for datamap rebuild if the table
schema is changed;
Use the keygenerator to retrieve the surrogate value of dictIndexColumn from
the query result;
This closes #2490
Project:
Repository: carbondata
Updated Branches:
refs/heads/carbonstore 96fe233a2 -> 4b96ed8ca
[CARBONDATA-2736][CARBONSTORE] Kafka integration with Carbon StreamSQL
Modification in this PR:
1.Pass source table properties to streamReader.load()
2.Do not pass schema when sparkSession.readStream
Repository: carbondata
Updated Branches:
refs/heads/carbonstore 0aab4e7c6 -> 96fe233a2
Revert "[CARBONDATA-2532][Integration] Carbon to support spark 2.3 version,
ColumnVector Interface"
This reverts commit 2b8ae2628d50efcd095696b5bf614eab2fcdb8d2.
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 75a602d01 -> cdee81d4d
[CARBONDATA-2693][BloomDataMap] Fix bug where alter rename renames the
existing table on which a bloomfilter datamap exists
Fix bug where alter rename renames the existing table on which a bloom filter
datamap
Repository: carbondata
Updated Branches:
refs/heads/master 3df2fd030 -> bc12de004
[CARBONDATA-2729][file-format] Schema Compatibility problem between version
1.3.0 and 1.4.0
Problem:
In TableSchema the field name schemaEvaluation was changed to schemaEvoluation,
and in DataMapSchema the field
Repository: carbondata
Updated Branches:
refs/heads/master 653efee02 -> 3df2fd030
[HOTFIX] Removed BatchedDataSourceScanExec class and extended directly from
FileSourceScanExec
Problem:
Since some of the code in BatchedDataSourceScanExec is copied from Spark, it is
difficult to maintain
Revert "[CARBONDATA-2532][Integration] Carbon to support spark 2.3 version,
compatability issues"
This reverts commit d0fa52396687ccc1a5d029006e7204771c04a9eb.
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/carbonstore 7306b59dd -> 0aab4e7c6
Repository: carbondata
Updated Branches:
refs/heads/master 18381e3db -> 1fd370399
[CARBONDATA-2708][BloomDataMap] clear index file in case of data load failure
When data loading fails, clean up the index DataMap files that were generated.
This closes #2463
Project:
Repository: carbondata
Updated Branches:
refs/heads/master 641ec098f -> 57b457153
[CARBONDATA-2716][DataMap] Add validation for datamap writer listener during
data loading
In some scenarios, while doing data loading, the loading will use the datamap
writer listener that does not belong to
Repository: carbondata
Updated Branches:
refs/heads/master 438b4421e -> 641ec098f
[CARBONDATA-2702][BloomDatamap] Fix bugs in clear bloom datamap concurrently
Add synchronization for clearing the bloom datamap; this implementation
refers to BlockletDataMapFactory in PR #2324 for CARBONDATA-2496
Repository: carbondata
Updated Branches:
refs/heads/master d8562e5bd -> a5039baa3
[CARBONDATA-2715][LuceneDataMap] Fix bug in search mode with lucene datamap in
windows
While comparing two paths, the file separator is different on Windows,
thus causing empty pruned blocklets. This PR will
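The separator-agnostic comparison the fix needs can be sketched as follows.
This is a minimal sketch of the idea, not the actual patch: it normalizes
Windows back-slashes to forward slashes before comparing.

```python
def same_path(a, b):
    """Compare two file paths regardless of platform separator by
    normalizing back-slashes to forward slashes and trimming any
    trailing separator (illustrative sketch of the fix's idea)."""
    def norm(p):
        return p.replace("\\", "/").rstrip("/")
    return norm(a) == norm(b)
```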
Repository: carbondata
Updated Branches:
refs/heads/master 1dc86d291 -> 0112ed096
[CARBONDATA-2685][DataMap] Parallelize datamap rebuild processing for segments
Currently in carbondata, while rebuilding datamap, one spark job will be
started for each segment and all the jobs are executed
Repository: carbondata
Updated Branches:
refs/heads/master c2c5b18ba -> c3bc1ba10
[CARBONDATA-2658][DataLoad] Fix bugs in spilling in-memory pages
The parameter carbon.load.sortMemory.spill.percentage is configured in the
range 0-100; according to the configuration, in-memory pages are merged and
spilled
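The range check (also the subject of CARBONDATA-2644's invalid-value
validation) can be sketched like this. The fallback default is an assumption
for illustration; the real default may differ.

```python
DEFAULT_SPILL_PERCENTAGE = 0  # assumed fallback; real default may differ

def parse_spill_percentage(value):
    """Validate carbon.load.sortMemory.spill.percentage: accept only an
    integer in [0, 100]; otherwise fall back to the default (sketch)."""
    try:
        pct = int(value)
    except (TypeError, ValueError):
        return DEFAULT_SPILL_PERCENTAGE
    return pct if 0 <= pct <= 100 else DEFAULT_SPILL_PERCENTAGE
```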
Repository: carbondata
Updated Branches:
refs/heads/master f0c77e5f5 -> c2c5b18ba
[CARBONDATA-2681][32K] Fix problem where loading using global/batch sort fails
when the table has long string columns
In SortStepRowHandler, global/batch sort uses convertRawRowTo3Parts instead of
Repository: carbondata
Updated Branches:
refs/heads/master cb10d03a7 -> 21b56dfd8
[CARBONDATA-2686] Implement Left join on MV datamap
This closes #2444
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/21b56dfd
Repository: carbondata
Updated Branches:
refs/heads/master 3e88858a2 -> a7c4b4878
[CARBONDATA-2657][BloomDataMap] Fix bugs in loading and querying on bloom
column with empty values
Fix bugs in loading and querying on bloom column …
Fix bugs in loading and querying with empty values on
Repository: carbondata
Updated Branches:
refs/heads/carbonstore fa111380f -> 6fa86381f
[CARBONDATA-2690][CarbonStore] implement RESTful API: create table, load data
and select
This PR adds:
1. basic framework
rewrites the carbon store's Master, Worker and Scheduler code in Java
2. RESTful API
supports creating a table by using a file meta store
supports loading data to a table in a single worker
Repository: carbondata
Updated Branches:
refs/heads/master 5159abfc3 -> 3e88858a2
[CARBONDATA-2683][32K] fix data convertion problem for Varchar
Spark uses org.apache.spark.unsafe.types.UTF8String for the string datatype
internally.
In carbon, the varchar datatype should do the same conversion as
Repository: carbondata
Updated Branches:
refs/heads/master c0de9f160 -> 020335a8c
[CARBONDATA-2687][BloomDataMap][Doc] Update document for bloomfilter datamap
In a previous PR, the cache behaviour for the bloomfilter datamap was changed
from guava-cache to carbon-cache. This PR updates
Repository: carbondata
Updated Branches:
refs/heads/master aeb2ec4cd -> c0de9f160
[CARBONDATA-2654][Datamap] Optimize output for explaining querying with datamap
Currently if we have multiple datamaps and the query hits all the
datamaps, carbondata explain command will only show the first
Repository: carbondata
Updated Branches:
refs/heads/master 133ec17e5 -> cd7c2102c
[CARBONDATA-2633][BloomDataMap] Fix bugs in bloomfilter for
dictionary/sort/date/TimeStamp column
for dictionary columns, carbon converts the literal value to a dict value, then
converts the dict value to an mdk value; at last it stores the mdk value as the
internal value in the carbonfile.
for other columns, carbon
Repository: carbondata
Updated Branches:
refs/heads/master dac5d3ce3 -> 12c28c946
[CARBONDATA-2674][Streaming]Streaming with merge index enabled does not
consider the merge index file while pruning
This closes #2429
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 8cb37dd75 -> dac5d3ce3
[CARBONDATA-2653][BloomDataMap] Fix bugs in incorrect blocklet number in
bloomfilter
In the non-deferred rebuild scenario, the last bloomfilter index file has
already been written in onBlockletEnd, so there is no need to write
Repository: carbondata
Updated Branches:
refs/heads/master 459331c32 -> 8cb37dd75
[CARBONDATA-2644][DataLoad]ADD carbon.load.sortMemory.spill.percentage
parameter invalid value check
This closes #2397
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master e30a84cc5 -> 459331c32
[CARBONDATA-2629] Support SDK carbon reader read data from HDFS and S3 with
filter function
Now the SDK carbon reader only supports reading data from local storage with
the filter function; it will throw an exception when reading data
Repository: carbondata
Updated Branches:
refs/heads/master ca201604a -> e30a84cc5
[CARBONDATA-2545] Fix some spell error in CarbonData
This closes #2419
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/e30a84cc
Repository: carbondata
Updated Branches:
refs/heads/master 00d6ce49d -> ca201604a
[CARBONDATA-2646][DataLoad] Change the log level while loading data into a
table with the 'sort_column_bounds' property: the 'ERROR' flag is changed to
'WARN' for some expected tasks.
change the log level while loading
Repository: carbondata
Updated Branches:
refs/heads/master 589fe1883 -> 00d6ce49d
[CARBONDATA-2635][BloomDataMap] Support different index datamaps on same column
Users can create different provider-based index datamaps on one column;
for example, a user can create a bloomfilter datamap and a lucene
Repository: carbondata
Updated Branches:
refs/heads/master f7552a97d -> 64ae5ae0b
[CARBONDATA-2634][BloomDataMap] Add datamap properties in show datamap outputs
add datamap properties in show datamap outputs
This closes #2404
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Repository: carbondata
Updated Branches:
refs/heads/master a0350e100 -> 047c502b2
[CARBONDATA-2549] Bloom remove guava cache and use CarbonCache
Currently, the bloom cache is implemented using guava cache; carbon has its
own LRU cache interfaces and a complete system that controls the cache
Repository: carbondata
Updated Branches:
refs/heads/carbonstore b3f782062 -> d5e86db52
[CARBONDATA-2626] Fix bugs in sequence of column page
In tablespec, the first dimension spec is 'name', which is a sort_column but
not a dictionary column. But Carbondata now treats it as a dictionary page. (See
[CARBONDATA-2624] Added validations for complex dataType columns in create
table command for Local Dictionary Support
1. If duplicate columns exist, the column names were not displayed in the error
message
2. Considered checking for duplicates if an extra space was the difference
between column
Repository: carbondata
Updated Branches:
refs/heads/master b3f782062 -> 18dc3ff49
[CARBONDATA-2626] Fix bugs in sequence of column page
In tablespec, the first dimension spec is 'name', which is a sort_column but
not a dictionary column. But Carbondata now treats it as a dictionary page.
(See TablePage
[CARBONDATA-2509][CARBONDATA-2510][CARBONDATA-2511][32K] Add validate for long
string columns
Add validate for long string columns
1. long string columns cannot be sort_columns
2. long string columns cannot be dictionary include
3. long string columns cannot be dictionary exclude
4. long string
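The checks above can be sketched as one validation over the table properties.
A minimal sketch; the parameter names are illustrative, not CarbonData's
actual parser API.

```python
def validate_long_string_columns(long_string_cols, sort_cols,
                                 dict_include, dict_exclude):
    """Reject table properties that place a long string column in
    sort_columns or in dictionary include/exclude (illustrative)."""
    for col in long_string_cols:
        if col in sort_cols:
            raise ValueError(
                "long string column '%s' cannot be a sort_column" % col)
        if col in dict_include:
            raise ValueError(
                "long string column '%s' cannot be dictionary include" % col)
        if col in dict_exclude:
            raise ValueError(
                "long string column '%s' cannot be dictionary exclude" % col)
```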
Support adding local dictionary configuration in create table statement and
show the configs in describe formatted table
What changes were proposed in this pull request?
In this PR, in order to support local dictionary,
create table changes are made to support local dictionary configurations as
[CARBONDATA-2604] Getting ArrayIndexOutOfBoundException during compaction after
IUD in cluster is fixed
Issue: if some records are deleted, then while filling the measure and
dimension data, the valid row count and the actual row count may differ.
And during filling
[CARBONDATA-2615][32K] Support page size less than 32000 in CarbondataV3
Since we support super long strings, a sufficiently long string can make a
column page with 32000 rows exceed 2GB, so we support pages with fewer than
32000 rows.
This closes #2383
Project:
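The sizing argument above can be sketched as simple arithmetic: shrink the row
count whenever the default 32000 rows would push the page past the 2 GB limit.
The constants and function name are illustrative, not CarbonData's actual code.

```python
MAX_PAGE_BYTES = 2**31 - 1   # a column page must stay below 2 GB
DEFAULT_PAGE_ROWS = 32000

def rows_per_page(avg_value_bytes):
    """Pick a page row count that keeps the page under 2 GB when
    values (e.g. very long strings) are large (illustrative sketch)."""
    fit = MAX_PAGE_BYTES // max(avg_value_bytes, 1)
    return max(1, min(DEFAULT_PAGE_ROWS, fit))
```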
[CARBONDATA-2428] Support flat folder for managed carbon table
Currently carbondata writing happens in the fixed path
tablepath/Fact/Part0/Segment_NUM folder, which is not the same as the
hive/parquet folder structure. This PR makes all files be written inside
tablepath; it does not maintain any
[CARBONDATA-2577] [CARBONDATA-2579] Fixed issue in Avro logical type for nested
Array and document update
Problem: Nested Array logical types of date, timestamp-millis, timestamp-micros
are not working.
Root cause: during the preparation of the carbon schema from the avro schema,
for array nested type
[CARBONDATA-2553] support ZSTD compression for sort temp file
This closes #2350
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ece06729
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/ece06729
Diff:
[CARBONDATA-2513][32K] Support write long string from dataframe
support write long string from dataframe
This closes #2382
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/55f4bc6c
Tree:
[CARBONDATA-2573] integrate carbonstore mv branch
Fixes bugs related to MV and added tests
This closes #2335
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/0ef7e55c
Tree:
[CARBONDATA-2623][DataMap] Add DataMap Pre and Post event listeners
Added Pre and Post Execution Events for index datamaps
This closes #2389
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/b3f78206
Tree: