[CARBONDATA-2610] Fix for datamap creation failed on table having loaded data
with null value on string datatype
Problem: Datamap creation fails when null values are already loaded in a string
datatype column of the table.
Solution: Check for null before converting data to a string.
This closes #2376
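The fix described above amounts to guarding the conversion with a null check. A minimal standalone sketch of the pattern (the helper name is hypothetical, not the actual CarbonData code):

```java
public class NullSafeConvert {
    // Return null for null input instead of calling toString() on it,
    // which would throw a NullPointerException during datamap creation.
    static String toStringOrNull(Object value) {
        return value == null ? null : value.toString();
    }

    public static void main(String[] args) {
        System.out.println(toStringOrNull(42));   // "42"
        System.out.println(toStringOrNull(null)); // null
    }
}
```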
[HOTFIX] Added Performance Optimization for Presto by using MultiBlockSplit
Added Performance Optimization for Presto by using MultiBlockSplit
This closes #2265
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2504][STREAM] Support StreamSQL for streaming job
Currently, users need to write a Spark Streaming application to use the carbon streaming
ingest feature, which is not easy for some users. By providing StreamSQL,
users can manage the streaming job more easily.
This closes #2328
Project:
[CARBONDATA-2593] Add an option 'carbon.insert.storage.level' to support
configuring the storage level when insert into data with
'carbon.insert.persist.enable'='true'
When inserting data with 'carbon.insert.persist.enable'='true', the storage
level of the dataset is 'MEMORY_AND_DISK',
it should
[CARBONDATA-2420][32K] Support string longer than 32000 characters
Add a table creation property 'long_string_columns' to support string
columns that will contain more than 32000 characters.
Inside carbondata, it uses an integer instead of a short to store the length of
the bytes content.
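The reason for the switch is that a short length prefix caps a value at Short.MAX_VALUE (32767) bytes, while an int prefix lifts that limit. A standalone sketch of the two encodings (illustrative only, not the actual writer code):

```java
import java.nio.ByteBuffer;

public class LengthPrefix {
    // A 2-byte short length prefix cannot represent lengths past 32767.
    static boolean fitsInShort(int byteLength) {
        return byteLength <= Short.MAX_VALUE;
    }

    // Write content with a 4-byte int length prefix instead of a short.
    static byte[] writeWithIntLength(byte[] content) {
        ByteBuffer buf = ByteBuffer.allocate(4 + content.length);
        buf.putInt(content.length);
        buf.put(content);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] big = new byte[40000]; // longer than 32000 characters
        System.out.println(fitsInShort(big.length));      // false
        System.out.println(writeWithIntLength(big).length); // 40004
    }
}
```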
[CARBONDATA-2566] Optimize CarbonReaderExample
Optimize CarbonReaderExample
1. Add different data types, including date and timestamp
2. Update the doc
3. Invoke the
Schema schema = CarbonSchemaReader
.readSchemaInSchemaFile(dataFiles[0].getAbsolutePath())
.asOriginOrder();
This closes #2356
Repository: carbondata
Updated Branches:
refs/heads/carbonstore 638ed1fa7 -> b3f782062
[CARBONDATA-2521] Support create carbonReader without tableName
Add a new method for creating CarbonReader without tableName
1. Add new interface: public static CarbonReaderBuilder builder(String tablePath)
[HOTFIX][CARBONDATA-2591] Fix SDK CarbonReader filter issue
There are some issues in the SDK CarbonReader filter function; please check the jira.
This closes #2363
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[HOTFIX] fix java style errors
This closes #2371
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ff036459
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/ff036459
Diff:
[CARBONDATA-2575] Add document to explain DataMap Management
Add document to explain DataMap Management
This closes #2360
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/d401e060
Tree:
[CARBONDATA-2559] Task id set for each CarbonReader in ThreadLocal
1. Task id is set for each CarbonReader because each CarbonReader object should
have a separate ThreadLocal variable.
2. If sort_columns is not given to CarbonWriter, describe formatted showing
default sort_cols is fixed.
3. Issue:
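The per-reader task id in point 1 is the classic ThreadLocal pattern: each thread, and thus each reader bound to it, sees its own value. A minimal standalone sketch (class and method names are illustrative, not CarbonData's):

```java
public class TaskIdHolder {
    // Each thread gets an independent copy of this value.
    private static final ThreadLocal<Long> TASK_ID = new ThreadLocal<>();

    static void setTaskId(long id) { TASK_ID.set(id); }
    static Long getTaskId() { return TASK_ID.get(); }

    public static void main(String[] args) throws InterruptedException {
        setTaskId(1L);
        Thread other = new Thread(() -> {
            // Unset in this thread: the value does not leak across threads.
            System.out.println(getTaskId()); // null
        });
        other.start();
        other.join();
        System.out.println(getTaskId()); // 1
    }
}
```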
[CARBONDATA-2578] fixed memory leak inside CarbonReader and handled failure
case for creation of
multi reader for non-transactional table
Issue :
CarbonIterator inside CarbonRecordReader was keeping a reference to RowBatch, and
it was not being
closed inside CarbonRecordReader. sort_column with
[CARBONDATA-2614] Fix the error when using FG in search mode and the prune
result is none
The prune result is none and datamapWritePath cannot be set, which will not
generate bitSegGroup in
[CARBONDATA-2603] Fix: error handling during reader build failure
Problem:
When CarbonReaderBuilder.build() fails due to a problem like an invalid
projection that leads to query model creation failure, the blocklet datamap is not
cleared for that table. So,
the next reader instance uses old
[CARBONDATA-2529] Fixed S3 Issue for Hadoop 2.8.3
This PR fixes the issue while loading data with S3 as the backend
This closes #2340
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/4d22ddc9
Tree:
[CARBONDATA-2569] Change the strategy of Search mode throw exception and run
sparkSQL
Search mode throws an exception but the test case passes; please check the jira.
This closes #2357
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[Documentation] Editorial Review comment fixed
Editorial Review comment fixed
This closes #2320
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/5ad70095
Tree:
[CARBONDATA-2592][Integration] Getting NoSuchMethod error due to aws sdk
multiple version jar conflicts
## What changes were proposed in this pull request?
Currently in the Carbon Spark2 project, multiple dependencies for the aws-sdk jar
have been defined; this will create an issue when we build
[CARBONDATA-2355] Support run SQL on carbondata files directly
Support run SQL on carbondata files directly
This closes #2181
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/9469e6bd
Tree:
[CARBONDATA-2571] Calculating the carbonindex and carbondata file size of a
table is wrong
Problem:
While calculating the carbonindex file size, we were checking either the index file
or the merge file. But in PR#2333, the implementation was changed to fill both
the file name and the merge file name. So, we
[CARBONDATA-2508] Fix the exception that can't get executorService when start
search mode twice
This closes #2355
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/6aadfe70
Tree:
[HOTFIX] Upgrade dev version to 1.5.0-SNAPSHOT and fix some small issues
1. Upgrade dev version to 1.5.0-SNAPSHOT
2. Fix carbon-spark-sql issue
3. Remove hadoop 2.2 profile
This closes #2359
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2418] [Presto] [S3] Fixed Presto Can't Query CarbonData When
CarbonStore is at S3
This closes #2287
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/dc4f87ba
Tree:
[CARBONDATA-2554] Added support for logical type
Added support for date and timestamp logical types in AvroCarbonWriter.
This closes #2347
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/2f234869
Tree:
[CARBONDATA-1787] Updated data-management-on-carbondata.md for
GLOBAL_SORT_PARTITIONS
This closes #1668
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ca466d9f
Tree:
[CARBONDATA-2557] [CARBONDATA-2472] [CARBONDATA-2570] Improve Carbon Reader
performance on S3 and fixed datamap clear issue in reader
[CARBONDATA-2557] [CARBONDATA-2472] Problem: CarbonReaderBuilder.build() is
slow on S3; it takes around 8 seconds to finish build().
Solution: S3 is slow in
[CARBONDATA-2616][BloomDataMap] Fix bugs in querying bloom datamap with two
index columns
During pruning in the bloomfilter datamap, the same blocklets have been added
to the result more than once, thus causing explain and query to return
incorrect results.
This closes #2386
Project:
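Duplicated blocklets from two index columns is the kind of bug a set-based merge of prune results avoids. A simplified standalone sketch (the method and blocklet names are illustrative, not the actual datamap code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class PruneMerge {
    // Union the prune hits from several index columns without duplicates,
    // preserving first-seen order.
    static List<String> mergePruneResults(List<List<String>> perColumnHits) {
        Set<String> unique = new LinkedHashSet<>();
        for (List<String> hits : perColumnHits) {
            unique.addAll(hits);
        }
        return new ArrayList<>(unique);
    }

    public static void main(String[] args) {
        List<String> colA = Arrays.asList("blocklet-0", "blocklet-2");
        List<String> colB = Arrays.asList("blocklet-2", "blocklet-5");
        // blocklet-2 matched both columns but appears once in the result.
        System.out.println(mergePruneResults(Arrays.asList(colA, colB)));
        // [blocklet-0, blocklet-2, blocklet-5]
    }
}
```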
[CARBONDATA-2617] Invalid tuple-id and block id getting formed for Non
partition table
Problem
Invalid tuple and block ids were getting formed for a non-partition table.
Analysis
While creating a partition table, a segment file was written in the Metadata
folder under the table structure. This was
[CARBONDATA-2309][DataLoad] Add strategy to generate bigger carbondata files in
case of small amount of data
This closes #2314
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/685087ed
Tree:
[CARBONDATA-2611] Added Test Cases for Local Dictionary Support for Create
Table command
What changes were proposed in this pull request?
In this PR, UTs and SDV test cases are added for local dictionary support for
the create table command and the describe formatted command.
Changed the error message
Repository: carbondata
Updated Branches:
refs/heads/master 01b48fc36 -> 6eb360e1f
[CARBONDATA-2616][BloomDataMap] Fix bugs in querying bloom datamap with two
index columns
During pruning in the bloomfilter datamap, the same blocklets have been added
to the result more than once, thus causing
Repository: carbondata
Updated Branches:
refs/heads/master dc4f87ba5 -> ca466d9f4
[CARBONDATA-1787] Updated data-management-on-carbondata.md for
GLOBAL_SORT_PARTITIONS
This closes #1668
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master e7fed361b -> dc4f87ba5
[CARBONDATA-2418] [Presto] [S3] Fixed Presto Can't Query CarbonData When
CarbonStore is at S3
This closes #2287
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 5593d1646 -> ece067293
[CARBONDATA-2553] support ZSTD compression for sort temp file
This closes #2350
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 60dfdd385 -> 5593d1646
[CARBONDATA-2614] Fix the error when using FG in search mode and the prune
result is none
The prune result is none and datamapWritePath cannot be set, which will not
generate bitSegGroup in
[CARBONDATA-2428] Support flat folder for managed carbon table
Currently carbondata writing happens in the fixed path
tablepath/Fact/Part0/Segment_NUM folder, which is not the same as the hive/parquet
folder structure. With this PR, all files are written inside tablepath; it
does not maintain any
Repository: carbondata
Updated Branches:
refs/heads/master 181f0ac9b -> 60dfdd385
Repository: carbondata
Updated Branches:
refs/heads/master f0c88348a -> 181f0ac9b
[CARBONDATA-2593] Add an option 'carbon.insert.storage.level' to support
configuring the storage level when insert into data with
'carbon.insert.persist.enable'='true'
When inserting data with
Repository: carbondata
Updated Branches:
refs/heads/master efad40d57 -> f1163524f
[CARBONDATA-2599] Use RowStreamParserImp as default value of config
'carbon.stream.parser'
Parser 'RowStreamParserImpl' is used more often in real scenarios, so use
'RowStreamParserImpl' as the default value of
Repository: carbondata
Updated Branches:
refs/heads/master 685087ed4 -> ff0364599
[HOTFIX] fix java style errors
This closes #2371
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/ff036459
Tree:
Repository: carbondata
Updated Branches:
refs/heads/master 9b88a0652 -> 685087ed4
[CARBONDATA-2309][DataLoad] Add strategy to generate bigger carbondata files in
case of small amount of data
This closes #2314
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
[CARBONDATA-2573] integrate carbonstore mv branch
Fixes bugs related to MV and added tests
This closes #2335
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/0ef7e55c
Tree:
Repository: carbondata
Updated Branches:
refs/heads/master 83ee2c45f -> 0ef7e55c4
Repository: carbondata
Updated Branches:
refs/heads/master 5f68a792f -> d401e060a
[CARBONDATA-2575] Add document to explain DataMap Management
Add document to explain DataMap Management
This closes #2360
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 5b2b91304 -> 6aadfe70a
[CARBONDATA-2508] Fix the exception that can't get executorService when start
search mode twice
This closes #2355
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master b33845936 -> 5b2b91304
[CARBONDATA-2521] Support create carbonReader without tableName
Add a new method for creating CarbonReader without tableName
1. Add new interface: public static CarbonReaderBuilder builder(String tablePath)
Repository: carbondata
Updated Branches:
refs/heads/master 74770aa38 -> b33845936
[CARBONDATA-2389] Search mode support FG datamap
Search mode support FG datamap
This closes #2290
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master e74018243 -> a7faef8a0
[CARBONDATA-2546] Fixed the ArrayIndexOutOfBoundsException when give same
column twice in projection of CarbonReader
This closes #2348
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 2993034e8 -> e74018243
[CARBONDATA-2558] Optimize carbon schema reader interface of SDK
Optimize carbon schema reader interface of SDK
1. Create CarbonSchemaReader and move the schema read interface from CarbonReader to
Repository: carbondata
Updated Branches:
refs/heads/master 22d5035c8 -> 8896a6334
[CARBONDATA-2500] Add new API to read user's schema in SDK
The field order in the schema that the SDK returns is different between write and
read data type of schema in SDK
This closes #2341
Project:
Repository: carbondata
Updated Branches:
refs/heads/master 1b6ce8cdc -> 7f4bd3d06
[CARBONDATA-2520] Clean and close datamap writers on any task failure during
load
Problem: The datamap writers registered to the listener are closed or finished only
in the load success case and not in any
Repository: carbondata
Updated Branches:
refs/heads/master 33b825d7f -> d8bafa34d
[CARBONDATA-2507] enable.offheap.sort not validate in CarbonData
In #2274, the value of enable.offheap.sort transforms to false when args[0] is
not equal to 'true', including 'false' and other strings like 'f' or 'any'.
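The root cause is that Boolean.parseBoolean maps every non-"true" string, typos included, to false. A hedged sketch of validating the value before use instead (helper name is hypothetical):

```java
public class BooleanConfig {
    // Accept only "true"/"false" (case-insensitive); fall back to a default
    // for anything else instead of silently treating it as false.
    static boolean parseOrDefault(String value, boolean defaultValue) {
        if (value == null) return defaultValue;
        String v = value.trim().toLowerCase();
        if (v.equals("true")) return true;
        if (v.equals("false")) return false;
        return defaultValue;
    }

    public static void main(String[] args) {
        System.out.println(parseOrDefault("true", false)); // true
        System.out.println(parseOrDefault("f", true));     // true (invalid -> default)
        System.out.println(Boolean.parseBoolean("f"));     // false (the old behavior)
    }
}
```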
Repository: carbondata
Updated Branches:
refs/heads/master a7ac65648 -> 33b825d7f
[CARBONDATA-2545] Fix some spell error in CarbonData
Change Inerface to Interface
This closes #2346
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 604902b9a -> d9534c2c0
[CARBONDATA-2495][Doc][BloomDataMap] Add document for bloomfilter datamap
add document for bloomfilter datamap
This closes #2323
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 0e011977e -> 7cba44b90
[CARBONDATA-2487] Block filters for lucene with more than one text_match udf
This closes #2311
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 784b22de8 -> b8d5abf2e
[CARBONDATA-2354] fixed streaming example
This closes #2182
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/b8d5abf2
Tree:
Repository: carbondata
Updated Branches:
refs/heads/master 3087323a9 -> 784b22de8
[HOTFIX] [CARBONDATA-2480] Search mode RuntimeException: Error while resolving
filter expression
Invoke chooseFGDataMap in Worker in search mode
This closes #2309
Project:
[CARBONDATA-2475] Support Modular Core for Materialized View DataMap for query
matching and rewriting
Integrate MV DataMap to Carbon
This closes #2302
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/2881c6bb
[CARBONDATA-2475] Support Modular Core for Materialized View DataMap for query
matching and rewriting
Support Modular Core for Materialized View DataMap
This closes #2302
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master d14c403f6 -> 2881c6bbc
Repository: carbondata
Updated Branches:
refs/heads/master ffddba704 -> d14c403f6
[CARBONDATA-2459][DataMap] Add cache for bloom filter datamap
Loading a bloom filter from the bloomindex file is slow. Adding a cache for this
procedure will surely improve query performance.
This closes #2300
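Caching a loaded bloom filter keyed by its index file path is straightforward memoization. A simplified sketch under assumed names (the real datamap's cache and filter types differ; BitSet stands in for the loaded filter):

```java
import java.util.BitSet;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class BloomCache {
    private final Map<String, BitSet> cache = new HashMap<>();
    private int loads = 0;

    // Load from disk only on a cache miss; reuse the filter afterwards.
    BitSet get(String indexFilePath, Function<String, BitSet> loader) {
        return cache.computeIfAbsent(indexFilePath, path -> {
            loads++; // counts actual (slow) loads for demonstration
            return loader.apply(path);
        });
    }

    int loadCount() { return loads; }

    public static void main(String[] args) {
        BloomCache cache = new BloomCache();
        Function<String, BitSet> slowLoad = path -> new BitSet(64);
        cache.get("/store/shard0.bloomindex", slowLoad);
        cache.get("/store/shard0.bloomindex", slowLoad);
        System.out.println(cache.loadCount()); // 1: second lookup hit the cache
    }
}
```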
Repository: carbondata
Updated Branches:
refs/heads/master 6b70b7e47 -> 61afa42da
[CARBONDATA-2455]Fix _System Folder creation and lucene AND,OR,NOT Filter fix
This closes #2281
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master cc0cbbac7 -> ceb7c8dd1
[CARBONDATA-2464]Fixed OOM issue in case of Complex type
Problem: Query with Complex type is failing with OOM
Root Cause: Complex type child column (no-dictionary) values are written in LV
format; while reading
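The LV (length-value) layout mentioned here prefixes each value with its byte length. A standalone encode/decode sketch of the idea (illustrative only, not the actual column writer):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class LvCodec {
    // Encode values as consecutive [int length][bytes] pairs.
    static byte[] encode(List<String> values) {
        int total = 0;
        List<byte[]> raw = new ArrayList<>();
        for (String v : values) {
            byte[] b = v.getBytes(StandardCharsets.UTF_8);
            raw.add(b);
            total += 4 + b.length;
        }
        ByteBuffer buf = ByteBuffer.allocate(total);
        for (byte[] b : raw) {
            buf.putInt(b.length);
            buf.put(b);
        }
        return buf.array();
    }

    // Decode by reading each length, then exactly that many bytes.
    static List<String> decode(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data);
        List<String> out = new ArrayList<>();
        while (buf.remaining() > 0) {
            byte[] b = new byte[buf.getInt()];
            buf.get(b);
            out.add(new String(b, StandardCharsets.UTF_8));
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] data = encode(Arrays.asList("a", "bc"));
        System.out.println(decode(data)); // [a, bc]
    }
}
```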
Repository: carbondata
Updated Branches:
refs/heads/master 452c42b99 -> fb1289747
[CARBONDATA-2408] Fix search mode master SaslException issue in the first time
This closes #2239
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 6b949716b -> 452c42b99
[CARBONDATA-2441][Datamap] Implement distribute interface for bloom datamap
Implement distribute interface for bloom datamap
This closes #2272
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Repository: carbondata
Updated Branches:
refs/heads/master 2c0fa1079 -> 6b949716b
[CARBONDATA-2454][DataMap] Add fpp property for bloom datamap
Add an fpp (false positive probability) property to configure the bloom filter
used by the bloom datamap.
This closes #2279
Project:
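The fpp value drives bloom filter sizing through the standard formulas m = -n ln p / (ln 2)^2 and k = (m/n) ln 2. A small sketch of that arithmetic (illustrative, not the datamap's implementation):

```java
public class BloomSizing {
    // Optimal bit count m for n insertions at false-positive probability p.
    static long optimalBits(long n, double p) {
        return (long) Math.ceil(-n * Math.log(p) / (Math.log(2) * Math.log(2)));
    }

    // Optimal number of hash functions k for that bit count.
    static int optimalHashes(long n, long m) {
        return Math.max(1, (int) Math.round((double) m / n * Math.log(2)));
    }

    public static void main(String[] args) {
        // 1 million entries at 1% false positives.
        long m = optimalBits(1_000_000L, 0.01);
        System.out.println(m);                           // 9585059 bits (~1.2 MB)
        System.out.println(optimalHashes(1_000_000L, m)); // 7 hash functions
    }
}
```

Lowering fpp shrinks the false-positive rate at the cost of a bigger filter, which is exactly the trade-off the property exposes.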
Repository: carbondata
Updated Branches:
refs/heads/master d5da9a194 -> 2c0fa1079
[CARBONDATA-2458] Remove unnecessary TableProvider interface
Remove unused / unnecessary TableProvider Interface
This closes #2280
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 9db662a2d -> 5cad92f4f
[HOTFIX] Fix lucene match limit code
Problem
Currently, the Lucene match limit is set as a static field in MatchExpression, so it
cannot work in concurrent scenarios.
Solution:
Change it to an object variable and get the match max
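Moving the limit from a static field to an instance field keeps concurrent queries from overwriting each other's value. A minimal sketch of the difference (class name is illustrative):

```java
public class MatchLimit {
    // Per-expression limit: each concurrent query holds its own value.
    // A static field here would be shared and clobbered across queries.
    private final int maxDocs;

    MatchLimit(int maxDocs) { this.maxDocs = maxDocs; }
    int getMaxDocs() { return maxDocs; }

    public static void main(String[] args) {
        MatchLimit queryA = new MatchLimit(10);
        MatchLimit queryB = new MatchLimit(500);
        // With a static field, queryB's 500 would have replaced queryA's 10.
        System.out.println(queryA.getMaxDocs()); // 10
        System.out.println(queryB.getMaxDocs()); // 500
    }
}
```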
Repository: carbondata
Updated Branches:
refs/heads/master 7a697fd3c -> f2fb06806
[CARBONDATA-2439][Pom] upgrade guava version for bloom datamap
upgrade guava version from 11.0.2 (provided by hadoop) to 14.0.1 (provided by
spark)
The dependency scope in carbon-bloom is compile, because
Repository: carbondata
Updated Branches:
refs/heads/master 8e659b8dd -> 7a697fd3c
[CARBONDATA-2432][Build] Add bloomfilter datamap to carbondata assembly jar
Currently after build, the generated carbondata assembly jar does not contain
the bloomfilter datamap. This PR fixes this.
This closes
Repository: carbondata
Updated Branches:
refs/heads/master d1139330f -> 8e659b8dd
[CARBONDATA-2438][Assembly] Remove hadoop/spark/zookeeper related classes from
assembly
It remove hadoop/spark/zookeeper/snappy/guava related classes from assembly
jar. It reduces the size of carbon-assembly
Repository: carbondata
Updated Branches:
refs/heads/master 380473b40 -> f7c0670cd
[CARBONDATA-2392] Add close method for CarbonReader
This closes #2221
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/f7c0670c
Repository: carbondata
Updated Branches:
refs/heads/master 46cee146d -> 1c84d69f2
[CARBONDATA-2427] Fix SearchMode Serialization Issue during Load
This closes #2260
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 4b8dc0a58 -> 46cee146d
[CARBONDATA-2379] Support SearchModeExample run in cluster
1. Support SearchModeExample running in the cluster
2. Change the worker hostname to hostAddress
3. Support running ConcurrentQueryBenchmark with search mode
Repository: carbondata
Updated Branches:
refs/heads/master 161347155 -> 5229443bd
[CARBONDATA-2347][LUCENE]change datamap factory interface to check supported
features
Added a new method to the interface which will decide whether the table operation
present in the list of table operations is allowed on
Repository: carbondata
Updated Branches:
refs/heads/master 61788353d -> cf8fa9540
[CARBONDATA-2356] Ignore SDV Load TestCases for Lucene
This closes #2241
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 2f85381f8 -> 61788353d
[CARBONDATA-2407]Removed All Unused Executor BTree code
This closes #2234
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/61788353
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/61788353
Diff:
Repository: carbondata
Updated Branches:
refs/heads/master fae457a35 -> 242c08be4
[CARBONDATA-2384] SDK support write/read data into/from S3
Users can set their credentials in the SDK and use the SDK to write data into S3
and read data from S3.
This closes #2226
Project:
Repository: carbondata
Updated Branches:
refs/heads/master afb802e60 -> fae457a35
[CARBONDATA-2356] Added UT Scenarios for LuceneDataMap
This closes #2180
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit:
Repository: carbondata
Updated Branches:
refs/heads/master 184935827 -> 9b45c5b30
[HOTFIX] Fix JVM crash in search mode
This closes #2233
Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/9b45c5b3
Tree:
Repository: carbondata
Updated Branches:
refs/heads/spark-2.3 [created] 3262230cb