Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2898
LGTM
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9652/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1604/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1395/
---
Github user brijoobopanna commented on the issue:
https://github.com/apache/carbondata/pull/2898
retest this please
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1593/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9641/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1383/
---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2898
retest this please
---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2898
LGTM
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1310/
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
@ravipesala in this method we are passing the `List`. The
list contains the segment files only from the new table, so only the new index
files will be loaded and queried and the same
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9569/
---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2898
@manishgupta88 I am referring to `DataMapFactory.getDataMaps(List
segments)`. If the datamaps are already cached for the old table, then who
invalidates them? The new table also gets the old datamaps, right?
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1521/
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
@ravipesala ...which method exactly are you referring to?...In all
`getDataMap` methods the latest `carbonTable` object is passed and used for
fetching the dataMaps..there is only one
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1306/
---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2898
@manishgupta88 it solves part of the problem (the schema mismatch issue). But
when you call getDataMaps it will give you stale datamaps, right? How can those
be updated?
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
retest this please
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9561/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1300/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1511/
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
retest this please
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
@xuchuanyin ...yes, this scenario will work fine. In case of dropping a normal
table it will go through the CarbonSession flow, and the drop table command is
already taking care of clearing the
---
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2898
@manishgupta88 What if the user uses a fileformat carbon table and a normal
carbon table at the same time? For example, creating/using/dropping a fileformat
table and then creating/using/dropping
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1289/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9550/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1504/
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
retest this please
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1278/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Success with Spark 2.3.1, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9540/
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2898
Build Failed with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1491/
---
Github user manishgupta88 commented on the issue:
https://github.com/apache/carbondata/pull/2898
@xuchuanyin ...your point is correct. To explain this in detail:
1. We already have a way to clear the cached DataMaps through an API call
---
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2898
I think the current modification does not fix the root of the problem. If
you think the table information is not getting cleared, you should clear it,
not just update it when you need it.
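The distinction being debated in this thread, lazily refreshing cached datamaps when queried versus explicitly clearing the cache when the table is dropped, can be sketched with a hypothetical cache. The class and method names (`SegmentCache`, `getDataMaps`, `clearOnDrop`) are illustrative only, not actual CarbonData APIs:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a cache keyed by table path keeps DataMap entries
// alive after the table is dropped and recreated with the same path,
// unless the drop path explicitly clears it.
public class SegmentCache {
    private final Map<String, String> cache = new HashMap<>();

    // Lazy population (the approach being questioned): an existing entry
    // is returned as-is, even if the table behind it was recreated.
    public String getDataMaps(String tablePath, String currentSchema) {
        return cache.computeIfAbsent(tablePath, p -> currentSchema);
    }

    // Explicit invalidation on drop (the approach xuchuanyin advocates).
    public void clearOnDrop(String tablePath) {
        cache.remove(tablePath);
    }

    public static void main(String[] args) {
        SegmentCache c = new SegmentCache();
        c.getDataMaps("/store/t1", "schema-v1");       // old table cached
        // Table dropped and recreated at the same path; without an
        // explicit clear, the stale entry is still served:
        System.out.println(c.getDataMaps("/store/t1", "schema-v2"));
        c.clearOnDrop("/store/t1");
        System.out.println(c.getDataMaps("/store/t1", "schema-v2"));
    }
}
```

Under this sketch, the first lookup after recreation still returns the old table's entry; only after `clearOnDrop` does the cache reflect the new table.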