[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2336 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5169/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6197/ ---
[GitHub] carbondata issue #2282: [WIP] [CARBONDATA-2456] Handling request by shard in...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2282 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5037/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5035/ ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] [CARBONDATA-2472] Improve Car...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2345 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6199/ ---
[GitHub] carbondata issue #2282: [WIP] [CARBONDATA-2456] Handling request by shard in...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2282 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6198/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6195/ ---
[GitHub] carbondata issue #2282: [WIP] [CARBONDATA-2456] Handling request by shard in...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2282 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5168/ ---
[GitHub] carbondata pull request #2355: [CARBONDATA-2508] Fix the exception that can'...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2355#discussion_r191985218 --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala --- @@ -100,6 +100,7 @@ class CarbonSession(@transient val sc: SparkContext, trySearchMode(qe, sse) } catch { case e: Exception => + e.printStackTrace() --- End diff -- ok, done ---
[GitHub] carbondata pull request #2336: [CARBONDATA-2521] Support create carbonReader...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2336#discussion_r191985122 --- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java --- @@ -631,6 +631,29 @@ public static int nextGreaterValueToTarget(int currentIndex, return integers; } + /** + * Convert path with start and end string + * Convert / . \ : space to _ + * remove duplicate _ + * + * @param path store path + * @param prefix prefix string + * @param suffix suffix string + * @return converted string + */ + public static String convertPath(String path, String prefix, String suffix) { --- End diff -- remove this one and change the table name to UnknownTable+time ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2336 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5167/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5032/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6193/ ---
[GitHub] carbondata pull request #2336: [CARBONDATA-2521] Support create carbonReader...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2336#discussion_r191982770 --- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java --- @@ -631,6 +631,29 @@ public static int nextGreaterValueToTarget(int currentIndex, return integers; } + /** + * Convert path with start and end string + * Convert / . \ : space to _ + * remove duplicate _ + * + * @param path store path + * @param prefix prefix string + * @param suffix suffix string + * @return converted string + */ + public static String convertPath(String path, String prefix, String suffix) { --- End diff -- please add UT for this func ---
[GitHub] carbondata pull request #2355: [CARBONDATA-2508] Fix the exception that can'...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2355#discussion_r191982462 --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala --- @@ -100,6 +100,7 @@ class CarbonSession(@transient val sc: SparkContext, trySearchMode(qe, sse) } catch { case e: Exception => + e.printStackTrace() --- End diff -- remove this ---
[GitHub] carbondata pull request #2355: [CARBONDATA-2508] Fix the exception that can'...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2355#discussion_r191982293 --- Diff: core/src/main/java/org/apache/carbondata/core/scan/executor/impl/SearchModeVectorDetailQueryExecutor.java --- @@ -40,7 +40,7 @@ LogServiceFactory.getLogService(SearchModeVectorDetailQueryExecutor.class.getName()); private static ExecutorService executorService = null; - static { + public SearchModeVectorDetailQueryExecutor() { initThreadPool(); --- End diff -- should check whether it is null ---
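The fix under review here moves initThreadPool() from a static block into the constructor, and jackylk's point is that the shared pool should only be (re)created when it is null, so constructing the executor a second time (the "start search mode twice" scenario) reuses or safely recreates it. A minimal sketch of that pattern, using a simplified stand-in class — SearchModeExecutorSketch and its cached thread pool are illustrative assumptions, not the actual CarbonData code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SearchModeExecutorSketch {

    private static ExecutorService executorService = null;

    public SearchModeExecutorSketch() {
        initThreadPool();
    }

    // Create the shared pool only if it does not exist yet or was shut down,
    // so repeated construction neither leaks nor replaces a live pool.
    private static synchronized void initThreadPool() {
        if (executorService == null || executorService.isShutdown()) {
            executorService = Executors.newCachedThreadPool();
        }
    }

    public static ExecutorService getExecutorService() {
        return executorService;
    }
}
```

With a static initializer the pool is created once per class load and never recreated after shutdown; guarding the constructor-driven initialization with a null (and shutdown) check lets a second search-mode session obtain a usable pool after the first one was stopped.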
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2336 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5166/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6192/ ---
[GitHub] carbondata pull request #2336: [CARBONDATA-2521] Support create carbonReader...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2336#discussion_r191973756 --- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java --- @@ -88,6 +90,20 @@ public static CarbonReaderBuilder builder(String tablePath, String tableName) { return new CarbonReaderBuilder(tablePath, tableName); } + /** + * Return a new {@link CarbonReaderBuilder} instance + * Default value of table name is table + time + * + * @param tablePath table path + * @return CarbonReaderBuilder object + */ + public static CarbonReaderBuilder builder(String tablePath) { +String time = new SimpleDateFormat("MMddHHmmssSSS").format(new Date()); +String uniqueName = "table" + time; --- End diff -- ok, I add a convert method for it. "./testWriteFiles/" => "table_testWriteFiles_20180531101931973" "hdfs://testWriteFiles/" =>"table_hdfs_testWriteFiles_20180531102022516" "s3a://sdk/ => "table_s3a_sdk_20180531102058551" ---
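The conversion xubo245 describes above can be sketched as follows. This is a hypothetical illustration, not the merged CarbonUtil.convertPath implementation: the class name PathNameUtil and the regex-based body are assumptions, chosen so that the three sample paths in the comment map to the table names shown (with a yyyyMMddHHmmssSSS timestamp as the suffix):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PathNameUtil {

    // Map the path separators (/ . \ : and space) to '_', collapse
    // duplicate '_', strip leading/trailing '_', then wrap the result
    // with the given prefix and suffix.
    public static String convertPath(String path, String prefix, String suffix) {
        String cleaned = path
                .replaceAll("[/.\\\\: ]", "_")   // convert / . \ : space to _
                .replaceAll("_+", "_")           // remove duplicate _
                .replaceAll("^_|_$", "");        // drop edge underscores
        return prefix + "_" + cleaned + "_" + suffix;
    }

    public static void main(String[] args) {
        // Suffix the name with a creation timestamp, as in the review examples.
        String time = new SimpleDateFormat("yyyyMMddHHmmssSSS").format(new Date());
        System.out.println(convertPath("./testWriteFiles/", "table", time));
        System.out.println(convertPath("hdfs://testWriteFiles/", "table", time));
        System.out.println(convertPath("s3a://sdk/", "table", time));
    }
}
```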
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5030/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6191/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2355 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5165/ ---
[GitHub] carbondata issue #2207: [CARBONDATA-2428] Support flat folder for managed ca...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2207 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5164/ ---
[GitHub] carbondata issue #2207: [CARBONDATA-2428] Support flat folder for managed ca...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2207 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5029/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2355 please rebase ---
[GitHub] carbondata pull request #2336: [CARBONDATA-2521] Support create carbonReader...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2336#discussion_r191825493 --- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java --- @@ -88,6 +90,20 @@ public static CarbonReaderBuilder builder(String tablePath, String tableName) { return new CarbonReaderBuilder(tablePath, tableName); } + /** + * Return a new {@link CarbonReaderBuilder} instance + * Default value of table name is table + time + * + * @param tablePath table path + * @return CarbonReaderBuilder object + */ + public static CarbonReaderBuilder builder(String tablePath) { +String time = new SimpleDateFormat("MMddHHmmssSSS").format(new Date()); +String uniqueName = "table" + time; --- End diff -- can you get the substring of `tablePath` and append with time ---
[jira] [Resolved] (CARBONDATA-2389) Search mode support lucene datamap
[ https://issues.apache.org/jira/browse/CARBONDATA-2389?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jacky Li resolved CARBONDATA-2389. -- Resolution: Fixed Fix Version/s: 1.4.1 > Search mode support lucene datamap > -- > > Key: CARBONDATA-2389 > URL: https://issues.apache.org/jira/browse/CARBONDATA-2389 > Project: CarbonData > Issue Type: Improvement >Reporter: xubo245 >Assignee: xubo245 >Priority: Major > Fix For: 1.4.1 > > Time Spent: 18.5h > Remaining Estimate: 0h > > Carbon doesn't support this now > {code:java} > 18/04/23 06:12:14 ERROR CarbonSession: Exception when executing search mode: > Error while resolving filter expression, fallback to SparkSQL > 18/04/23 06:12:14 ERROR CarbonSession: Exception when executing search mode: > Error while resolving filter expression, fallback to SparkSQL > {code} > Carbon should support it. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata pull request #2290: [CARBONDATA-2389] Search mode support lucene ...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2290 ---
[GitHub] carbondata issue #2290: [CARBONDATA-2389] Search mode support lucene datamap
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2290 LGTM ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] [CARBONDATA-2472] Improve Car...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2345 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5163/ ---
[jira] [Created] (CARBONDATA-2565) [MV] Join Query Failed with MV data map
Babulal created CARBONDATA-2565: --- Summary: [MV] Join Query Failed with MV data map Key: CARBONDATA-2565 URL: https://issues.apache.org/jira/browse/CARBONDATA-2565 Project: CarbonData Issue Type: Bug Reporter: Babulal

create table mvtest11 (name string,age int,salray int) stored by 'carbondata';
create table mvtest9_1( name_t string,age_t int,salary_t int) stored by 'carbondata';
insert into mvtest11 select 'name1',12,12;
insert into mvtest9_1 select 'name1',12,12;
create datamap mvtest11_mv_2 using 'mv' as select name,sum(salray) from mvtest11 group by name;
rebuild datamap mvtest11_mv_2

0: jdbc:hive2://10.18.222.231:23040> explain select total from (select name ,sum(salray) as total from mvtest11 group by name) t1 join mvtest9_1 t2 on t1.name= t2.name_t;
Error: org.apache.spark.sql.AnalysisException: cannot resolve '`t1.total`' given input columns: [name_t, salary_t, age_t, name, sum(salray)]; line 1 pos 28; 'Project [UDF:preAgg() AS preAgg#6324, 't1.total] +- Join Inner, (name#5435 = name_t#2595) :- SubqueryAlias gen_subquery_0 : +- Aggregate [name#5435], [name#5435, sum(cast(salray#5437 as bigint)) AS sum(salray)#6323L] : +- SubqueryAlias mvtest11 : +- Relation[name#5435,age#5436,salray#5437] CarbonDatasourceHadoopRelation [ Database name :default, Table name :mvtest11, Schema :Some(StructType(StructField(name,StringType,true), StructField(age,IntegerType,true), StructField(salray,IntegerType,true))) ] +- SubqueryAlias t2 +- SubqueryAlias mvtest9_1 +- Relation[name_t#2595,age_t#2596,salary_t#2597] CarbonDatasourceHadoopRelation [ Database name :default, Table name :mvtest9_1, Schema :Some(StructType(StructField(name_t,StringType,true), StructField(age_t,IntegerType,true), StructField(salary_t,IntegerType,true))) ] (state=,code=0)

0: jdbc:hive2://10.18.222.231:23040> select t2.* from (select name ,sum(salray) as total from mvtest11 group by name) t1 join mvtest9_1 t2 on t1.name= t2.name_t;
+---------+--------+-----------+
| name_t  | age_t  | salary_t  |
+---------+--------+-----------+
+---------+--------+-----------+
No rows selected (12.672 seconds)

0: jdbc:hive2://10.18.222.231:23040> select t1.* from (select name ,sum(salray) as total from mvtest11 group by name) t1 join mvtest9_1 t2 on t1.name= t2.name_t;
Error: org.apache.spark.sql.AnalysisException: cannot resolve '`t1.total`' given input columns: [salary_t, name_t, sum(salray), name, age_t]; line 1 pos 51; 'Project [UDF:preAgg() AS preAgg#6511, name#6512, 't1.total] +- Join Inner, (name#6512 = name_t#6515) :- SubqueryAlias gen_subquery_0 : +- Aggregate [name#6512], [name#6512, sum(cast(salray#6514 as bigint)) AS sum(salray)#6510L] : +- SubqueryAlias mvtest11 : +- Relation[name#6512,age#6513,salray#6514] CarbonDatasourceHadoopRelation [ Database name :default, Table name :mvtest11, Schema :Some(StructType(StructField(name,StringType,true), StructField(age,IntegerType,true), StructField(salray,IntegerType,true))) ] +- SubqueryAlias t2 +- SubqueryAlias mvtest9_1 +- Relation[name_t#6515,age_t#6516,salary_t#6517] CarbonDatasourceHadoopRelation [ Database name :default, Table name :mvtest9_1, Schema :Some(StructType(StructField(name_t,StringType,true), StructField(age_t,IntegerType,true), StructField(salary_t,IntegerType,true))) ] (state=,code=0)

0: jdbc:hive2://10.18.222.231:23040> -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user xubo245 commented on the issue: https://github.com/apache/carbondata/pull/2336 @jackylk CI pass! Please review again ---
[GitHub] carbondata issue #2335: [WIP] integrate carbonstore mv branch
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2335 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5162/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2336 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5161/ ---
[GitHub] carbondata issue #2322: [CARBONDATA-2472] Fixed:Refactor NonTransactional ta...
Github user ajantha-bhat commented on the issue: https://github.com/apache/carbondata/pull/2322 Handled in #2435 ---
[GitHub] carbondata pull request #2322: [CARBONDATA-2472] Fixed:Refactor NonTransacti...
Github user ajantha-bhat closed the pull request at: https://github.com/apache/carbondata/pull/2322 ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] Improve Carbon Reader Schema ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2345 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5028/ ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] Improve Carbon Reader Schema ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2345 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6189/ ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] Improve Carbon Reader Schema ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2345 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5160/ ---
[GitHub] carbondata issue #2181: [CARBONDATA-2355] Support run SQL on carbondata file...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2181 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5159/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5026/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2336 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5158/ ---
[GitHub] carbondata issue #2335: [WIP] integrate carbonstore mv branch
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2335 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6188/ ---
[GitHub] carbondata issue #2181: [CARBONDATA-2355] Support run SQL on carbondata file...
Github user xubo245 commented on the issue: https://github.com/apache/carbondata/pull/2181 @chenliang613 rebased this PR and CI passed, please review it. ---
[GitHub] carbondata issue #2335: [WIP] integrate carbonstore mv branch
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2335 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5027/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6187/ ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] Improve Carbon Reader Schema ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2345 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6186/ ---
[GitHub] carbondata issue #2181: [CARBONDATA-2355] Support run SQL on carbondata file...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2181 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5024/ ---
[GitHub] carbondata issue #2345: [WIP][CARBONDATA-2557] Improve Carbon Reader Schema ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2345 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5025/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2355 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5157/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user xubo245 commented on the issue: https://github.com/apache/carbondata/pull/2355 @jackylk CI passed, please review it. ---
[GitHub] carbondata issue #2181: [CARBONDATA-2355] Support run SQL on carbondata file...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2181 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6185/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5022/ ---
[GitHub] carbondata issue #2287: [CARBONDATA-2418] [Presto] [S3] Fixed Presto Can't Q...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2287 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6182/ ---
[GitHub] carbondata issue #2336: [CARBONDATA-2521] Support create carbonReader withou...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2336 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6184/ ---
[GitHub] carbondata issue #2287: [CARBONDATA-2418] [Presto] [S3] Fixed Presto Can't Q...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2287 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5156/ ---
[GitHub] carbondata issue #2355: [CARBONDATA-2508] Fix the exception that can't get e...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2355 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6183/ ---
[GitHub] carbondata pull request #2352: [CARBONDATA-2555]Fixed SDK reader set default...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2352#discussion_r191695689 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -305,7 +309,7 @@ public void testWriteAndReadFilesNonTransactional() throws IOException, Interrup // Write to a Non Transactional Table TestUtil.writeFilesAndVerify(new Schema(fields), path, true, false); -CarbonReader reader = CarbonReader.builder(path, "_temp") +CarbonReader reader = CarbonReader.builder(path, "_temp").isTransactionalTable(true) --- End diff -- ok, It's fine. ---
[GitHub] carbondata pull request #2352: [CARBONDATA-2555]Fixed SDK reader set default...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2352#discussion_r191695582 --- Diff: examples/spark2/src/main/java/org/apache/carbondata/examples/sdk/SDKS3Example.java --- @@ -44,34 +44,14 @@ public static void main(String[] args) throws Exception { num = Integer.parseInt(args[4]); } -Boolean persistSchema = true; -if (args.length > 5) { -if (args[5].equalsIgnoreCase("true")) { -persistSchema = true; -} else { -persistSchema = false; -} -} - -Boolean transactionalTable = true; -if (args.length > 6) { -if (args[6].equalsIgnoreCase("true")) { -transactionalTable = true; -} else { -transactionalTable = false; -} -} - Field[] fields = new Field[2]; fields[0] = new Field("name", DataTypes.STRING); fields[1] = new Field("age", DataTypes.INT); CarbonWriterBuilder builder = CarbonWriter.builder() .setAccessKey(args[0]) .setSecretKey(args[1]) .setEndPoint(args[2]) -.outputPath(path) -.persistSchemaFile(persistSchema) -.isTransactionalTable(transactionalTable); --- End diff -- If we remove this one, the example can't change the value of isTransactionalTable and persistSchemaFile when we test it. I suggest it is better to keep it. ---
[GitHub] carbondata issue #2350: [CARBONDATA-2553] support ZSTD compression for sort ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2350 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5020/ ---
[GitHub] carbondata issue #2105: [CARBONDATA-2286][SDV] Added sdv test cases for stre...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2105 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5155/ ---
[GitHub] carbondata pull request #2336: [CARBONDATA-2521] Support create carbonReader...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2336#discussion_r191691800 --- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java --- @@ -88,6 +89,14 @@ public static CarbonReaderBuilder builder(String tablePath, String tableName) { return new CarbonReaderBuilder(tablePath, tableName); } + /** + * Return a new {@link CarbonReaderBuilder} instance + */ + public static CarbonReaderBuilder builder(String tablePath) { +String uniqueName = "_temp" + UUID.randomUUID().toString(); --- End diff -- I think it may have many problems if we use s"table@$tablePath", because there are different kinds of table paths, like local, HDFS, OBS and so on. I think we can add the time to the table name to record the table's creation time. ---
[GitHub] carbondata pull request #2352: [CARBONDATA-2555]Fixed SDK reader set default...
Github user ajantha-bhat commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2352#discussion_r191691253 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -305,7 +309,7 @@ public void testWriteAndReadFilesNonTransactional() throws IOException, Interrup // Write to a Non Transactional Table TestUtil.writeFilesAndVerify(new Schema(fields), path, true, false); -CarbonReader reader = CarbonReader.builder(path, "_temp") +CarbonReader reader = CarbonReader.builder(path, "_temp").isTransactionalTable(true) --- End diff -- This is a conflict resolve problem. Will be removed in the next commit. Finally it should be false for this testcase ---
[GitHub] carbondata pull request #2352: [CARBONDATA-2555]Fixed SDK reader set default...
Github user ajantha-bhat commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2352#discussion_r191690950 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -305,7 +309,7 @@ public void testWriteAndReadFilesNonTransactional() throws IOException, Interrup // Write to a Non Transactional Table TestUtil.writeFilesAndVerify(new Schema(fields), path, true, false); -CarbonReader reader = CarbonReader.builder(path, "_temp") +CarbonReader reader = CarbonReader.builder(path, "_temp").isTransactionalTable(true) --- End diff -- This came by conflict resolve. This will be removed in the next commit. No harm. ---
[GitHub] carbondata issue #2350: [CARBONDATA-2553] support ZSTD compression for sort ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2350 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6181/ ---
[GitHub] carbondata pull request #2352: [CARBONDATA-2555]Fixed SDK reader set default...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2352#discussion_r191690080 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -305,7 +309,7 @@ public void testWriteAndReadFilesNonTransactional() throws IOException, Interrup // Write to a Non Transactional Table TestUtil.writeFilesAndVerify(new Schema(fields), path, true, false); -CarbonReader reader = CarbonReader.builder(path, "_temp") +CarbonReader reader = CarbonReader.builder(path, "_temp").isTransactionalTable(true) --- End diff -- The isTransactionalTable method is already invoked at line 314, why invoke it again? ---
[GitHub] carbondata issue #2105: [CARBONDATA-2286][SDV] Added sdv test cases for stre...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2105 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5019/ ---
[GitHub] carbondata issue #2105: [CARBONDATA-2286][SDV] Added sdv test cases for stre...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2105 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6180/ ---
[GitHub] carbondata issue #2347: [CARBONDATA-2554] Added support for logical type
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2347 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5154/ ---
[GitHub] carbondata issue #2347: [CARBONDATA-2554] Added support for logical type
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2347 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5018/ ---
[GitHub] carbondata pull request #2355: [CARBONDATA-2508] Fix the exception that can'...
GitHub user xubo245 opened a pull request: https://github.com/apache/carbondata/pull/2355 [CARBONDATA-2508] Fix the exception that can't get executorService when start search mode twice Fix the exception that can't get executorService when start search mode twice: 1.remove the static code block in SearchModeDetailQueryExecutor and SearchModeVectorDetailQueryExecutor 2.invoke initThreadPool in constructor of class Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily: - [X] Any interfaces changed? No - [X] Any backward compatibility impacted? No - [X] Document update required? No - [X] Testing done add some test case - [X] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. No You can merge this pull request into a Git repository by running: $ git pull https://github.com/xubo245/carbondata CARBONDATA-2508-searchModeStartAgain Alternatively you can review and apply these changes as the patch at: https://github.com/apache/carbondata/pull/2355.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #2355 commit ec29d75b0fe5592d46be4c665b957c416077563d Author: xubo245 Date: 2018-05-30T07:45:49Z [CARBONDATA-2508] Fix the exception that can't get executorService when start search mode twice 1.remove the static code block in SearchModeDetailQueryExecutor and SearchModeVectorDetailQueryExecutor 2.invoke initThreadPool in constructor of class ---
[GitHub] carbondata issue #2347: [CARBONDATA-2554] Added support for logical type
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2347 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6179/ ---
[jira] [Updated] (CARBONDATA-2508) There are some errors when running SearchModeExample
[ https://issues.apache.org/jira/browse/CARBONDATA-2508?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] xubo245 updated CARBONDATA-2508: Description: There are some errors when running org.apache.carbondata.examples.SearchModeExample:
{code:java}
org.apache.carbondata.examples.SearchModeExample
log4j:WARN No appenders could be found for logger (org.apache.carbondata.core.util.CarbonProperties).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/05/22 16:12:42 INFO SparkContext: Running Spark version 2.2.1
18/05/22 16:12:42 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/22 16:12:42 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 192.168.44.90 instead (on interface en3)
18/05/22 16:12:42 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/05/22 16:12:42 INFO SparkContext: Submitted application: SearchModeExample
18/05/22 16:12:42 INFO SecurityManager: Changing view acls to: xubo
18/05/22 16:12:42 INFO SecurityManager: Changing modify acls to: xubo
18/05/22 16:12:42 INFO SecurityManager: Changing view acls groups to:
18/05/22 16:12:42 INFO SecurityManager: Changing modify acls groups to:
18/05/22 16:12:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(xubo); groups with view permissions: Set(); users with modify permissions: Set(xubo); groups with modify permissions: Set()
18/05/22 16:12:43 INFO Utils: Successfully started service 'sparkDriver' on port 64124.
18/05/22 16:12:43 INFO SparkEnv: Registering MapOutputTracker
18/05/22 16:12:43 INFO SparkEnv: Registering BlockManagerMaster
18/05/22 16:12:43 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/05/22 16:12:43 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/05/22 16:12:43 INFO DiskBlockManager: Created local directory at /private/var/folders/lw/4y5plg0x7rq45h38m4sfxlbmgn/T/blockmgr-0ed23439-9e4f-4798-b197-0681f40e9fa5
18/05/22 16:12:43 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
18/05/22 16:12:43 INFO SparkEnv: Registering OutputCommitCoordinator
18/05/22 16:12:43 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/05/22 16:12:43 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.44.90:4040
18/05/22 16:12:43 INFO Executor: Starting executor ID driver on host localhost
18/05/22 16:12:43 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64125.
18/05/22 16:12:43 INFO NettyBlockTransferService: Server created on 192.168.44.90:64125
18/05/22 16:12:43 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/05/22 16:12:43 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.44.90, 64125, None)
18/05/22 16:12:43 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.44.90:64125 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.44.90, 64125, None)
18/05/22 16:12:43 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.44.90, 64125, None)
18/05/22 16:12:43 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.44.90, 64125, None)
18/05/22 16:12:43 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/xubo/Desktop/xubo/git/carbondata1/spark-warehouse').
18/05/22 16:12:43 INFO SharedState: Warehouse path is 'file:/Users/xubo/Desktop/xubo/git/carbondata1/spark-warehouse'.
18/05/22 16:12:44 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
18/05/22 16:12:45 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/05/22 16:12:45 INFO ObjectStore: ObjectStore, initialize called
18/05/22 16:12:45 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/05/22 16:12:45 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/05/22 16:12:46 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
18/05/22 16:12:47 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/05/22 16:12:47 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
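As an aside on the quoted trace: the `log4j:WARN No appenders could be found` lines at its top simply mean no log4j 1.2 configuration was on the example's classpath, so CarbonProperties output was dropped until Spark's default profile loaded. A minimal `log4j.properties` along these lines (an illustrative sketch, not part of the issue or the CarbonData repo) placed on the classpath would silence the warning and route those messages to the console:

```
# Minimal log4j 1.2 configuration (hypothetical example).
# Sends all INFO-and-above messages to stdout.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

The pattern here mirrors the `yy/MM/dd HH:mm:ss LEVEL Class: message` format visible in the Spark log above, so the CarbonProperties lines blend in with the rest of the output.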
[GitHub] carbondata issue #2350: [CARBONDATA-2553] support ZSTD compression for sort ...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2350 retest this please ---
[GitHub] carbondata issue #2351: [CARBONDATA-2559] task id set for each carbonReader ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2351 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5153/ ---
[GitHub] carbondata issue #2354: [WIP] Remove dead code
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2354 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5016/ ---
[GitHub] carbondata issue #2351: [CARBONDATA-2559] task id set for each carbonReader ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2351 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6178/ ---
[GitHub] carbondata issue #2354: [WIP] Remove dead code
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2354 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6177/ ---
[GitHub] carbondata issue #2354: [WIP] Remove dead code
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2354 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5152/ ---
[GitHub] carbondata issue #2351: [CARBONDATA-2559] task id set for each carbonReader ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2351 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5017/ ---