[jira] [Created] (FLINK-34480) Add method to let user jar classes override Flink's internal classes when class names collide

2024-02-21 Thread JinxinTang (Jira)
JinxinTang created FLINK-34480:
--

 Summary: Add method to let user jar classes override Flink's internal classes when class names collide
 Key: FLINK-34480
 URL: https://issues.apache.org/jira/browse/FLINK-34480
 Project: Flink
  Issue Type: Improvement
Reporter: JinxinTang
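
For context, a hedged sketch of the closest existing knob, Flink's classloader resolve order (option constant per org.apache.flink.configuration.CoreOptions): "child-first" lets classes in the user jar shadow classes that also exist on Flink's classpath, except for the configured parent-first patterns.
{code:java}
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.CoreOptions;

public class ChildFirstClassloadingSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Equivalent to "classloader.resolve-order: child-first" in flink-conf.yaml.
        conf.set(CoreOptions.CLASSLOADER_RESOLVE_ORDER, "child-first");
        System.out.println(conf);
    }
}
{code}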






--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-30795) StreamingWithStateTestBase incorrect on Windows

2023-01-26 Thread JinxinTang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-30795?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

JinxinTang updated FLINK-30795:
---
Description: 
A Windows path such as

"file://C:\Users\xx\AppData\Local\Temp\junit373749850266957074\junit7014045318909690439"

will throw:

new IllegalArgumentException("Cannot use the root directory for checkpoints.");
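
A minimal sketch of why the two-slash form is suspect (illustrative only, using java.net.URI rather than the code path Flink's checkpoint storage actually runs; directory names shortened): with "file://C:/...", the drive letter is parsed as the URI authority instead of the path, and a mis-parsed drive-letter URI can collapse toward a filesystem root, which Flink rejects for checkpoints. Backslashes, as in the path above, are not legal URI characters at all.
{code:java}
import java.net.URI;

public class WindowsFileUriSketch {
    public static void main(String[] args) throws Exception {
        // Two slashes: "C:" becomes the authority, not part of the path.
        URI twoSlashes = new URI("file://C:/Users/xx/Temp/junit1/junit2");
        System.out.println(twoSlashes.getAuthority() + " | " + twoSlashes.getPath());

        // Three slashes: empty authority, the drive letter stays in the path.
        URI threeSlashes = new URI("file:///C:/Users/xx/Temp/junit1/junit2");
        System.out.println(threeSlashes.getAuthority() + " | " + threeSlashes.getPath());
    }
}
{code}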

> StreamingWithStateTestBase incorrect on Windows
> -
>
> Key: FLINK-30795
> URL: https://issues.apache.org/jira/browse/FLINK-30795
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
> Environment: A Windows path such as
> "file://C:\Users\xx\AppData\Local\Temp\junit373749850266957074\junit7014045318909690439"
> will throw:
> new IllegalArgumentException("Cannot use the root directory for checkpoints.");
>Reporter: JinxinTang
>Priority: Major
>
> A Windows path such as
> "file://C:\Users\xx\AppData\Local\Temp\junit373749850266957074\junit7014045318909690439"
> will throw:
> new IllegalArgumentException("Cannot use the root directory for checkpoints.");



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-30795) StreamingWithStateTestBase incorrect on Windows

2023-01-26 Thread JinxinTang (Jira)
JinxinTang created FLINK-30795:
--

 Summary: StreamingWithStateTestBase incorrect on Windows
 Key: FLINK-30795
 URL: https://issues.apache.org/jira/browse/FLINK-30795
 Project: Flink
  Issue Type: Bug
  Components: Tests
 Environment: A Windows path such as

"file://C:\Users\xx\AppData\Local\Temp\junit373749850266957074\junit7014045318909690439"

will throw:

new IllegalArgumentException("Cannot use the root directory for checkpoints.");
Reporter: JinxinTang






--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-22853) FlinkSQL aggregate functions max/min/sum return duplicate results

2021-06-02 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-22853?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17356104#comment-17356104
 ] 

JinxinTang commented on FLINK-22853:


Seems this cannot be reproduced ~
{code:java}
EnvironmentSettings settings =
    EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build();
TableEnvironment tableEnvironment = TableEnvironment.create(settings);

Table input =
    tableEnvironment.fromValues(
        DataTypes.ROW(
            DataTypes.FIELD("id", DataTypes.STRING()),
            DataTypes.FIELD("offset", DataTypes.INT())),
        Row.of("1", 1), Row.of("1", 3), Row.of("1", 2));

tableEnvironment.createTemporaryView("table1", input);
tableEnvironment.executeSql("select id,sum(`offset`) from table1 group by id").print();
{code}
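
For reference, with the three input rows above a correct aggregation prints a single row per id, with sum(`offset`) = 1 + 3 + 2 = 6; roughly the following (exact column widths of print() may differ):
{code}
+----+--------+
| id | EXPR$1 |
+----+--------+
|  1 |      6 |
+----+--------+
{code}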

> FlinkSQL aggregate functions max/min/sum return duplicate results
> -
>
> Key: FLINK-22853
> URL: https://issues.apache.org/jira/browse/FLINK-22853
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / API
>Affects Versions: 1.12.1
>Reporter: Raypon Wang
>Priority: Blocker
>
> The MySQL data is as follows:
> id    offset
> 1      1
> 1      3
> 1      2
> flinksql code:
> val settings = 
> EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build()
> val tableEnvironment = TableEnvironment.create(settings)
> tableEnvironment.executeSql("select id,max(`offset`) from table group by id" 
> ).print()
>  
> result:
> +----+--------+
> | id | EXPR$1 |
> +----+--------+
> |  1 |      1 |
> |  1 |      3 |
> |  1 |      2 |
> +----+--------+
> max/min/sum all have this problem;
> but avg/count/last_value/first_value do not have this problem
>  
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-22853) FlinkSQL aggregate functions max/min/sum return duplicate results

2021-06-02 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-22853?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17356050#comment-17356050
 ] 

JinxinTang commented on FLINK-22853:


Let me check ~

> FlinkSQL aggregate functions max/min/sum return duplicate results
> -
>
> Key: FLINK-22853
> URL: https://issues.apache.org/jira/browse/FLINK-22853
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / API
>Affects Versions: 1.12.1
>Reporter: Raypon Wang
>Priority: Blocker
>
> The MySQL data is as follows:
> id    offset
> 1      1
> 1      3
> 1      2
> flinksql code:
> val settings = 
> EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build()
> val tableEnvironment = TableEnvironment.create(settings)
> tableEnvironment.executeSql("select id,max(`offset`) from table group by id" 
> ).print()
>  
> result:
> +----+--------+
> | id | EXPR$1 |
> +----+--------+
> |  1 |      1 |
> |  1 |      3 |
> |  1 |      2 |
> +----+--------+
> max/min/sum all have this problem;
> but avg/count/last_value/first_value do not have this problem
>  
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18750) SqlValidatorException thrown when select from a view which contains a UDTF call

2020-07-29 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17167046#comment-17167046
 ] 

JinxinTang commented on FLINK-18750:


Seems interesting, nice reproduction code. I will check this too.

> SqlValidatorException thrown when select from a view which contains a UDTF 
> call
> ---
>
> Key: FLINK-18750
> URL: https://issues.apache.org/jira/browse/FLINK-18750
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / API
>Affects Versions: 1.11.1
>Reporter: Wei Zhong
>Priority: Major
>
> When executing such code:
>  
> {code:java}
> package com.example;
> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
> import org.apache.flink.table.api.EnvironmentSettings;
> import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
> import org.apache.flink.table.functions.TableFunction;
> public class TestUTDF {
>    public static class UDTF extends TableFunction<String> {
>       public void eval(String input) {
>          collect(input);
>       }
>    }
>
>    public static void main(String[] args) {
>       StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
>       StreamTableEnvironment tEnv = StreamTableEnvironment.create(
>          env, EnvironmentSettings.newInstance().useBlinkPlanner().build());
>       tEnv.createTemporarySystemFunction("udtf", new UDTF());
>       tEnv.createTemporaryView("source", tEnv.fromValues("a", "b", "c").as("f0"));
>       String udtfCall = "SELECT S.f0, T.f1 FROM source as S, LATERAL TABLE(udtf(f0)) as T(f1)";
>       System.out.println(tEnv.explainSql(udtfCall));
>       String createViewCall = "CREATE VIEW tmp_view AS " + udtfCall;
>   tEnv.executeSql(createViewCall);
>   System.out.println(tEnv.from("tmp_view").explain());
>}
> }
> {code}
> Such a SqlValidatorException would be thrown:
>  
>  
> {code:java}
> == Abstract Syntax Tree ==
> LogicalProject(f0=[$0], f1=[$1])
> +- LogicalCorrelate(correlation=[$cor0], joinType=[inner], requiredColumns=[{0}])
>    :- LogicalProject(f0=[AS($0, _UTF-16LE'f0')])
>    :  +- LogicalValues(tuples=[[{ _UTF-16LE'a' }, { _UTF-16LE'b' }, { _UTF-16LE'c' }]])
>    +- LogicalTableFunctionScan(invocation=[udtf($cor0.f0)], rowType=[RecordType(VARCHAR(2147483647) EXPR$0)])
>
> == Optimized Logical Plan ==
> Correlate(invocation=[udtf($cor0.f0)], correlate=[table(udtf($cor0.f0))], select=[f0,EXPR$0], rowType=[RecordType(CHAR(1) f0, VARCHAR(2147483647) EXPR$0)], joinType=[INNER])
> +- Calc(select=[f0])
>    +- Values(type=[RecordType(CHAR(1) f0)], tuples=[[{ _UTF-16LE'a' }, { _UTF-16LE'b' }, { _UTF-16LE'c' }]])
>
> == Physical Execution Plan ==
> Stage 1 : Data Source
>  content : Source: Values(tuples=[[{ _UTF-16LE'a' }, { _UTF-16LE'b' }, { _UTF-16LE'c' }]])
>
> Stage 2 : Operator
>  content : Calc(select=[f0])
>  ship_strategy : FORWARD
>
> Stage 3 : Operator
>  content : Correlate(invocation=[udtf($cor0.f0)], correlate=[table(udtf($cor0.f0))], select=[f0,EXPR$0], rowType=[RecordType(CHAR(1) f0, VARCHAR(2147483647) EXPR$0)], joinType=[INNER])
>  ship_strategy : FORWARD
>
> Exception in thread "main" org.apache.flink.table.api.ValidationException: SQL validation failed. From line 4, column 14 to line 4, column 17: Column 'f0' not found in any table
>  at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>  at org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>  at org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>  at org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>  at org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>  at org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>  at org.apache.calcite.rel.core.RelFactories$TableScanFactoryImpl.createScan(RelFactories.java:533)
>  at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1044)
>  at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1068)
>  at org.apache.flink.table.planner.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:349)
>  at org.apache.flink.table.planner.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:152)
>  at org.apache.flink.table.operations.CatalogQueryOperation.accept(CatalogQueryOperation.java:69)
>  at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:149)
>  at 

[jira] [Commented] (FLINK-18437) org.apache.calcite.sql.validate.SqlValidatorException: List of column aliases must have same degree as table

2020-07-04 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17151270#comment-17151270
 ] 

JinxinTang commented on FLINK-18437:


The former exception comes from
`org.apache.calcite.runtime.CalciteResource#aliasListDegree`; should we
override it, [~jark]?
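
For the reporter's query below, assuming the intent is to unnest each element of advs into its two row fields, the usual fix is to give the UNNEST alias list one name per field so its degree matches the table. A hedged sketch (field names taken from the report; `count` escaped since it is a reserved word):
{code:java}
public class UnnestAliasDegreeFix {
    public static void main(String[] args) {
        // Alias UNNEST's two-field rows with two names so the alias list
        // degree (2) matches the degree of the produced table.
        String sql =
                "SELECT ip, ts, environment, t.`count`, t.eventid "
                        + "FROM aggs_test "
                        + "CROSS JOIN UNNEST(advs) AS t (`count`, eventid)";
        System.out.println(sql);
    }
}
{code}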

> org.apache.calcite.sql.validate.SqlValidatorException: List of column aliases 
> must have same degree as table
> 
>
> Key: FLINK-18437
> URL: https://issues.apache.org/jira/browse/FLINK-18437
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / API
>Affects Versions: 1.9.3
>Reporter: mzz
>Priority: Critical
>
>  .withSchema(new Schema()
>    .field("ip", Types.STRING())
>    .field("ts", Types.STRING())
>    .field("environment", Types.ROW(Array("access", "brand"),
>      Array[TypeInformation[_]](Types.STRING(), Types.STRING())))
>    .field("advs", ObjectArrayTypeInfo.getInfoFor(new Array[Row](0).getClass,
>      Types.ROW(Array("count", "eventid"),
>        Array[TypeInformation[_]](Types.STRING(), Types.STRING()))))
>  )
>  .inAppendMode()
>  .registerTableSource("aggs_test")
> The code above is the data schema. I tried the approach from
> https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/table/sql/queries.html,
> but when I execute this SQL:
> val sql1 =
>   """
> |SELECT
> |ip,
> |ts,
> |environment,
> |adv
> |FROM aggs_test
> |CROSS JOIN UNNEST(advs) AS t (adv)
> |""".stripMargin
> It reports an error:
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 8, column 31 to line 8, column 33: List of 
> column aliases must have same degree as table; table has 2 columns ('access', 
> 'brand'), whereas alias list has 1 columns
>   at 
> org.apache.flink.table.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:128)
>   at 
> org.apache.flink.table.sqlexec.SqlToOperationConverter.convert(SqlToOperationConverter.java:83)
>   at 
> org.apache.flink.table.planner.StreamPlanner.parse(StreamPlanner.scala:115)
>   at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:298)
>   at 
> QM.COM.Flink.KafakaHelper.FlinkTableConnKafka$.main(FlinkTableConnKafka.scala:88)
>   at 
> QM.COM.Flink.KafakaHelper.FlinkTableConnKafka.main(FlinkTableConnKafka.scala)
> Caused by: org.apache.calcite.runtime.CalciteContextException: From line 8, 
> column 31 to line 8, column 33: List of column aliases must have same degree 
> as table; table has 2 columns ('access', 'brand'), whereas alias list has 1 
> columns
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at 
> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:463)
>   at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:824)
>   at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:809)
>   at 
> org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:4807)
>   at 
> org.apache.calcite.sql.validate.AliasNamespace.validateImpl(AliasNamespace.java:86)
>   at 
> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:84)
>   at 
> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:997)
>   at 
> org.apache.calcite.sql.validate.AbstractNamespace.getRowType(AbstractNamespace.java:115)
>   at 
> org.apache.calcite.sql.validate.AliasNamespace.getRowType(AliasNamespace.java:41)
>   at 
> org.apache.calcite.sql.validate.DelegatingScope.resolveInNamespace(DelegatingScope.java:101)
>   at org.apache.calcite.sql.validate.ListScope.resolve(ListScope.java:191)
>   at 
> org.apache.calcite.sql.validate.ListScope.findQualifyingTableNames(ListScope.java:156)
>   at 
> org.apache.calcite.sql.validate.DelegatingScope.fullyQualify(DelegatingScope.java:238)
>   at 
> org.apache.calcite.sql.validate.SqlValidatorImpl$Expander.visit(SqlValidatorImpl.java:5699)
>   at 
> org.apache.calcite.sql.validate.SqlValidatorImpl$Expander.visit(SqlValidatorImpl.java:5684)
>   at org.apache.calcite.sql.SqlIdentifier.accept(SqlIdentifier.java:317)
>   at 
> org.apache.calcite.sql.validate.SqlValidatorImpl.expand(SqlValidatorImpl.java:5291)
>   at 
> 

[jira] [Commented] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-07-03 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17150917#comment-17150917
 ] 

JinxinTang commented on FLINK-18440:


cc [~jark] Thank you for your test code and suggestion. I have just opened a
PR for this; could you please help me review it? :)
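
For anyone who wants to drive the reported SQL end to end, a hedged, self-contained harness (the 'datagen' source is only a stand-in for the real test_source; connector options assume Flink 1.11):
{code:java}
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class TopNViewRepro {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // Placeholder source with the same schema as the report's test_source.
        tEnv.executeSql("create table test_source (name varchar, eat varchar, age bigint) "
                + "with ('connector' = 'datagen')");
        tEnv.executeSql("create view test as "
                + "select name, eat, sum(age) as cnt from test_source group by name, eat");
        tEnv.executeSql("create view results as select *, ROW_NUMBER() OVER "
                + "(PARTITION BY name ORDER BY cnt DESC) as row_num from test");
        tEnv.executeSql("create table sink (name varchar, eat varchar, cnt bigint) "
                + "with ('connector' = 'print')");
        // In 1.11 this statement trips the reported validation error.
        tEnv.executeSql("insert into sink select name, eat, cnt from results where row_num <= 3");
    }
}
{code}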

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-06-28 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17147311#comment-17147311
 ] 

JinxinTang edited comment on FLINK-18440 at 6/28/20, 11:07 AM:
---

!image-2020-06-28-18-55-43-692.png!

Hope this could help :)


was (Author: jinxintang):
!image-2020-06-28-18-55-43-692.png!

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Issue Comment Deleted] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-06-28 Thread JinxinTang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

JinxinTang updated FLINK-18440:
---
Comment: was deleted

(was: Maybe this could help. :))

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-06-28 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17147312#comment-17147312
 ] 

JinxinTang commented on FLINK-18440:


Maybe this could help. :)

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-06-28 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17147311#comment-17147311
 ] 

JinxinTang commented on FLINK-18440:


!image-2020-06-28-18-55-43-692.png!

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (FLINK-18440) ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions

2020-06-28 Thread JinxinTang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18440?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

JinxinTang updated FLINK-18440:
---
Attachment: image-2020-06-28-18-55-43-692.png

> ROW_NUMBER function: ROW/RANGE not allowed with RANK, DENSE_RANK or 
> ROW_NUMBER functions
> 
>
> Key: FLINK-18440
> URL: https://issues.apache.org/jira/browse/FLINK-18440
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Runtime
>Affects Versions: 1.11.0
>Reporter: LakeShen
>Priority: Major
> Fix For: 1.11.1
>
> Attachments: image-2020-06-28-18-55-43-692.png
>
>
> When I run Flink SQL like the following:
> create view test as select  name, eat ,sum(age) as cnt from test_source group 
> by  name,eat;
> create view results as select *, ROW_NUMBER() OVER (PARTITION BY name 
> ORDER BY cnt DESC) as row_num from test;
> create table sink (
> name varchar,
> eat varchar,
> cnt bigint
> )
> with(
> 'connector' = 'print'
> );
> insert into sink select name,eat , cnt from results where  row_num <= 3 ;
> The same SQL code ran successfully in Flink 1.10; after changing the Flink
> version to 1.11, it throws this exception.
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> SQL validation failed. From line 1, column 124 to line 1, column 127: 
> ROW/RANGE not allowed with RANK, DENSE_RANK or ROW_NUMBER functions
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:204)
>   at 
> org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
>   at 
> org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
>   at 
> org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.toRel(ExpandingPreparingTable.java:55)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3492)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2415)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2102)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:661)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:642)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3345)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:568)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:164)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:151)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:773)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:745)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:238)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:527)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (FLINK-17657) jdbc not support read BIGINT UNSIGNED field

2020-05-14 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17107228#comment-17107228
 ] 

JinxinTang edited comment on FLINK-17657 at 5/14/20, 11:42 AM:
---

 

Hi [~zhanglun], could you please provide detailed code to reproduce this
issue? It seems OK on my side; here is my code piece: [code
piece|https://github.com/TJX2014/flink/blob/master-flink17657-jdbc-bigint/flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/jdbc/JdbcReader.java]


was (Author: jinxintang):
 

Hi [~zhanglun], could you please provide detailed code to reproduce this
issue? It seems OK on my side; here is my code piece: [code
piece|https://github.com/TJX2014/flink/blob/master-flink17657-jdbc-bigint/flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/jdbc/ReadMysql.scala]

> jdbc not support read BIGINT UNSIGNED field
> ---
>
> Key: FLINK-17657
> URL: https://issues.apache.org/jira/browse/FLINK-17657
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC, Table SQL / Client
>Affects Versions: 1.10.0, 1.10.1
>Reporter: lun zhang
>Priority: Major
> Fix For: 1.11.0
>
> Attachments: env.yaml, excetion.txt
>
>
> I use the SQL client to read a MySQL table, but I found I can't read a table
> containing a `BIGINT UNSIGNED` field. It throws:
>  Caused by: java.lang.ClassCastException: java.math.BigInteger cannot be cast
> to java.lang.Long
>  
> MySQL table:
>  
> create table tb
>  (
>  id BIGINT UNSIGNED auto_increment 
>  primary key,
>  cooper BIGINT(19) null ,
> user_sex VARCHAR(2) null 
> );
>  
> my env yaml is env.yaml .



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-17657) jdbc not support read BIGINT UNSIGNED field

2020-05-14 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17107228#comment-17107228
 ] 

JinxinTang commented on FLINK-17657:


 

Hi [~zhanglun], could you please provide detailed code to reproduce this
issue? It seems OK on my side; here is my code piece: [code
piece|https://github.com/TJX2014/flink/blob/master-flink17657-jdbc-bigint/flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/jdbc/ReadMysql.scala]

> jdbc not support read BIGINT UNSIGNED field
> ---
>
> Key: FLINK-17657
> URL: https://issues.apache.org/jira/browse/FLINK-17657
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC, Table SQL / Client
>Affects Versions: 1.10.0, 1.10.1
>Reporter: lun zhang
>Priority: Major
> Fix For: 1.11.0
>
> Attachments: env.yaml, excetion.txt
>
>
> I use the SQL client to read a MySQL table, but I found I can't read a table
> containing a `BIGINT UNSIGNED` field. It throws:
>  Caused by: java.lang.ClassCastException: java.math.BigInteger cannot be cast
> to java.lang.Long
>  
> MySQL table:
>  
> create table tb
>  (
>  id BIGINT UNSIGNED auto_increment 
>  primary key,
>  cooper BIGINT(19) null ,
> user_sex VARCHAR(2) null 
> );
>  
> my env yaml is env.yaml .



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-17657) jdbc not support read BIGINT UNSIGNED field

2020-05-13 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17106782#comment-17106782
 ] 

JinxinTang commented on FLINK-17657:


BIGINT UNSIGNED in MySQL has a bigger range than Long in Java; you could use
Decimal on the Java side to parse it instead.
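
A hedged JDBC-level sketch of that suggestion (URL, credentials, and table follow the report's schema; MySQL Connector/J is assumed, which returns java.math.BigInteger for BIGINT UNSIGNED):
{code:java}
import java.math.BigDecimal;
import java.math.BigInteger;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class UnsignedBigintRead {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                        "jdbc:mysql://localhost:3306/test", "user", "password");
                Statement stmt = conn.createStatement();
                ResultSet rs = stmt.executeQuery("SELECT id FROM tb")) {
            while (rs.next()) {
                // getObject yields BigInteger for BIGINT UNSIGNED; a cast to
                // Long fails, so go through BigDecimal (i.e. SQL DECIMAL).
                BigDecimal id = new BigDecimal((BigInteger) rs.getObject("id"));
                System.out.println(id);
            }
        }
    }
}
{code}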

> jdbc not support read BIGINT UNSIGNED field
> ---
>
> Key: FLINK-17657
> URL: https://issues.apache.org/jira/browse/FLINK-17657
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC, Table SQL / Client
>Affects Versions: 1.10.0, 1.10.1
>Reporter: lun zhang
>Priority: Major
> Fix For: 1.11.0
>
> Attachments: env.yaml, excetion.txt
>
>
> I use the SQL client to read a MySQL table, but I found I can't read a table
> containing a `BIGINT UNSIGNED` field. It throws:
>  Caused by: java.lang.ClassCastException: java.math.BigInteger cannot be cast
> to java.lang.Long
>  
> MySQL table:
>  
> create table tb
>  (
>  id BIGINT UNSIGNED auto_increment comment 'auto-increment primary key'
>  primary key,
>  cooper BIGINT(19) null ,
> user_sex VARCHAR(2) null 
> );
>  
> my env yaml is env.yaml .



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-16937) ParquetTableSource should generate correct isFilterPushedDown

2020-04-22 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17089369#comment-17089369
 ] 

JinxinTang commented on FLINK-16937:


Could you please provide a minimal complete code sample to reproduce the issue?

> ParquetTableSource should generate correct isFilterPushedDown
> -
>
> Key: FLINK-16937
> URL: https://issues.apache.org/jira/browse/FLINK-16937
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.9.2, 1.10.0
>Reporter: Jingsong Lee
>Priority: Major
>
>  
> {code:java}
> if (predicate != null) {
>this.isFilterPushedDown = true;
> }
> {code}
> If none of the filters can be converted to a Parquet filter, the predicate
> will be null; this leads to a false isFilterPushedDown, which is wrong.
>  
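
A hedged, self-contained sketch of the flag semantics the description argues for (names are illustrative, not the actual ParquetTableSource code): the flag should record that push-down was attempted, independent of whether any expression survived conversion into a Parquet filter.
{code:java}
import java.util.List;

class FilterPushDownFlagSketch {
    private Object predicate;            // stands in for Parquet's FilterPredicate
    private boolean isFilterPushedDown;

    FilterPushDownFlagSketch applyPredicate(List<String> expressions) {
        FilterPushDownFlagSketch copy = new FilterPushDownFlagSketch();
        copy.predicate = convert(expressions); // may legitimately stay null
        copy.isFilterPushedDown = true;        // not conditioned on predicate != null
        return copy;
    }

    private Object convert(List<String> expressions) {
        return null; // assume nothing is convertible, as in the report
    }

    boolean isFilterPushedDown() {
        return isFilterPushedDown;
    }
}
{code}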



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-17190) SQL client does not support views that reference a table from DDL

2020-04-20 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17087479#comment-17087479
 ] 

JinxinTang commented on FLINK-17190:


Could you please provide version info, and can this be reproduced in a local
environment?

> SQL client does not support views that reference a table from DDL
> -
>
> Key: FLINK-17190
> URL: https://issues.apache.org/jira/browse/FLINK-17190
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Client
>Reporter: Timo Walther
>Priority: Major
>
> It seems to be a classloading issue when the view references a DDL table.
> {code}
> CREATE TABLE PROD_LINEITEM (
>   L_ORDERKEY   INTEGER,
>   L_PARTKEYINTEGER,
>   L_SUPPKEYINTEGER,
>   L_LINENUMBER INTEGER,
>   L_QUANTITY   DOUBLE,
>   L_EXTENDEDPRICE  DOUBLE,
>   L_DISCOUNT   DOUBLE,
>   L_TAXDOUBLE,
>   L_CURRENCY   STRING,
>   L_RETURNFLAG STRING,
>   L_LINESTATUS STRING,
>   L_ORDERTIME  TIMESTAMP(3),
>   L_SHIPINSTRUCT   STRING,
>   L_SHIPMODE   STRING,
>   L_COMMENTSTRING,
>   WATERMARK FOR L_ORDERTIME AS L_ORDERTIME - INTERVAL '5' MINUTE,
>   L_PROCTIME   AS PROCTIME()
> ) WITH (
>   'connector.type' = 'kafka',
>   'connector.version' = 'universal',
>   'connector.topic' = 'Lineitem',
>   'connector.properties.zookeeper.connect' = 'not-needed',
>   'connector.properties.bootstrap.servers' = 'kafka:9092',
>   'connector.startup-mode' = 'earliest-offset',
>   'format.type' = 'csv',
>   'format.field-delimiter' = '|'
> );
> CREATE VIEW v AS SELECT * FROM PROD_LINEITEM;
> {code}
> Result:
> {code}
> Exception in thread "main" org.apache.flink.table.client.SqlClientException: 
> Unexpected exception. This is a bug. Please consider filing an issue.
>   at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
> Caused by: org.apache.flink.table.api.TableException: 
> findAndCreateTableSource failed.
>   at 
> org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:55)
>   at 
> org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:92)
>   at 
> org.apache.flink.table.planner.plan.schema.CatalogSourceTable.findAndCreateTableSource(CatalogSourceTable.scala:156)
>   at 
> org.apache.flink.table.planner.plan.schema.CatalogSourceTable.tableSource$lzycompute(CatalogSourceTable.scala:65)
>   at 
> org.apache.flink.table.planner.plan.schema.CatalogSourceTable.tableSource(CatalogSourceTable.scala:65)
>   at 
> org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.scala:76)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3328)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2357)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2005)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:646)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:627)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3181)
>   at 
> org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:563)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:148)
>   at 
> org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:135)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:522)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:436)
>   at 
> org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:154)
>   at 
> org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:66)
>   at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:464)
>   at 
> org.apache.flink.table.client.gateway.local.LocalExecutor.addView(LocalExecutor.java:300)
>   at 
> org.apache.flink.table.client.cli.CliClient.callCreateView(CliClient.java:579)
>   at 
> org.apache.flink.table.client.cli.CliClient.callCommand(CliClient.java:308)
>   at java.util.Optional.ifPresent(Optional.java:159)
>   at org.apache.flink.table.client.cli.CliClient.open(CliClient.java:200)
>   at 

[jira] [Commented] (FLINK-16933) No suitable driver found for jdbc:xxx

2020-04-19 Thread JinxinTang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16933?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17087007#comment-17087007
 ] 

JinxinTang commented on FLINK-16933:


Could you please provide a simple way to reproduce the problem quickly, or
attach a picture describing the details?

> No suitable driver found for jdbc:xxx
> -
>
> Key: FLINK-16933
> URL: https://issues.apache.org/jira/browse/FLINK-16933
> Project: Flink
>  Issue Type: Bug
>  Components: API / DataStream
>Affects Versions: 1.9.0
> Environment: CentOS Linux release 7.7.1908 (Core)
>Reporter: morgan
>Priority: Major
>
> For example, there are two streaming jobs, A and B.
> While A is running, I submit B; at this point an exception is thrown: Caused
> by: java.sql.SQLException: No suitable driver found for
> jdbc:clickhouse://host:8123/data. After stopping Flink with ./stop-cluster.sh
> and restarting it with ./start-cluster.sh, submitting B while Flink is not
> running any task succeeds. But if you execute B multiple times, the same "No
> suitable driver found" exception is thrown again. Both A and B depend on
> clickhouse4j in the common module. A always runs normally, but B can only be
> executed right after a restart, while Flink holds no other tasks, which seems
> strange. I checked flink/lib/ and clickhouse4j-1.4.1.jar has been added. I
> tried packaging clickhouse4j-1.4.1.jar into the job, but that did not solve
> the problem. I also tried adding it to jre/lib/ext and to the classpath; the
> problem was not solved. However, when I ran it locally in IDEA, no exception
> was thrown.
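
If the root cause is the usual DriverManager visibility problem across classloaders, a common workaround is to instantiate the driver through the user-code classloader and connect without going through DriverManager. A hedged sketch (the clickhouse4j driver class name below is an assumption; check the jar you actually ship):
{code:java}
import java.sql.Connection;
import java.sql.Driver;
import java.util.Properties;

public class ExplicitDriverConnect {
    public static Connection connect(String url, String user, String password)
            throws Exception {
        // Load the driver via the current (user-code) classloader instead of
        // relying on DriverManager's caller-visibility checks.
        Driver driver = (Driver) Class.forName(
                        "cc.blynk.clickhouse.ClickHouseDriver", // assumed driver class
                        true, Thread.currentThread().getContextClassLoader())
                .getDeclaredConstructor().newInstance();
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        return driver.connect(url, props); // bypasses DriverManager entirely
    }
}
{code}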



--
This message was sent by Atlassian Jira
(v8.3.4#803005)