simenliuxing created FLINK-24536:
------------------------------------

             Summary: Flink SQL does not support the '!=' operator in a WHERE condition
                 Key: FLINK-24536
                 URL: https://issues.apache.org/jira/browse/FLINK-24536
             Project: Flink
          Issue Type: Improvement
          Components: Table SQL / Planner
    Affects Versions: 1.14.0
            Reporter: simenliuxing
             Fix For: 1.14.1


SQL:

 
{code:java}
CREATE TABLE source
(
 id INT,
 name STRING,
 money DECIMAL(32, 2),
 dateone timestamp,
 age bigint,
 datethree timestamp,
 datesix timestamp(6),
 datenigth timestamp(9),
 dtdate date,
 dttime time
) WITH (
 'connector' = 'datagen'
 ,'rows-per-second' = '1'
 );

CREATE TABLE sink
(
 id bigint,
 name STRING
) WITH (
 'connector' = 'print'
 );

insert into sink
select sum(id) as id, name
from source
where name != 'aa'
group by name;
{code}
 

exception:

 
{code:java}
Caused by: org.apache.calcite.sql.parser.SqlParseException: Bang equal '!=' is not allowed under the current SQL conformance level
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.convertException(FlinkSqlParserImpl.java:462)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.normalizeException(FlinkSqlParserImpl.java:225)
	at org.apache.calcite.sql.parser.SqlParser.handleException(SqlParser.java:140)
	at org.apache.calcite.sql.parser.SqlParser.parseQuery(SqlParser.java:155)
	at org.apache.calcite.sql.parser.SqlParser.parseStmt(SqlParser.java:180)
	at org.apache.flink.table.planner.parse.CalciteParser.parse(CalciteParser.java:54)
	... 22 more
Caused by: org.apache.calcite.runtime.CalciteException: Bang equal '!=' is not allowed under the current SQL conformance level
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:467)
	at org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:560)
	at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:883)
	at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:868)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.BinaryRowOperator(FlinkSqlParserImpl.java:31759)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2(FlinkSqlParserImpl.java:19802)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression(FlinkSqlParserImpl.java:19553)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.WhereOpt(FlinkSqlParserImpl.java:14370)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.SqlSelect(FlinkSqlParserImpl.java:7836)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.LeafQuery(FlinkSqlParserImpl.java:704)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.LeafQueryOrExpr(FlinkSqlParserImpl.java:19536)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.QueryOrExpr(FlinkSqlParserImpl.java:18982)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.OrderedQueryOrExpr(FlinkSqlParserImpl.java:578)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.RichSqlInsert(FlinkSqlParserImpl.java:5596)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.SqlStmt(FlinkSqlParserImpl.java:3404)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.SqlStmtEof(FlinkSqlParserImpl.java:3980)
	at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.parseSqlStmtEof(FlinkSqlParserImpl.java:273)
	at org.apache.calcite.sql.parser.SqlParser.parseQuery(SqlParser.java:153)
	... 24 more{code}
 

The query parses correctly when I use the equivalent standard syntax instead:

where name <> 'aa'

Why is the '!=' syntax not supported? Will it be supported in a later release?
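For reference, here is the full insert statement rewritten with the standard inequality operator, which is the only change needed to make the example above parse under the current conformance level:

{code:java}
-- Same query as in the reproduction above; only the comparison
-- operator differs ('<>' instead of '!='), and it parses successfully.
insert into sink
select sum(id) as id, name
from source
where name <> 'aa'
group by name;
{code}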



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
