[ 
https://issues.apache.org/jira/browse/FLINK-27368?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jark Wu updated FLINK-27368:
----------------------------
    Description: 
{code:sql}
Flink SQL> select
>                     cast(' 1 ' as tinyint),
>                     cast(' 1 ' as smallint),
>                     cast(' 1 ' as int),
>                     cast(' 1 ' as bigint),
>                     cast(' 1 ' as float),
>                     cast(' 1 ' as double);
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
| op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                         EXPR$4 |                         EXPR$5 |
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
[ERROR] Could not execute SQL statement. Reason:
java.lang.NumberFormatException: For input string: ' 1 '. Invalid character found.
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.numberFormatExceptionFor(BinaryStringDataUtil.java:585)
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.toInt(BinaryStringDataUtil.java:518)
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.toByte(BinaryStringDataUtil.java:568)
        at StreamExecCalc$392.processElement(Unknown Source)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
        at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
        at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
{code}

Setting the CAST behaviour to legacy yields a null result instead, which the NOT NULL sink enforcer then rejects:

{code}
Flink SQL> set table.exec.legacy-cast-behaviour=enabled;
[INFO] Session property has been set.

Flink SQL> select
>                     cast(' 1 ' as tinyint),
>                     cast(' 1 ' as smallint),
>                     cast(' 1 ' as int),
>                     cast(' 1 ' as bigint),
>                     cast(' 1 ' as float),
>                     cast(' 1 ' as double);
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
| op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                         EXPR$4 |                         EXPR$5 |
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.api.TableException: Column 'EXPR$0' is NOT NULL, however, a null value is being written into it. You can set job configuration 'table.exec.sink.not-null-enforcer'='DROP' to suppress this exception and drop such records silently.
        at org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processNotNullConstraint(ConstraintEnforcer.java:261)
        at org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processElement(ConstraintEnforcer.java:241)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at StreamExecCalc$591.processElement_split1(Unknown Source)
        at StreamExecCalc$591.processElement(Unknown Source)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
        at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
        at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
{code}
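The legacy path can be illustrated with a minimal Java sketch. This is an assumption about the semantics, not Flink's actual code path: under legacy behaviour a failed parse becomes null instead of an exception, and that null is what the NOT NULL sink enforcer rejects.

```java
// Minimal sketch of the assumed legacy-cast semantics: a string that fails
// to parse becomes null rather than raising an error. The helper
// legacyCastToInt is hypothetical, not a Flink API.
public class LegacyCastSketch {
    static Integer legacyCastToInt(String s) {
        try {
            // Integer.parseInt rejects " 1 " because of the surrounding spaces.
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            return null; // legacy behaviour: swallow the error, emit null
        }
    }

    public static void main(String[] args) {
        System.out.println(legacyCastToInt(" 1 ")); // prints null
        System.out.println(legacyCastToInt("1"));   // prints 1
    }
}
```

That null then hits the sink's NOT NULL constraint, producing the TableException shown in the log.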


In 1.14, this query returned {{[1, 1, 1, 1, 1.0, 1.0]}}.

In Postgres:
{code}
postgres=# select cast(' 1 ' as int), cast(' 1 ' as bigint), cast(' 1 ' as float);
 int4 | int8 | float8
------+------+--------
    1 |    1 |      1
(1 row)
{code}
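For comparison, the behaviour both 1.14 and Postgres exhibit amounts to trimming whitespace before parsing. A minimal Java sketch (trim-then-parse is an assumption about the intended semantics, not Flink's implementation):

```java
// Sketch: a whitespace-trimming numeric cast, matching the Postgres result
// above. castToBigint is a hypothetical helper, not a Flink API.
public class TrimmingCastSketch {
    static long castToBigint(String s) {
        // Long.parseLong(" 1 ") throws; trimming first makes it succeed.
        return Long.parseLong(s.trim());
    }

    public static void main(String[] args) {
        System.out.println(castToBigint(" 1 ")); // prints 1
    }
}
```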

  was:
{code:sql}
Flink SQL> select
>                     cast(' 1 ' as tinyint),
>                     cast(' 1 ' as smallint),
>                     cast(' 1 ' as int),
>                     cast(' 1 ' as bigint),
>                     cast(' 1 ' as float),
>                     cast(' 1 ' as double);
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
| op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                         EXPR$4 |                         EXPR$5 |
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
[ERROR] Could not execute SQL statement. Reason:
java.lang.NumberFormatException: For input string: ' 1 '. Invalid character found.
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.numberFormatExceptionFor(BinaryStringDataUtil.java:585)
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.toInt(BinaryStringDataUtil.java:518)
        at org.apache.flink.table.data.binary.BinaryStringDataUtil.toByte(BinaryStringDataUtil.java:568)
        at StreamExecCalc$392.processElement(Unknown Source)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
        at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
        at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
{code}

Setting the CAST behaviour to legacy yields a null result instead, which the NOT NULL sink enforcer then rejects:

{code}
Flink SQL> set table.exec.legacy-cast-behaviour=enabled;
[INFO] Session property has been set.

Flink SQL> select
>                     cast(' 1 ' as tinyint),
>                     cast(' 1 ' as smallint),
>                     cast(' 1 ' as int),
>                     cast(' 1 ' as bigint),
>                     cast(' 1 ' as float),
>                     cast(' 1 ' as double);
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
| op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                         EXPR$4 |                         EXPR$5 |
+----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.api.TableException: Column 'EXPR$0' is NOT NULL, however, a null value is being written into it. You can set job configuration 'table.exec.sink.not-null-enforcer'='DROP' to suppress this exception and drop such records silently.
        at org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processNotNullConstraint(ConstraintEnforcer.java:261)
        at org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processElement(ConstraintEnforcer.java:241)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at StreamExecCalc$591.processElement_split1(Unknown Source)
        at StreamExecCalc$591.processElement(Unknown Source)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
        at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
        at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
        at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
        at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
        at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
{code}


In 1.14 the result should be {{[null, null, null, null, 1.0, 1.0]}}. 

In Postgres:
{code}
postgres=# select cast(' 1 ' as int), cast(' 1 ' as bigint), cast(' 1 ' as float);
 int4 | int8 | float8
------+------+--------
    1 |    1 |      1
(1 row)
{code}


> CAST(' 1 ' as BIGINT) returns wrong result
> ------------------------------------------
>
>                 Key: FLINK-27368
>                 URL: https://issues.apache.org/jira/browse/FLINK-27368
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Planner, Table SQL / Runtime
>    Affects Versions: 1.15.0
>            Reporter: Jark Wu
>            Priority: Critical
>             Fix For: 1.15.0
>
>
> {code:sql}
> Flink SQL> select
> >                     cast(' 1 ' as tinyint),
> >                     cast(' 1 ' as smallint),
> >                     cast(' 1 ' as int),
> >                     cast(' 1 ' as bigint),
> >                     cast(' 1 ' as float),
> >                     cast(' 1 ' as double);
> +----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
> | op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                 
>         EXPR$4 |                         EXPR$5 |
> +----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
> [ERROR] Could not execute SQL statement. Reason:
> java.lang.NumberFormatException: For input string: ' 1 '. Invalid character 
> found.
>       at 
> org.apache.flink.table.data.binary.BinaryStringDataUtil.numberFormatExceptionFor(BinaryStringDataUtil.java:585)
>       at 
> org.apache.flink.table.data.binary.BinaryStringDataUtil.toInt(BinaryStringDataUtil.java:518)
>       at 
> org.apache.flink.table.data.binary.BinaryStringDataUtil.toByte(BinaryStringDataUtil.java:568)
>       at StreamExecCalc$392.processElement(Unknown Source)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
>       at 
> org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
>       at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
>       at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
>       at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
> {code}
> Setting CAST behavior to legacy but got null result :
> {code}
> Flink SQL> set table.exec.legacy-cast-behaviour=enabled;
> [INFO] Session property has been set.
> Flink SQL> select
> >                     cast(' 1 ' as tinyint),
> >                     cast(' 1 ' as smallint),
> >                     cast(' 1 ' as int),
> >                     cast(' 1 ' as bigint),
> >                     cast(' 1 ' as float),
> >                     cast(' 1 ' as double);
> +----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
> | op | EXPR$0 | EXPR$1 |      EXPR$2 |               EXPR$3 |                 
>         EXPR$4 |                         EXPR$5 |
> +----+--------+--------+-------------+----------------------+--------------------------------+--------------------------------+
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.flink.table.api.TableException: Column 'EXPR$0' is NOT NULL, 
> however, a null value is being written into it. You can set job configuration 
> 'table.exec.sink.not-null-enforcer'='DROP' to suppress this exception and 
> drop such records silently.
>       at 
> org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processNotNullConstraint(ConstraintEnforcer.java:261)
>       at 
> org.apache.flink.table.runtime.operators.sink.ConstraintEnforcer.processElement(ConstraintEnforcer.java:241)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
>       at StreamExecCalc$591.processElement_split1(Unknown Source)
>       at StreamExecCalc$591.processElement(Unknown Source)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
>       at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
>       at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
>       at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
>       at 
> org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:92)
>       at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
>       at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
>       at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:332)
> {code}
> In 1.14 the result should be {{[1, 1, 1, 1, 1.0, 1.0]}}. 
> In Postgres:
> {code}
> postgres=# select cast(' 1 ' as int), cast(' 1 ' as bigint), cast(' 1 ' as 
> float);
>  int4 | int8 | float8
> ------+------+--------
>     1 |    1 |      1
> (1 row)
> {code}



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
