[ 
https://issues.apache.org/jira/browse/FLINK-27544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Martijn Visser closed FLINK-27544.
----------------------------------
    Fix Version/s: 1.16.0
         Assignee: Chengkai Yang
       Resolution: Fixed

Fixed in master: 041b24b2bcd251023b87b692a671dfc42585c858

> Example code in 'Structure of Table API and SQL Programs' is out of date and 
> cannot run
> ---------------------------------------------------------------------------------------
>
>                 Key: FLINK-27544
>                 URL: https://issues.apache.org/jira/browse/FLINK-27544
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.14.0, 1.15.0, 1.14.2, 1.14.3, 1.14.4
>            Reporter: Chengkai Yang
>            Assignee: Chengkai Yang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.16.0
>
>
> The example code in [Structure of Table API and SQL 
> Programs|https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/common/#structure-of-table-api-and-sql-programs]
>  of ['Concepts & Common 
> API'|https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/common/]
>  is out of date, and when users run this piece of code, they get the 
> following exception:
> {code:java}
> Exception in thread "main" org.apache.flink.table.api.ValidationException: 
> Unable to create a sink for writing table 
> 'default_catalog.default_database.SinkTable'.
> Table options are:
> 'connector'='blackhole'
> 'rows-per-second'='1'
>       at 
> org.apache.flink.table.factories.FactoryUtil.createDynamicTableSink(FactoryUtil.java:262)
>       at 
> org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:421)
>       at 
> org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:222)
>       at 
> org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:178)
>       at 
> scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
>       at scala.collection.Iterator.foreach(Iterator.scala:937)
>       at scala.collection.Iterator.foreach$(Iterator.scala:937)
>       at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
>       at scala.collection.IterableLike.foreach(IterableLike.scala:70)
>       at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
>       at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>       at scala.collection.TraversableLike.map(TraversableLike.scala:233)
>       at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
>       at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>       at 
> org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:178)
>       at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1656)
>       at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:782)
>       at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:861)
>       at 
> org.apache.flink.table.api.internal.TablePipelineImpl.execute(TablePipelineImpl.java:56)
>       at com.yck.TestTableAPI.main(TestTableAPI.java:43)
> Caused by: org.apache.flink.table.api.ValidationException: Unsupported 
> options found for 'blackhole'.
> Unsupported options:
> rows-per-second
> Supported options:
> connector
> property-version
>       at 
> org.apache.flink.table.factories.FactoryUtil.validateUnconsumedKeys(FactoryUtil.java:624)
>       at 
> org.apache.flink.table.factories.FactoryUtil$FactoryHelper.validate(FactoryUtil.java:914)
>       at 
> org.apache.flink.table.factories.FactoryUtil$TableFactoryHelper.validate(FactoryUtil.java:978)
>       at 
> org.apache.flink.connector.blackhole.table.BlackHoleTableSinkFactory.createDynamicTableSink(BlackHoleTableSinkFactory.java:64)
>       at 
> org.apache.flink.table.factories.FactoryUtil.createDynamicTableSink(FactoryUtil.java:259)
>       ... 19 more
> {code}
> I think this mistake would drive users crazy when they first try the Table 
> API & Flink SQL, since this is the very first code they see. The exception 
> occurs because the LIKE clause copies the 'rows-per-second' option from 
> SourceTable into SinkTable, and the blackhole connector does not support it.
> Overall, this code is outdated in two places:
> 1. The query that creates the temporary sink table should be 
> {code:sql}
> CREATE TEMPORARY TABLE SinkTable WITH ('connector' = 'blackhole') LIKE 
> SourceTable (EXCLUDING OPTIONS) 
> {code}
> instead of 
> {code:sql}
> CREATE TEMPORARY TABLE SinkTable WITH ('connector' = 'blackhole') LIKE 
> SourceTable 
> {code}
> which is missing the 
> {code:sql}
> (EXCLUDING OPTIONS) 
> {code}
> part of the LIKE clause.
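> For context, here is a minimal sketch of how the corrected DDL would be 
> executed from the Table API (assuming a TableEnvironment named tableEnv, as 
> in the documentation example):
> {code:java}
> // Sketch: executing the corrected sink DDL from the Table API.
> // (EXCLUDING OPTIONS) prevents the datagen-specific 'rows-per-second'
> // option from being copied into the blackhole sink definition.
> tableEnv.executeSql(
>         "CREATE TEMPORARY TABLE SinkTable "
>                 + "WITH ('connector' = 'blackhole') "
>                 + "LIKE SourceTable (EXCLUDING OPTIONS)");
> {code}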
> 2. The code that creates the source table should be 
> {code:java}
> tableEnv.createTemporaryTable("SourceTable", 
> TableDescriptor.forConnector("datagen")
>     .schema(Schema.newBuilder()
>       .column("f0", DataTypes.STRING())
>       .build())
>     .option(DataGenConnectorOptions.ROWS_PER_SECOND, 1L)
>     .build());
> {code}
> instead of  
> {code:java}
> tableEnv.createTemporaryTable("SourceTable", 
> TableDescriptor.forConnector("datagen")
>     .schema(Schema.newBuilder()
>       .column("f0", DataTypes.STRING())
>       .build())
>     .option(DataGenOptions.ROWS_PER_SECOND, 100)
>     .build());
> {code}
> since the class DataGenOptions was replaced by DataGenConnectorOptions in 
> [this 
> commit|https://github.com/apache/flink/pull/16334/commits/865e71fade2890c584d2aaf28af366249f116f2d#diff-a2b4ad2b1792b147efc81895f0dc1d0d092fbce56f563d1b37b73a2619f29a13].
> Note that DataGenConnectorOptions.ROWS_PER_SECOND is a ConfigOption<Long>, so 
> the value must be a long literal such as 1L rather than the int 100.
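> Putting both fixes together, a minimal self-contained sketch of the 
> corrected example (the class name TableApiExample is illustrative; the 
> Flink API calls match the 1.15 Table API):
> {code:java}
> import org.apache.flink.connector.datagen.table.DataGenConnectorOptions;
> import org.apache.flink.table.api.DataTypes;
> import org.apache.flink.table.api.EnvironmentSettings;
> import org.apache.flink.table.api.Schema;
> import org.apache.flink.table.api.Table;
> import org.apache.flink.table.api.TableDescriptor;
> import org.apache.flink.table.api.TableEnvironment;
> 
> public class TableApiExample {
>     public static void main(String[] args) {
>         TableEnvironment tableEnv =
>                 TableEnvironment.create(EnvironmentSettings.inStreamingMode());
> 
>         // Fix 2: DataGenConnectorOptions replaces the removed DataGenOptions.
>         tableEnv.createTemporaryTable("SourceTable",
>                 TableDescriptor.forConnector("datagen")
>                         .schema(Schema.newBuilder()
>                                 .column("f0", DataTypes.STRING())
>                                 .build())
>                         .option(DataGenConnectorOptions.ROWS_PER_SECOND, 1L)
>                         .build());
> 
>         // Fix 1: (EXCLUDING OPTIONS) keeps 'rows-per-second' out of the sink.
>         tableEnv.executeSql(
>                 "CREATE TEMPORARY TABLE SinkTable "
>                         + "WITH ('connector' = 'blackhole') "
>                         + "LIKE SourceTable (EXCLUDING OPTIONS)");
> 
>         // Insert the generated rows into the blackhole sink.
>         Table table = tableEnv.from("SourceTable");
>         table.insertInto("SinkTable").execute();
>     }
> }
> {code}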
> The test code is in my GitHub repository: [version 
> 1.15|https://github.com/ChengkaiYang2022/flink-test/blob/main/flink115/src/main/java/com/yck/TestTableAPI.java#L22]
>  and [version 
> 1.14|https://github.com/ChengkaiYang2022/flink-test/blob/main/flink114/src/main/java/com/yck/TestTableAPI.java].
> The affected versions are 1.14 and 1.15.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
