[ https://issues.apache.org/jira/browse/FLINK-17459?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17100442#comment-17100442 ]

Jark Wu commented on FLINK-17459:
---------------------------------

Hi [~michael ran], here is an example:


{code:java}
// Register the source stream as a table.
tableEnv.createTemporaryView("test", streamSource);

// Build an upsert sink; setFlushIntervalMills(3000) flushes buffered rows every
// 3 seconds even if the batch-size threshold has not been reached yet.
JDBCUpsertTableSink sink = JDBCUpsertTableSink.builder()
                .setOptions(options)
                .setTableSchema(schema)
                .setFlushIntervalMills(3000)
                .build();
tableEnv.registerTableSink("jdbc_sink", sink);

tableEnv.sqlUpdate("insert into jdbc_sink select order_id, user_id, status from test");
tableEnv.execute("jdbc_sink_job"); // execute() takes a job name in Flink 1.10
{code}
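
The {{options}} and {{schema}} variables are not defined in the snippet above; a minimal sketch of how they might be built with the Flink 1.10 JDBC connector follows (the URL, driver, credentials, table name, and field types are placeholder assumptions, not part of the original example):

{code:java}
import org.apache.flink.api.java.io.jdbc.JDBCOptions;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.TableSchema;

// Connection settings for the target table (placeholder values).
JDBCOptions options = JDBCOptions.builder()
        .setDBUrl("jdbc:mysql://localhost:3306/mydb")
        .setDriverName("com.mysql.jdbc.Driver")
        .setUsername("user")
        .setPassword("password")
        .setTableName("orders")
        .build();

// The schema must match the columns selected in the INSERT statement.
TableSchema schema = TableSchema.builder()
        .field("order_id", DataTypes.STRING())
        .field("user_id", DataTypes.BIGINT())
        .field("status", DataTypes.STRING())
        .build();
{code}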


> JDBCAppendTableSink does not support flush by flushIntervalMills
> ----------------------------------------------------------------
>
>                 Key: FLINK-17459
>                 URL: https://issues.apache.org/jira/browse/FLINK-17459
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / JDBC
>    Affects Versions: 1.10.0
>            Reporter: ranqiqiang
>            Priority: Major
>
> {{JDBCAppendTableSink only supports flushing via "JDBCAppendTableSinkBuilder#batchSize"; it does not support a time-based flush like "JDBCUpsertTableSink#flushIntervalMills".}}
>  
> {{If batchSize=5000 and my data has 5000*N+1 rows, the last record is never appended!}}
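
For reference, the append-sink usage the issue describes could be sketched as below; the driver, URL, query, and types are illustrative assumptions, and the only flush control exposed by the builder is the row-count threshold:

{code:java}
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.io.jdbc.JDBCAppendTableSink;

// Only a row-count threshold can be configured here; without a time-based
// flush option, a trailing partial batch waits until the next 5000 records
// arrive (or the sink is closed), which is the behaviour reported above.
JDBCAppendTableSink appendSink = JDBCAppendTableSink.builder()
        .setDrivername("com.mysql.jdbc.Driver")
        .setDBUrl("jdbc:mysql://localhost:3306/mydb")
        .setQuery("INSERT INTO orders (order_id, user_id, status) VALUES (?, ?, ?)")
        .setParameterTypes(Types.STRING, Types.LONG, Types.STRING)
        .setBatchSize(5000)
        .build();
{code}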


