wuchong commented on a change in pull request #11490: [FLINK-15579][table-planner-blink] UpsertStreamTableSink should work on batch mode
URL: https://github.com/apache/flink/pull/11490#discussion_r396487243
 
 

 ##########
 File path: flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/nodes/physical/batch/BatchExecSink.scala
 ##########
 @@ -81,13 +82,26 @@ class BatchExecSink[T](
   override protected def translateToPlanInternal(
       planner: BatchPlanner): Transformation[Any] = {
     val resultTransformation = sink match {
-      case _: RetractStreamTableSink[T] | _: UpsertStreamTableSink[T] =>
-        throw new TableException("RetractStreamTableSink and UpsertStreamTableSink is not" +
-          " supported in Batch environment.")
-
       case streamTableSink: StreamTableSink[T] =>
-        // we can insert the bounded DataStream into a StreamTableSink
-        val transformation = translateToTransformation(withChangeFlag = false, planner)
+        val transformation = streamTableSink match {
+          case _: RetractStreamTableSink[T] =>
+            translateToTransformation(withChangeFlag = true, planner)
+
+          case upsertSink: UpsertStreamTableSink[T] =>
+            UpdatingPlanChecker.getUniqueKeyForUpsertSink(this, planner, upsertSink) match {
+              case Some(keys) =>
+                upsertSink.setIsAppendOnly(false)
+                upsertSink.setKeyFields(keys)
+              case None =>
+                upsertSink.setIsAppendOnly(true)
+                upsertSink.setKeyFields(null)
 
 Review comment:
   I think in batch mode this should always be append-only (`isAppendOnly = true`), and it may still have key fields. Some connectors can apply an optimization based on this, e.g. MySQL can use INSERT INTO rather than INSERT ... ON DUPLICATE KEY UPDATE.
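
   To illustrate the kind of optimization meant here, below is a minimal Scala sketch of how a connector might use the append-only flag and key fields that the planner sets on an `UpsertStreamTableSink`. `MySqlLikeSink`, its setters, and `buildStatement` are hypothetical names for illustration only, not actual Flink or MySQL-connector APIs.

   ```scala
   // Hypothetical sketch: a connector picks a cheaper write path when the
   // planner tells it the input is append-only. Not a real Flink class.
   class MySqlLikeSink {
     private var isAppendOnly: Boolean = true
     private var keyFields: Array[String] = _

     // Called by the planner (cf. upsertSink.setIsAppendOnly in the diff above).
     def setIsAppendOnly(appendOnly: Boolean): Unit =
       isAppendOnly = appendOnly

     // Called by the planner (cf. upsertSink.setKeyFields in the diff above).
     def setKeyFields(keys: Array[String]): Unit =
       keyFields = keys

     // Choose the SQL statement based on what the planner provided.
     def buildStatement(table: String, columns: Seq[String]): String = {
       val cols = columns.mkString(", ")
       val placeholders = columns.map(_ => "?").mkString(", ")
       if (isAppendOnly || keyFields == null || keyFields.isEmpty) {
         // Append-only input: a plain INSERT suffices, no conflict handling.
         s"INSERT INTO $table ($cols) VALUES ($placeholders)"
       } else {
         // Updating input with known keys: fall back to an upsert statement.
         val updates = columns.map(c => s"$c = VALUES($c)").mkString(", ")
         s"INSERT INTO $table ($cols) VALUES ($placeholders) " +
           s"ON DUPLICATE KEY UPDATE $updates"
       }
     }
   }
   ```

   The design point is that the planner-provided flags are hints: a connector that ignores them still works, but one that honors them can skip conflict handling entirely for append-only batch results.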

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
