slinkydeveloper commented on a change in pull request #18804:
URL: https://github.com/apache/flink/pull/18804#discussion_r808980741



##########
File path: docs/content/docs/dev/table/tableApi.md
##########
@@ -1454,21 +1454,23 @@ result3 = 
table.order_by(table.a.asc).offset(10).fetch(5)
 
 {{< label Batch >}} {{< label Streaming >}}
 
-Similar to the `INSERT INTO` clause in a SQL query, the method performs an 
insertion into a registered output table. The `executeInsert()` method will 
immediately submit a Flink job which execute the insert operation.
+Similar to the `INSERT INTO` clause in a SQL query, the method performs an 
insertion into a registered output table. 
+The `insertInto()` method will translate the `INSERT INTO` to a 
`TablePipeline`. 

Review comment:
The reason I used _translate_ here is that I initially wanted to use 
_compile_, which better captures the idea of this method. I personally don't 
like _convert_ here: to me, _convert_ suggests converting data from one data 
structure to another, whereas this method does something more involved than a 
plain data conversion, namely parsing, planning, optimizing, etc. Yes, in the 
end it is still a conversion, but the word is a bit too broad here. That said, 
I don't have a strong opinion and I'm open to suggestions.
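
For context on why the wording matters: the docs change distinguishes the eager `executeInsert()` (submits a job immediately) from the deferred `insertInto()` (translates the `INSERT INTO` into a `TablePipeline` that is executed later). Below is a minimal Python sketch of that deferred-pipeline pattern, not actual Flink/PyFlink code; the `Table` and `TablePipeline` classes here are hypothetical stand-ins for illustration only.

```python
class TablePipeline:
    """Stand-in for a planned INSERT INTO, analogous in spirit to Flink's TablePipeline."""

    def __init__(self, source, sink):
        self.source = source
        self.sink = sink

    def explain(self):
        # In Flink, explain() would show the optimized plan; here just a summary string.
        return f"INSERT INTO {self.sink} FROM {self.source}"

    def execute(self):
        # Submission happens only here, not when the pipeline was built.
        return f"job submitted: {self.explain()}"


class Table:
    """Hypothetical table handle illustrating eager vs. deferred insertion."""

    def __init__(self, name):
        self.name = name

    def insert_into(self, sink):
        # Deferred: translate the insert into a pipeline object, do not submit yet.
        return TablePipeline(self.name, sink)

    def execute_insert(self, sink):
        # Eager: build the pipeline and submit it immediately.
        return self.insert_into(sink).execute()


pipeline = Table("Orders").insert_into("OutOrders")
print(pipeline.explain())  # inspect the planned insert before running it
print(pipeline.execute())
```

The deferred form is what makes a word like _translate_ (or _compile_) apt: calling `insert_into` produces a reusable, inspectable object rather than triggering execution.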




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]