autophagy commented on code in PR #26167:
URL: https://github.com/apache/flink/pull/26167#discussion_r1964230526
##########
flink-python/pyflink/table/table.py:
##########
@@ -1077,6 +1078,91 @@ def explain(self, *extra_details: ExplainDetail) -> str:
         j_extra_details = to_j_explain_detail_arr(extra_details)
         return self._j_table.explain(TEXT, j_extra_details)
+    def insert_into(
+        self, table_path_or_descriptor: Union[str, TableDescriptor], overwrite: bool = False
+    ) -> TablePipeline:
+        """
+        When ``target_path_or_descriptor`` is a table path:
+
+        Declares that the pipeline defined by the given :class:`Table` (backed by a
+        DynamicTableSink) should be written to a table that was registered under the specified
+        path.
+
+        See the documentation of
+        :func:`pyflink.table.table_environment.TableEnvironment.use_database` or
+        :func:`pyflink.table.table_environment.TableEnvironment.use_catalog` for the rules on
+        the path resolution.
+
+        Example:
+        ::
+
+            >>> table = table_env.sql_query("SELECT * FROM MyTable")
+            >>> table_pipeline = table.insert_into_table_path("MySinkTable", True)
Review Comment:
Thanks for the catch - I had initially implemented `insert_into` as two
separate functions (`insert_into_table_path` and
`insert_into_table_descriptor`), but missed this doc example when refactoring
them into the single function `insert_into`. Will fix and add examples of both.
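For reference, the corrected docstring could sketch both call forms roughly as below. This is only a sketch against the `insert_into(table_path_or_descriptor, overwrite)` signature shown in the diff; it assumes a running `table_env` (a PyFlink `TableEnvironment`), and the table names and the `blackhole` connector are placeholders for illustration:

```python
>>> table = table_env.sql_query("SELECT * FROM MyTable")

>>> # Form 1: target given as a registered table path, overwriting existing data.
>>> table_pipeline = table.insert_into("MySinkTable", overwrite=True)

>>> # Form 2: target given inline as a TableDescriptor, so no prior catalog
>>> # registration is needed (descriptor built via the existing PyFlink API).
>>> from pyflink.table import TableDescriptor, Schema
>>> descriptor = TableDescriptor.for_connector("blackhole") \
...     .schema(Schema.new_builder().build()) \
...     .build()
>>> table_pipeline = table.insert_into(descriptor)
```

Either form returns a `TablePipeline`, which is what distinguishes this API from the eager `execute_insert`.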