HeartSaVioR commented on a change in pull request #30835:
URL: https://github.com/apache/spark/pull/30835#discussion_r545577586



##########
File path: python/pyspark/sql/streaming.py
##########
@@ -1464,6 +1491,76 @@ def start(self, path=None, format=None, outputMode=None, partitionBy=None, query
         else:
             return self._sq(self._jwrite.start(path))
 
+    def toTable(self, tableName, format=None, outputMode=None, partitionBy=None, queryName=None,
+                **options):
+        r"""
+        Streams the contents of the :class:`DataFrame` to the output table.
+
+        A new table will be created if the table does not exist. The returned
+        :class:`StreamingQuery` object can be used to interact with the stream.
+
+        .. versionadded:: 3.2.0

Review comment:
      Ideally it'd be nice to ship this change in 3.1.0 so that the API is available in both Scala and Python at the same time. I've conservatively set this to 3.2.0 to gather opinions on when it should land.
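
For context, a minimal usage sketch of the `toTable` API under review. The `rate` source, table name, and checkpoint path below are illustrative assumptions, not part of the PR; this assumes a Spark session built with the change applied:

```python
# Minimal sketch of the proposed DataStreamWriter.toTable API.
# The "rate" source, table name, and checkpoint path are assumed
# for illustration only.

def start_stream_to_table(spark, table_name="events_copy"):
    """Stream a rate source into a table; returns a StreamingQuery."""
    df = spark.readStream.format("rate").option("rowsPerSecond", 1).load()
    # toTable() creates the table if it does not already exist and
    # returns a StreamingQuery handle for monitoring/stopping the stream.
    return (df.writeStream
              .outputMode("append")
              .option("checkpointLocation", "/tmp/chk/events_copy")
              .toTable(table_name))
```

The returned query behaves like any other `StreamingQuery`: it can be awaited with `awaitTermination()` or terminated with `stop()`.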

##########
File path: python/pyspark/sql/streaming.py
##########
@@ -953,6 +953,33 @@ def csv(self, path, schema=None, sep=None, encoding=None, quote=None, escape=Non
         else:
             raise TypeError("path can be only a single string")
 
+    def table(self, tableName):
+        r"""Define a streaming DataFrame on a table and return the result as
+        a :class:`DataFrame`.
+
+        The DataSource corresponding to the table should support streaming mode.
+
+        Parameters
+        ----------
+        tableName : str
+            the name of the table to read.
+
+        .. versionadded:: 3.2.0

Review comment:
       Same here
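
Likewise, a hypothetical sketch of defining a streaming DataFrame with the `table` API under review. The table name `events` is an assumption, and the table's backing data source would need to support streaming reads:

```python
# Hypothetical sketch of the proposed DataStreamReader.table API.
# "events" is an assumed table name; its backing data source must
# support streaming reads.

def read_table_as_stream(spark, table_name="events"):
    """Define a streaming DataFrame over an existing table."""
    stream_df = spark.readStream.table(table_name)
    # The result is an ordinary streaming DataFrame, so the usual
    # transformations compose on top of it.
    return stream_df.select("*")
```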




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


