HyukjinKwon commented on a change in pull request #32161:
URL: https://github.com/apache/spark/pull/32161#discussion_r629802818



##########
File path: python/pyspark/sql/readwriter.py
##########
@@ -416,53 +416,10 @@ def parquet(self, *paths, **options):
 
         Other Parameters
         ----------------
-        mergeSchema : str or bool, optional
-            sets whether we should merge schemas collected from all
-            Parquet part-files. This will override
-            ``spark.sql.parquet.mergeSchema``. The default value is specified in
-            ``spark.sql.parquet.mergeSchema``.
-        pathGlobFilter : str or bool, optional
-            an optional glob pattern to only include files with paths matching
-            the pattern. The syntax follows `org.apache.hadoop.fs.GlobFilter`.
-            It does not change the behavior of
-            `partition discovery <https://spark.apache.org/docs/latest/sql-data-sources-parquet.html#partition-discovery>`_.  # noqa
-        recursiveFileLookup : str or bool, optional
-            recursively scan a directory for files. Using this option
-            disables
-            `partition discovery <https://spark.apache.org/docs/latest/sql-data-sources-parquet.html#partition-discovery>`_.  # noqa
-
-            modification times occurring before the specified time. The provided timestamp
-            must be in the following format: YYYY-MM-DDTHH:mm:ss (e.g. 2020-06-01T13:00:00)
-        modifiedBefore (batch only) : an optional timestamp to only include files with

Review comment:
       This one too. I think it's not a Parquet-specific option. Can you double-check whether these options are actually Parquet-specific, and whether any options were mistakenly removed? For example, the docs for this one were removed and no longer exist anywhere.



