HyukjinKwon commented on a change in pull request #32546:
URL: https://github.com/apache/spark/pull/32546#discussion_r635796796
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala
##########
@@ -874,23 +874,14 @@ class DataFrameReader private[sql](sparkSession: SparkSession) extends Logging {
   /**
    * Loads ORC files and returns the result as a `DataFrame`.
    *
-   * You can set the following ORC-specific option(s) for reading ORC files:
-   * <ul>
-   * <li>`mergeSchema` (default is the value specified in `spark.sql.orc.mergeSchema`): sets whether
-   * we should merge schemas collected from all ORC part-files. This will override
-   * `spark.sql.orc.mergeSchema`.</li>
-   * <li>`pathGlobFilter`: an optional glob pattern to only include files with paths matching
-   * the pattern. The syntax follows <code>org.apache.hadoop.fs.GlobFilter</code>.
-   * It does not change the behavior of partition discovery.</li>
-   * <li>`modifiedBefore` (batch only): an optional timestamp to only include files with
-   * modification times occurring before the specified Time. The provided timestamp
-   * must be in the following form: YYYY-MM-DDTHH:mm:ss (e.g. 2020-06-01T13:00:00)</li>
-   * <li>`modifiedAfter` (batch only): an optional timestamp to only include files with
-   * modification times occurring after the specified Time. The provided timestamp
-   * must be in the following form: YYYY-MM-DDTHH:mm:ss (e.g. 2020-06-01T13:00:00)</li>
-   * <li>`recursiveFileLookup`: recursively scan a directory for files. Using this option
-   * disables partition discovery</li>
-   * </ul>
+   * ORC-specific option(s) for reading ORC files can be found in
Review comment:
ditto. It says "ORC-specific options" but then also mentions the Generic
File Source Options later.
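
Side note for context: the removed Scaladoc states that `modifiedBefore` and
`modifiedAfter` take timestamps of the form YYYY-MM-DDTHH:mm:ss (e.g.
2020-06-01T13:00:00), which is the ISO-8601 local date-time shape. A minimal
stdlib-only sketch of validating that shape; this is purely illustrative of
the documented format, not Spark's actual option parsing, and the object and
method names here are hypothetical:

```scala
import java.time.LocalDateTime
import java.time.format.{DateTimeFormatter, DateTimeParseException}

// Hypothetical helper: checks whether a string matches the timestamp shape
// documented for the modifiedBefore/modifiedAfter read options.
object ModifiedOptionFormat {
  // YYYY-MM-DDTHH:mm:ss is ISO-8601 local date-time, so java.time's
  // built-in formatter accepts exactly the documented example.
  private val fmt: DateTimeFormatter = DateTimeFormatter.ISO_LOCAL_DATE_TIME

  def isValidTimestamp(s: String): Boolean =
    try { LocalDateTime.parse(s, fmt); true }
    catch { case _: DateTimeParseException => false }
}
```

A string with a space instead of the `T` separator (e.g. `2020-06-01 13:00:00`)
does not match the documented form and fails this check.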
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]