HyukjinKwon commented on a change in pull request #32204:
URL: https://github.com/apache/spark/pull/32204#discussion_r634194606



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala
##########
@@ -443,20 +443,6 @@ class DataFrameReader private[sql](sparkSession: SparkSession) extends Logging {
    *
    * You can set the following JSON-specific options to deal with non-standard JSON files:
    * <ul>
-   * <li>`primitivesAsString` (default `false`): infers all primitive values as a string type</li>
-   * <li>`prefersDecimal` (default `false`): infers all floating-point values as a decimal
-   * type. If the values do not fit in decimal, then it infers them as doubles.</li>
-   * <li>`allowComments` (default `false`): ignores Java/C++ style comment in JSON records</li>
-   * <li>`allowUnquotedFieldNames` (default `false`): allows unquoted JSON field names</li>
-   * <li>`allowSingleQuotes` (default `true`): allows single quotes in addition to double quotes
-   * </li>
-   * <li>`allowNumericLeadingZeros` (default `false`): allows leading zeros in numbers
-   * (e.g. 00012)</li>
-   * <li>`allowBackslashEscapingAnyCharacter` (default `false`): allows accepting quoting of all
-   * character using backslash quoting mechanism</li>
-   * <li>`allowUnquotedControlChars` (default `false`): allows JSON Strings to contain unquoted
-   * control characters (ASCII characters with value less than 32, including tab and line feed
-   * characters) or not.</li>
    * <li>`mode` (default `PERMISSIVE`): allows a mode for dealing with corrupt records
Review comment:
       `mode` in the read path is an option for JSON and CSV. The write mode (`overwrite`, etc.) isn't an option.
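
       To illustrate the distinction (a minimal sketch, not part of this PR; paths and app name below are placeholders): the read-side `mode` and the JSON-specific options removed from the Scaladoc above are passed as string key/value pairs via `option()`, while the save mode is a method on `DataFrameWriter`, not an option key.

       ```scala
       import org.apache.spark.sql.SparkSession

       object ReadVsWriteModeSketch {
         def main(args: Array[String]): Unit = {
           val spark = SparkSession.builder()
             .appName("read-vs-write-mode")   // placeholder app name
             .master("local[*]")
             .getOrCreate()

           // Read path: `mode` (PERMISSIVE | DROPMALFORMED | FAILFAST) and the other
           // JSON-specific options are supplied as string options on DataFrameReader.
           val df = spark.read
             .option("mode", "PERMISSIVE")
             .option("primitivesAsString", "true")  // one of the options listed above
             .json("/tmp/input.json")               // placeholder input path

           // Write path: the save mode (overwrite, append, ...) is set with
           // DataFrameWriter.mode, not with option(); it is unrelated to the read `mode`.
           df.write
             .mode("overwrite")
             .json("/tmp/output.json")              // placeholder output path

           spark.stop()
         }
       }
       ```

       Running this against malformed input would exercise the `PERMISSIVE` read behaviour, whereas `mode("overwrite")` only controls what happens when the output path already exists.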



