Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16928#discussion_r120027395
  
    --- Diff: python/pyspark/sql/readwriter.py ---
    @@ -191,10 +191,13 @@ def json(self, path, schema=None, primitivesAsString=None, prefersDecimal=None,
             :param mode: allows a mode for dealing with corrupt records during parsing. If None is
                          set, it uses the default value, ``PERMISSIVE``.
    
    -                *  ``PERMISSIVE`` : sets other fields to ``null`` when it meets a corrupted \
    -                  record and puts the malformed string into a new field configured by \
    -                 ``columnNameOfCorruptRecord``. When a schema is set by user, it sets \
    -                 ``null`` for extra fields.
    +                * ``PERMISSIVE`` : sets other fields to ``null`` when it meets a corrupted \
    +                 record, and puts the malformed string into a field configured by \
    +                 ``columnNameOfCorruptRecord``. To keep corrupt records, a user can set \
    +                 a string type field named ``columnNameOfCorruptRecord`` in a user-defined \
    +                 schema. If a schema does not have the field, it drops corrupt records during \
    +                 parsing. When inferring a schema, it implicitly adds a \
    +                 ``columnNameOfCorruptRecord`` field in an output schema.
    --- End diff --
    
    Let me take a shot at fixing the bug I found above (the second case). I think it can be fixed easily (though I am pretty sure the behaviour could be arguable). I will open a PR and cc you to show what it looks like.
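
    To make the documented behaviour concrete, here is a toy pure-Python sketch (not Spark code; `parse_records` and its signature are hypothetical) of how ``PERMISSIVE`` keeps a malformed line only when the schema contains the corrupt-record column, and drops it otherwise:

    ```python
    import json

    def parse_records(lines, schema_fields, mode="PERMISSIVE",
                      corrupt_col="_corrupt_record"):
        """Toy model of JSON corrupt-record handling (illustrative only)."""
        rows = []
        for line in lines:
            try:
                data = json.loads(line)
                row = {f: data.get(f) for f in schema_fields}
                if corrupt_col in schema_fields:
                    row[corrupt_col] = None
                rows.append(row)
            except ValueError:
                if mode == "PERMISSIVE":
                    # Other fields become null; the raw line is kept only
                    # if the schema has the corrupt-record column.
                    if corrupt_col in schema_fields:
                        row = {f: None for f in schema_fields}
                        row[corrupt_col] = line
                        rows.append(row)
                    # Without that column, the corrupt record is dropped.
                elif mode == "DROPMALFORMED":
                    pass  # silently skip the malformed line
                elif mode == "FAILFAST":
                    raise

        return rows

    # With the corrupt-record column in the schema, the raw line survives:
    rows = parse_records(['{"a": 1}', '{"a": broken'], ["a", "_corrupt_record"])
    ```

    This is only a mental model of the docstring above, not how Spark implements it internally.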

