GitHub user sun-rui commented on the pull request:

    https://github.com/apache/spark/pull/7494#issuecomment-123162043
  
    Could you try appending a zero byte, as was done previously in writeString():
    
        val utf8 = value.getBytes("UTF-8")
        val len = utf8.length
        // The length prefix counts the trailing zero byte as well.
        out.writeInt(len + 1)
        out.write(utf8, 0, len)
        // Null terminator, so the reader sees a zero-terminated string.
        out.writeByte(0)
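
    For reference, a minimal sketch of the matching read side (the
    readString helper below is hypothetical, not SparkR's actual
    deserializer): it subtracts the terminator from the length prefix
    and consumes the trailing zero byte after the UTF-8 payload.

        import java.io.DataInputStream

        // Hypothetical reader for the framing above: an int length prefix
        // that includes the terminator, the UTF-8 bytes, then a zero byte.
        def readString(in: DataInputStream): String = {
          val len = in.readInt() - 1      // exclude the trailing zero
          val utf8 = new Array[Byte](len)
          in.readFully(utf8)
          in.readByte()                   // consume the terminator
          new String(utf8, "UTF-8")
        }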
    
    For the Unicode strings in the test case, I'm not sure whether we need
    to force their encoding to "UTF-8" before writing them to the JSON file.
    It seems unnecessary.

