Github user kanzhang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1338#discussion_r15153275
  
    --- Diff: docs/programming-guide.md ---
    @@ -403,31 +403,30 @@ PySpark SequenceFile support loads an RDD within Java, and pickles the resulting
     <tr><td>BooleanWritable</td><td>bool</td></tr>
     <tr><td>BytesWritable</td><td>bytearray</td></tr>
     <tr><td>NullWritable</td><td>None</td></tr>
    -<tr><td>ArrayWritable</td><td>list of primitives, or tuple of objects</td></tr>
    --- End diff --
    
    We can't write arrays in Scala either (the implicit conversion from Array 
to ArrayWritable is marked private). Even if we could, it would be awkward because 
we couldn't read them back: ArrayWritable doesn't have a no-arg constructor. 
User-supplied ArrayWritable subtypes can be read, they just won't be implicitly 
converted back to Scala arrays. So this is essentially the same support as we have in Python. 

