GitHub user davies opened a pull request:

    https://github.com/apache/spark/pull/3078

    [SPARK-4186] add binaryFiles and binaryRecords in Python

    add binaryFiles() and binaryRecords() in Python
    ```
    binaryFiles(self, path, minPartitions=None):
        :: Developer API ::
    
        Read a directory of binary files from HDFS, a local file system
        (available on all nodes), or any Hadoop-supported file system URI
        as a byte array. Each file is read as a single record and returned
        in a key-value pair, where the key is the path of each file, the
        value is the content of each file.
    
        Note: Small files are preferred; large files are also allowed, but
        may cause poor performance.
    
    binaryRecords(self, path, recordLength):
        Load data from a flat binary file, assuming each record is a set of numbers
        with the specified numerical format (see ByteBuffer), and the number of
        bytes per record is constant.
    
        :param path: Directory containing the input data files
        :param recordLength: The length at which to split the records
    ```
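
    As a rough illustration of what binaryRecords() does to each input file, here is a plain-Python sketch (no Spark required, and not the PR's implementation) that splits a flat byte buffer into fixed-length records and decodes them with the struct module:
    ```python
    import struct

    def split_records(data, record_length):
        """Split a flat binary buffer into fixed-length records,
        mirroring how binaryRecords() treats each input file."""
        if len(data) % record_length != 0:
            raise ValueError("data length is not a multiple of record_length")
        return [data[i:i + record_length]
                for i in range(0, len(data), record_length)]

    # Example: three little-endian 32-bit ints packed back to back.
    buf = struct.pack("<3i", 10, 20, 30)
    records = split_records(buf, 4)
    values = [struct.unpack("<i", r)[0] for r in records]
    # values == [10, 20, 30]
    ```
    In Spark itself the analogous call would be sc.binaryRecords(path, recordLength), which returns an RDD of the raw byte strings for the caller to decode.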

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/davies/spark binary

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/3078.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3078
    
----
commit bb224423073618c428dfac46c0bb7cc93bb35e6f
Author: Davies Liu <[email protected]>
Date:   2014-11-03T20:38:48Z

    add binaryFiles and binaryRecords in Python

----

