GitHub user kiszk opened a pull request:

    [SPARK-17915][SQL] Prepare a new ColumnVector implementation for UnsafeData

    ## What changes were proposed in this pull request?
    This PR prepares a new implementation, `OnHeapUnsafeColumnVector`, that is 
optimized for reading data from an `Unsafe`-related data structure (e.g. 
`UnsafeArrayData` or `UnsafeMapData`).
    The current implementations of `ColumnVector` are `OnHeapColumnVector` and 
`OffHeapColumnVector`, which are optimized for reading data from Parquet. 
When they get an array, a map, or a struct stored in an `Unsafe`-related data 
structure, that operation leads to an additional copy or data conversion. 
`OnHeapUnsafeColumnVector` only requires a simple memory copy and keeps data 
in a `byte` array.
    `OnHeapUnsafeColumnVector` can compress/decompress stored data by using a 
`CompressionCodec` selected with the `spark.sql.inMemoryColumnarStorage.compression.codec` 
property (default is lz4).
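    As a rough illustration of the idea (not the actual patch), a vector backed 
by a flat `byte` array can accept an `Unsafe`-formatted payload with a single 
`System.arraycopy` instead of a per-element conversion. The class and method 
names below (`ByteBackedVector`, `putBytes`, `getBytes`) are hypothetical 
simplifications, not the PR's real API:

```java
import java.util.Arrays;

// Hypothetical sketch: variable-length values stored as raw bytes in one
// on-heap array, so appending an UnsafeArrayData-style payload is a single
// memory copy rather than an element-wise conversion.
public class ByteBackedVector {
    private byte[] data = new byte[64];   // backing byte storage
    private int[] offsets = new int[8];   // start offset of each value
    private int[] lengths = new int[8];   // length of each value
    private int count = 0;                // number of values stored
    private int used = 0;                 // bytes used in `data`

    // Append one value with a single arraycopy.
    public void putBytes(byte[] src, int srcOffset, int length) {
        ensureCapacity(length);
        System.arraycopy(src, srcOffset, data, used, length);
        offsets[count] = used;
        lengths[count] = length;
        used += length;
        count++;
    }

    // Read a value back, again as one copy of the raw bytes.
    public byte[] getBytes(int rowId) {
        return Arrays.copyOfRange(data, offsets[rowId],
                                  offsets[rowId] + lengths[rowId]);
    }

    private void ensureCapacity(int extra) {
        if (used + extra > data.length) {
            data = Arrays.copyOf(data, Math.max(data.length * 2, used + extra));
        }
        if (count == offsets.length) {
            offsets = Arrays.copyOf(offsets, count * 2);
            lengths = Arrays.copyOf(lengths, count * 2);
        }
    }

    public static void main(String[] args) {
        ByteBackedVector v = new ByteBackedVector();
        byte[] payload = {1, 2, 3, 4};
        v.putBytes(payload, 0, payload.length); // whole payload as one copy
        v.putBytes(payload, 1, 2);              // a 2-byte slice of it
        System.out.println(Arrays.toString(v.getBytes(0))); // [1, 2, 3, 4]
        System.out.println(Arrays.toString(v.getBytes(1))); // [2, 3]
    }
}
```

    The real `ColumnVector` API additionally tracks nulls, types, and child 
vectors; the sketch only shows why a byte-array backing avoids per-element 
conversion for `Unsafe`-formatted data.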
    This PR is a part of This is a 
component independent of the others. For ease of review, this PR only introduces 
    ## How was this patch tested?
    Performed existing tests for `OnHeapUnsafeColumnVector`.

You can merge this pull request into a Git repository by running:

    $ git pull columnarcolumnvector

Alternatively you can review and apply these changes as the patch at:

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #15468
commit 7e703c6fe7c20138146e172b01d6511d5873338c
Author: Kazuaki Ishizaki <>
Date:   2016-10-13T18:00:52Z

    add OnHeapUnsafeColumnVector

commit 8b385310401be631a3ab740d7053aac31fa87952
Author: Kazuaki Ishizaki <>
Date:   2016-10-13T18:11:27Z

    add a file

