You want to use ParquetWriter and ParquetReader with the appropriate
ReadSupport/WriteSupport:
https://github.com/apache/incubator-parquet-mr/blob/998d6507ecabf025188d9f3e8c8367f810895a17/parquet-hadoop/src/main/java/parquet/hadoop/ParquetWriter.java
It still depends on the Hadoop library, but it doesn't need a Hadoop cluster
around. It will just use the Hadoop LocalFileSystem implementation.
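To sketch what that looks like: the example below writes and reads a local file
using the GroupWriteSupport/GroupReadSupport classes that ship with
parquet-hadoop. The path /tmp/example.parquet and the schema are made up for
illustration; the constructor signatures are from the 1.5/1.6-era parquet-mr
API referenced above and may differ in other versions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import parquet.example.data.Group;
import parquet.example.data.simple.SimpleGroupFactory;
import parquet.hadoop.ParquetReader;
import parquet.hadoop.ParquetWriter;
import parquet.hadoop.example.GroupReadSupport;
import parquet.hadoop.example.GroupWriteSupport;
import parquet.schema.MessageType;
import parquet.schema.MessageTypeParser;

public class LocalParquetExample {
    public static void main(String[] args) throws Exception {
        // Plain local path: Hadoop's LocalFileSystem picks this up,
        // no cluster involved. Path and schema are hypothetical.
        Path path = new Path("/tmp/example.parquet");

        MessageType schema = MessageTypeParser.parseMessageType(
            "message example { required binary name (UTF8); required int32 age; }");

        // GroupWriteSupport reads its schema from the Configuration.
        Configuration conf = new Configuration();
        GroupWriteSupport.setSchema(schema, conf);

        ParquetWriter<Group> writer = new ParquetWriter<Group>(
            path,
            new GroupWriteSupport(),
            ParquetWriter.DEFAULT_COMPRESSION_CODEC_NAME,
            ParquetWriter.DEFAULT_BLOCK_SIZE,
            ParquetWriter.DEFAULT_PAGE_SIZE,
            ParquetWriter.DEFAULT_PAGE_SIZE,      // dictionary page size
            ParquetWriter.DEFAULT_IS_DICTIONARY_ENABLED,
            ParquetWriter.DEFAULT_IS_VALIDATING_ENABLED,
            ParquetWriter.DEFAULT_WRITER_VERSION,
            conf);                                // carries the schema

        SimpleGroupFactory factory = new SimpleGroupFactory(schema);
        writer.write(factory.newGroup().append("name", "alice").append("age", 30));
        writer.close();

        // Read it back one record at a time.
        ParquetReader<Group> reader =
            new ParquetReader<Group>(path, new GroupReadSupport());
        Group record;
        while ((record = reader.read()) != null) {
            System.out.println(
                record.getString("name", 0) + " " + record.getInteger("age", 0));
        }
        reader.close();
    }
}
```

Note there is no InputStream-based entry point in this API: ParquetReader
needs a seekable file (the format's footer is read first), so you work with
paths rather than streams.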


On Fri, Mar 13, 2015 at 8:48 PM, Matt Bossenbroek <
[email protected]> wrote:

> How would I go about reading a parquet file locally in java code? Say I
> have a java.io.InputStream - what's the simplest way to get the structured
> data out of it, one record at a time?
>
> Same question for writing data to a file locally.
>
> I'm looking for something that preferably doesn't involve hadoop - just a
> simple reader/writer for the parquet format.
>
> Feel free to point me at any existing examples or discussions, but
> everything I've seen has been tightly coupled to the hadoop/pig/etc.
>
> This is the closest I could find for reading, but still uses hadoop stuff:
> https://github.com/apache/incubator-parquet-mr/blob/a0c77b6a442e2c4a355a4b145898bed976f23bb4/parquet-tools/src/main/java/parquet/tools/command/CatCommand.java#L56
>
> Thanks,
> Matt
>
>
>
>
