Sachin,

The data model you're using (example/Group) is for internal testing and as
an example for object model implementers, not for building applications. To
build an application, you should use parquet-avro, which allows you to work
with Avro objects and schemas. That will solve your problem.
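
Roughly, here is a sketch of that approach (untested; it assumes the
parquet-avro and avro dependencies are on your classpath, and the schema,
field names, and output path are placeholders you'd adapt to your data):

import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;

public class WriteMapsToParquet {
  public static void main(String[] args) throws IOException {
    // The in-memory data from your question (placeholder keys/values).
    List<HashMap<String, String>> rows = new ArrayList<>();
    HashMap<String, String> row = new HashMap<>();
    row.put("name", "example");
    row.put("city", "example");
    rows.add(row);

    // Build an Avro schema with one optional string field per map key.
    Schema schema = SchemaBuilder.record("Row").fields()
        .optionalString("name")
        .optionalString("city")
        .endRecord();

    // AvroParquetWriter derives the Parquet schema and write support from
    // the Avro schema, so you never construct GroupWriteSupport yourself.
    try (ParquetWriter<GenericRecord> writer =
        AvroParquetWriter.<GenericRecord>builder(new Path("rows.parquet"))
            .withSchema(schema)
            .withCompressionCodec(CompressionCodecName.SNAPPY)
            .build()) {
      for (Map<String, String> r : rows) {
        GenericRecord record = new GenericData.Record(schema);
        for (Map.Entry<String, String> e : r.entrySet()) {
          record.put(e.getKey(), e.getValue());
        }
        writer.write(record);
      }
    }
  }
}

If your maps do not share a fixed set of keys, an alternative is to
declare a single Avro map-typed field instead of one field per key.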

rb

On Tue, Mar 22, 2016 at 4:51 AM, Sachin Singh <[email protected]>
wrote:

> Hi,
>
> I have some data in an ArrayList<HashMap<String,String>>. Can we write
> this data to a Parquet file or not?
>
> If yes, then how? Please recommend a Java program. Thanks in advance.
>
> Please note: I am not using Spark, Kafka, etc.
>
> I have gone through the example below, and from the source code of
> example 7 I want to implement something similar
> (programcreek.com/java-api-examples/…
> <
> http://www.programcreek.com/java-api-examples/index.php?source_dir=pbase-master/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/pbase/util/GenerateParquetFile.java
> >
> )
>
> But after adding all the dependencies, the code does not work. I am
> getting an issue near:
>
> ParquetWriter<Group> writer = new ParquetWriter<Group>(
>     initFile(fileName), new GroupWriteSupport(metas),
>     CompressionCodecName.SNAPPY, 1024, 1024, 512, true, false,
>     ParquetProperties.WriterVersion.PARQUET_1_0, conf);
>
> The error is: "The constructor GroupWriteSupport(Map<String,String>) is
> undefined". Can you please help with this?
>
> Regards
>
> Sachin
>



-- 
Ryan Blue
Software Engineer
Netflix
