Hi,

Avro represents an array as a java.util.Collection (typically a java.util.List), not a primitive array. So you must do something like

List<Double> nums = new ArrayList<Double>();
nums.add(9.97);
...
record.put("prc", nums);
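If you want to keep the existing double[] around, you can box it into the List the writer expects. A minimal sketch (note that Arrays.asList on a double[] would give you a one-element List<double[]>, not a List<Double>, so an explicit loop is the simplest route; the class and method names here are just for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class BoxDoubles {
    // Box a primitive double[] into the List<Double> that Avro's
    // GenericRecord expects for an array-typed field.
    static List<Double> box(double[] values) {
        List<Double> boxed = new ArrayList<Double>(values.length);
        for (double v : values) {
            boxed.add(v); // autoboxing double -> Double
        }
        return boxed;
    }

    public static void main(String[] args) {
        double[] nums = { 9.97, 5.56, 21.48 };
        List<Double> prc = box(nums);
        System.out.println(prc);
        // then: record.put("prc", prc);
    }
}
```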

On Sep 24, 2013, at 9:02 PM, Raihan Jamal <[email protected]> wrote:

> Earlier, I was using JSON in our project, so one of our attributes' data 
> looked like the following. Below is the attribute `e3` data in JSON format.
>       
>       {"lv":[{"v":{"prc":9.97}},{"v":{"prc":5.56}},{"v":{"prc":21.48}}]}
>       
> Now, I am planning to use Apache Avro as our data serialization format, so I 
> decided to design an Avro schema for the above attribute data, and I came up 
> with the design below.
>   
>       {
>      "namespace": "com.avro.test.AvroExperiment",
>      "type": "record",
>      "name": "AVG_PRICE",
>      "doc": "AVG_PRICE data",
>      "fields": [
>          {"name": "prc", "type": {"type": "array", "items": "double"}}
>      ]
>     }
> 
> Now, I am not sure whether the above schema is correct for the values I have 
> in JSON. Can anyone help me with that? Assuming the above schema is correct, 
> when I try to serialize the data using it, I always get the error below-
>   
>       double[] nums = new double[] { 9.97, 5.56, 21.48 };
>       
>       Schema schema = new Schema.Parser().parse(AvroExperiment.class.getResourceAsStream("/aspmc.avsc"));
>       GenericRecord record = new GenericData.Record(schema);
>       record.put("prc", nums);
>       
>       GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<GenericRecord>(schema);
>       ByteArrayOutputStream os = new ByteArrayOutputStream(); 
> 
>       Encoder e = EncoderFactory.get().binaryEncoder(os, null);
>       
>       // this line gives me exception..
>       writer.write(record, e); 
>       
> Below is the exception, I always get-
> 
>     Exception in thread "main" java.lang.ClassCastException: [D incompatible with java.util.Collection
>       
> Any idea what I am doing wrong here?
