Because the schema type is BYTES, the datum type on both the writer and reader side is ByteBuffer, not GenericData.Record. It can be done as follows in this JUnit test:
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.junit.Test;

@Test
public void test() throws IOException {
    Schema byteBlobSchema = Schema.create(Schema.Type.BYTES);
    File path = new File("src/test/resources/test.avro");

    // Write: the datum type for a BYTES schema is ByteBuffer
    GenericDatumWriter<ByteBuffer> wdata = new GenericDatumWriter<ByteBuffer>();
    DataFileWriter<ByteBuffer> dataFileWriter = new DataFileWriter<ByteBuffer>(wdata);
    dataFileWriter.create(byteBlobSchema, new FileOutputStream(path));
    dataFileWriter.append(ByteBuffer.wrap("Hello".getBytes()));
    dataFileWriter.close();

    // Read back with a ByteBuffer datum reader, not a GenericData.Record
    GenericDatumReader<ByteBuffer> rdata = new GenericDatumReader<ByteBuffer>(byteBlobSchema);
    DataFileReader<ByteBuffer> dataFileReader = new DataFileReader<ByteBuffer>(path, rdata);
    ByteBuffer b = null;
    while (dataFileReader.hasNext()) {
        b = dataFileReader.next(b);
        byte[] result = new byte[b.remaining()];
        b.get(result);
        System.out.println(new String(result));
    }
    dataFileReader.close();
}
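The ByteBuffer handling at the end of the read loop (sizing the output array from remaining() and copying with get()) can be exercised on its own with plain JDK classes, independent of Avro. This is just an illustrative sketch of that pattern, with made-up names:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ByteBufferRoundTrip {
    public static void main(String[] args) {
        // Wrap a string's bytes, as the writer side does with ByteBuffer.wrap(...)
        ByteBuffer b = ByteBuffer.wrap("Hello".getBytes(StandardCharsets.UTF_8));
        // remaining() gives the number of readable bytes between position and limit
        byte[] result = new byte[b.remaining()];
        // get(byte[]) copies the readable bytes and advances the buffer's position
        b.get(result);
        System.out.println(new String(result, StandardCharsets.UTF_8)); // prints Hello
    }
}
```

Note that get() advances the buffer's position, so a second read of the same buffer would need rewind() first.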
Regards
Rob Turner.
On 11 February 2014 18:15, Milind Vaidya <[email protected]> wrote:
> I am trying to serialize and read byte blob with Byte Schema as follows
>
> static final org.apache.avro.Schema byteBlobSchema =
> org.apache.avro.Schema.create(org.apache.avro.Schema.Type.BYTES);
>
>
> The writing part works fine and I can serialize data. The problem is with
> reading.
>
> GenericDatumReader rdata = new GenericDatumReader(byteBlobSchema);
>
> DataFileReader dataFileReader = new DataFileReader(path, rdata);
>
> GenericData.Record record = new GenericData.Record(byteBlobSchema);
>
> while(dataFileReader.hasNext()) {
>
> dataFileReader.next(record);
>
> }
>
> This gives error as: Not a record schema: "bytes"
>
> There is something like Schema.Type.BYTES
> (https://avro.apache.org/docs/1.4.1/api/java/org/apache/avro/Schema.Type.html)
>
> and I checked if that is the same thing I am getting if I call
> byteBlobSchema.getType()
>
> Is there something I am missing? Please let me know if I need to give
> more details about writing the records.
>
>
> - Milind
--
Cheers
Rob.