> I am writing a record which has some array elements in it. In one of
> the records the number of entries in the array is around 600; in that
> case, when I append a value, I get the error "Value too large for file
> block size".
>
> And the worst part is that, once I hit this error, I get the same error
> for every record after that.
>
> I am using avro-c; the error comes while calling the routine
> "avro_file_writer_append_value".
Hi Amit,

This is a known issue; the relevant entry in the bug tracker is AVRO-724 [1]. If you use avro_file_writer_create_with_codec to open your output file, instead of avro_file_writer_create, then you can specify a larger maximum block size (the default is 16K). As long as you know some reasonable cap on the serialized size of your array values, this should let you avoid the error; see the sketch in the P.S. below.

[1] https://issues.apache.org/jira/browse/AVRO-724

cheers
–doug
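P.S. Here is a rough, untested sketch of what I mean. The record schema, the
"records.avro" file name, the "null" codec, and the 1 MiB block size are all
just placeholders; pick a block size that comfortably exceeds the largest
serialized record you expect.

#include <avro.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Illustrative record schema with one array field; substitute your own. */
    const char *json =
        "{\"type\": \"record\", \"name\": \"rec\", \"fields\": ["
        "  {\"name\": \"values\","
        "   \"type\": {\"type\": \"array\", \"items\": \"long\"}}]}";

    avro_schema_t schema;
    if (avro_schema_from_json_length(json, strlen(json), &schema)) {
        fprintf(stderr, "schema error: %s\n", avro_strerror());
        return 1;
    }

    /* Open the writer with an explicit codec ("null" = no compression)
     * and a 1 MiB maximum block size instead of the 16K default, so that
     * a record with a large array still fits in a single block. */
    avro_file_writer_t writer;
    if (avro_file_writer_create_with_codec("records.avro", schema,
                                           &writer, "null",
                                           1024 * 1024)) {
        fprintf(stderr, "writer error: %s\n", avro_strerror());
        return 1;
    }

    /* ... build each avro_value_t and append it exactly as before,
     * with avro_file_writer_append_value(writer, &value) ... */

    avro_file_writer_close(writer);
    avro_schema_decref(schema);
    return 0;
}

If I remember right, passing 0 for the block size falls back to the default,
and "deflate" also works as a codec name if you want compression. Link against
libavro as usual when you build.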
