Let me explain further. Our data is not static. We do not know the type
of the Java object at runtime, as we only have the schema. We use the Avro
reflect package to transparently serialise and deserialise an Object
instance given its schema: ours is a black box that can serialise and
deserialise any Object given a schema. The Object to serialise is supplied
by the caller and is not under our control -- the only constraint is that
both sides have the schema. The Specific readers/writers need code
generation, and the generic readers and writers expect the objects to be
"indexed records" and so barf. For any old POJO (with schema), the
black-box approach can only be satisfied by Avro's reflect package, unless
I'm mistaken?
Peter
On 05/07/2012 18:09, Mark Hayes wrote:
On Thu, Jul 5, 2012 at 9:44 AM, Peter Cameron
<peter.came...@2icworld.com> wrote:
"This API is not recommended except as a stepping stone for
systems that currently uses Java interfaces to define RPC
protocols. For new RPC systems, the |specific|
<http://avro.apache.org/docs/1.7.0/api/java/org/apache/avro/specific/package-summary.html>
API is preferred. For systems that process dynamic data, the
|generic|
<http://avro.apache.org/docs/1.7.0/api/java/org/apache/avro/generic/package-summary.html>
API is probably best."
What I'm confused by is the assertion that the generic API is
"probably best" for processing dynamic data.
I am still fairly new to Avro but I think what the warning in the docs
is trying to say is that the Specific API is better for static data,
because reflection is slower. If you're representing data using a
Java bean, then your data is static (known at build time).
--mark
--
Peter Cameron
2iC Limited
T: +44 (0) 208 123 7479
E: peter.came...@2icworld.com
W: www.2iCworld.com