In addition to what you have already noticed, the javadoc for these changes has a lot of other related information; see the links below (a sketch of the replacement call follows the quoted message):
http://avro.apache.org/docs/current/api/java/org/apache/avro/io/EncoderFactory.html
http://avro.apache.org/docs/current/api/java/org/apache/avro/io/DecoderFactory.html#directBinaryDecoder%28java.io.InputStream,%20org.apache.avro.io.BinaryDecoder%29

On 2/24/12 3:38 PM, "Lewis John Mcgibbney" <[email protected]> wrote:

>Hi List,
>
>I've embarked upon upgrading the Avro (and subsequently Hadoop) libraries
>in Apache Gora to 1.6.2 and 1.0.0 respectively and have run into a small
>(one of many) question.
>
>Use of DecoderFactory().configureDirectDecoder(boolean) is now deprecated
>[0], so I am looking for advice on how to configure the direct decoder
>accordingly, to achieve the same results as explained below.
>
>I was thinking of implementing
>
>decoder = new DecoderFactory().binaryDecoder(in, decoder), however is
>this suitable/satisfactory?
>
>If so can someone please explain why this is the case and how
>functionality has been retained even after removing the configuration for
>the DirectDecoder?
>
>Thank you in advance for any suggestions.
>
>Lewis
>
>  @Override
>  public void open(InputStream in) throws IOException {
>    /* It is very important to use a direct buffer, since Hadoop
>     * supplies an input stream that is only valid until the end of one
>     * record serialization. Each time deserialize() is called, the IS
>     * is advanced to point to the right location, so we should not
>     * buffer the whole input stream at once.
>     */
>    decoder = new DecoderFactory().configureDirectDecoder(true)
>        .createBinaryDecoder(in, decoder);
>  }
>
>[0]
>http://avro.apache.org/docs/1.6.2/api/java/org/apache/avro/io/DecoderFactory.html
>--
>
>Lewis
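
For completeness, here is a minimal sketch of what the open() method quoted above might look like against the 1.6.x DecoderFactory, assuming DecoderFactory.get().directBinaryDecoder(...) (the method pointed to by the second javadoc link) is the intended replacement; treat it as an illustration, not a drop-in patch for Gora:

    import java.io.IOException;
    import java.io.InputStream;

    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.DecoderFactory;

    // Sketch only: class and field names are placeholders for the Gora code
    // that owns the quoted open() method.
    public class DirectDecoderSketch {

      private BinaryDecoder decoder;

      public void open(InputStream in) throws IOException {
        /* directBinaryDecoder() gives the unbuffered behaviour that
         * configureDirectDecoder(true).createBinaryDecoder(in, decoder)
         * used to provide: it reads from the stream only as needed and
         * does not read ahead, which matters because Hadoop reuses the
         * stream between record serializations.
         */
        decoder = DecoderFactory.get().directBinaryDecoder(in, decoder);
      }
    }

By contrast, binaryDecoder(in, decoder) returns a read-ahead (buffering) decoder, which is exactly what the comment in the quoted open() warns against, so it would not be a like-for-like substitute here.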
