[ https://issues.apache.org/jira/browse/MRUNIT-193?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13991199#comment-13991199 ]
Hudson commented on MRUNIT-193:
-------------------------------

FAILURE: Integrated in mrunit-trunk #1119 (See [https://builds.apache.org/job/mrunit-trunk/1119/])
MRUNIT-193 - Serialization.copy throws NPE instead of ISE (missing serialization impl) for Hadoop 2.x (Cosmin Lehen via Brock Noland) (brock: rev 57196c39a73cf2251e0f2ba7da225731362407ba)
* src/main/java/org/apache/hadoop/mrunit/internal/io/Serialization.java
* src/test/java/org/apache/hadoop/mrunit/internal/io/TestSerialization.java

> Serialization.copy throws NPE instead of ISE (missing serialization impl) for Hadoop 2.x
> ----------------------------------------------------------------------------------------
>
>                 Key: MRUNIT-193
>                 URL: https://issues.apache.org/jira/browse/MRUNIT-193
>             Project: MRUnit
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Cosmin Lehene
>            Assignee: Cosmin Lehene
>            Priority: Trivial
>             Fix For: 1.1.0
>
>         Attachments: 0001-MRUNIT-193-Serialization.copy-throws-NPE-instead-of-.patch
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> This may be the result of a refactoring.
> The current code attempts to catch an NPE with the intent to detect a missing serialization implementation.
> However, this behavior differs between Hadoop 1.x and 2.x: Hadoop 1.x will throw the NPE in the first try/catch block, while 2.x will throw it in the second.
> {code}
> try {
>   serializer = (Serializer<Object>) serializationFactory
>       .getSerializer(clazz);
>   deserializer = (Deserializer<Object>) serializationFactory
>       .getDeserializer(clazz);
> } catch (NullPointerException e) {
>   throw new IllegalStateException(
>       "No applicable class implementing Serialization in conf at io.serializations for "
>           + orig.getClass(), e);
> }
> try {
>   final DataOutputBuffer outputBuffer = new DataOutputBuffer();
>   serializer.open(outputBuffer);
>   serializer.serialize(orig);
>   final DataInputBuffer inputBuffer = new DataInputBuffer();
>   inputBuffer.reset(outputBuffer.getData(), outputBuffer.getLength());
>   deserializer.open(inputBuffer);
>   return (T) deserializer.deserialize(copy);
> } catch (final IOException e) {
>   throw new RuntimeException(e);
> }
> {code}
> Hadoop 1.x:
> {code}
> public <T> Serializer<T> getSerializer(Class<T> c) {
>   return getSerialization(c).getSerializer(c);
> }
> {code}
> Hadoop 2.x:
> {code}
> public <T> Serializer<T> getSerializer(Class<T> c) {
>   Serialization<T> serializer = getSerialization(c);
>   if (serializer != null) {
>     return serializer.getSerializer(c);
>   }
>   return null;
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.2#6252)
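The factory snippets above show why the catch-NPE approach breaks on Hadoop 2.x: the 2.x `getSerializer` returns null instead of dereferencing a null `Serialization`, so the exception surfaces later as an NPE rather than the intended `IllegalStateException`. A minimal sketch of the null-check approach (stand-in `Serializer`/factory types for illustration only, not the real Hadoop API):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: detect a missing serialization by checking the factory's return
// value for null explicitly, instead of catching the NullPointerException
// that only Hadoop 1.x happened to throw. All types here are simplified
// stand-ins, not the real Hadoop interfaces.
public class SerializationCopySketch {

  interface Serializer<T> { byte[] serialize(T obj); }

  // Stand-in for SerializationFactory: returns null when no serialization
  // is configured for the class, mirroring Hadoop 2.x behavior.
  static class Factory {
    private final Map<Class<?>, Serializer<?>> registry = new HashMap<>();

    <T> void register(Class<T> c, Serializer<T> s) { registry.put(c, s); }

    @SuppressWarnings("unchecked")
    <T> Serializer<T> getSerializer(Class<T> c) {
      return (Serializer<T>) registry.get(c); // may be null
    }
  }

  static <T> byte[] copyBytes(Factory factory, T orig) {
    @SuppressWarnings("unchecked")
    Class<T> clazz = (Class<T>) orig.getClass();
    Serializer<T> serializer = factory.getSerializer(clazz);
    // Explicit null check: raises the intended ISE on both code paths,
    // independent of whether the factory throws or returns null.
    if (serializer == null) {
      throw new IllegalStateException(
          "No applicable class implementing Serialization in conf at io.serializations for "
              + clazz);
    }
    return serializer.serialize(orig);
  }

  public static void main(String[] args) {
    Factory factory = new Factory();
    factory.register(String.class, s -> s.getBytes());
    System.out.println(copyBytes(factory, "hello").length);
    try {
      copyBytes(factory, 42); // Integer has no registered serialization
    } catch (IllegalStateException e) {
      System.out.println("ISE: " + e.getMessage());
    }
  }
}
```

The same check applied symmetrically to the deserializer side covers `Serialization.copy` end to end, which is the shape of the attached patch's fix.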