[
https://issues.apache.org/jira/browse/HDFS-10265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15227787#comment-15227787
]
Wan Chang commented on HDFS-10265:
----------------------------------
FSEditLogOp$UpdateBlocksOp.fromXml:
{code}
void fromXml(Stanza st) throws InvalidXmlException {
  this.path = st.getValue("PATH");
  List<Stanza> blocks = st.getChildren("BLOCK");
  this.blocks = new Block[blocks.size()];
  for (int i = 0; i < blocks.size(); i++) {
    this.blocks[i] = FSEditLogOp.blockFromXml(blocks.get(i));
  }
  readRpcIdsFromXml(st);
}
{code}
XMLUtils$Stanza.getChildren:
{code}
public List<Stanza> getChildren(String name) throws InvalidXmlException {
  LinkedList<Stanza> children = subtrees.get(name);
  if (children == null) {
    throw new InvalidXmlException("no entry found for " + name);
  }
  return children;
}
{code}
When the OEV tool encounters an OP_UPDATE_BLOCKS operation with no BLOCK tag, it
throws an InvalidXmlException and exits.
Both 2.4.1 and 2.7.1 have the same problem.
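A minimal sketch of one way to handle this in UpdateBlocksOp.fromXml: treat a missing
BLOCK tag as an empty block list instead of failing. This is only a sketch; it assumes
Stanza's hasChildren(String) helper (already used by other fromXml implementations) and
a java.util.ArrayList import, and it assumes zero blocks is the correct interpretation
of such a record.
{code}
@Override
void fromXml(Stanza st) throws InvalidXmlException {
  this.path = st.getValue("PATH");
  // A record written after the last block was abandoned has no BLOCK tag,
  // so fall back to an empty list rather than letting getChildren() throw.
  List<Stanza> blocks = st.hasChildren("BLOCK")
      ? st.getChildren("BLOCK") : new ArrayList<Stanza>();
  this.blocks = new Block[blocks.size()];
  for (int i = 0; i < blocks.size(); i++) {
    this.blocks[i] = FSEditLogOp.blockFromXml(blocks.get(i));
  }
  readRpcIdsFromXml(st);
}
{code}
Guarding in the op's fromXml keeps getChildren strict for callers that genuinely require
the tag; making getChildren itself return an empty list would silently relax validation
for every other op.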
> OEV tool fails to read edit xml file if OP_UPDATE_BLOCKS has no BLOCK tag
> -------------------------------------------------------------------------
>
> Key: HDFS-10265
> URL: https://issues.apache.org/jira/browse/HDFS-10265
> Project: Hadoop HDFS
> Issue Type: Bug
> Components: tools
> Affects Versions: 2.4.1, 2.7.1
> Reporter: Wan Chang
> Assignee: Wan Chang
> Priority: Minor
> Labels: patch
>
> I use the OEV tool to convert an edit log to an XML file, then convert the XML file back
> to a binary edit log file (so that a lower-version NameNode can load edits
> generated by a higher-version NameNode). But when OP_UPDATE_BLOCKS has no BLOCK
> tag, the OEV tool doesn't handle the case and exits with an InvalidXmlException.
> Here is the stack:
> {code}
> fromXml error decoding opcode null
> {<PATH>{"/tmp/100M3/slive/data/subDir_13/subDir_7/subDir_15/subDir_11/subFile_5"},
> <RPC_CALLID>{"-2"}, <RPC_CLIENTID>{},
> <TXID>{"3875711"}}
> Encountered exception. Exiting: no entry found for BLOCK
> org.apache.hadoop.hdfs.util.XMLUtils$InvalidXmlException: no entry found for
> BLOCK
> at
> org.apache.hadoop.hdfs.util.XMLUtils$Stanza.getChildren(XMLUtils.java:242)
> at
> org.apache.hadoop.hdfs.server.namenode.FSEditLogOp$UpdateBlocksOp.fromXml(FSEditLogOp.java:908)
> at
> org.apache.hadoop.hdfs.server.namenode.FSEditLogOp.decodeXml(FSEditLogOp.java:3942)
> ...
> {code}
> Here is part of the xml file:
> {code}
> <RECORD>
> <OPCODE>OP_UPDATE_BLOCKS</OPCODE>
> <DATA>
> <TXID>3875711</TXID>
>
> <PATH>/tmp/100M3/slive/data/subDir_13/subDir_7/subDir_15/subDir_11/subFile_5</PATH>
> <RPC_CLIENTID></RPC_CLIENTID>
> <RPC_CALLID>-2</RPC_CALLID>
> </DATA>
> </RECORD>
> {code}
> I tracked the NN's log and found these operations:
> 1. The client asked the NN to add a block to the file.
> 2. The client failed to write to the DN and asked the NameNode to abandon the block.
> 3. The NN removed the block and wrote an OP_UPDATE_BLOCKS record to the edit log.
> The file
> /tmp/100M3/slive/data/subDir_13/subDir_7/subDir_15/subDir_11/subFile_5
> contains only one block, so the NN generated an OP_UPDATE_BLOCKS record with no BLOCK
> tags.
> In FSEditLogOp$UpdateBlocksOp.fromXml, we need to handle the case above.