[ https://issues.apache.org/jira/browse/HDFS-4909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Colin Patrick McCabe updated HDFS-4909:
---------------------------------------

    Attachment: HDFS-4909.001.patch

This patch changes the protobuf package that DatanodeProtocol, 
NamenodeProtocol, and QJournalProtocol reside in so that it matches each 
protocol's name.  This avoids the name conflicts that were previously caused 
by putting everything into "package hadoop.hdfs".

This doesn't alter what gets sent over the wire at all (the protobuf wire 
encoding carries only field numbers and values, never package or message 
names), so there are no compatibility implications.  I have confirmed this 
myself by running servers with this patch against clients that do not have it.

No changes to the Java code are needed, since we explicitly specify which Java 
package each file's generated classes go into (the Java package is separate 
from the protobuf package).  Essentially, the protobuf package is irrelevant 
to the Java code because it gets overridden by {{option java_package}}.
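
For example, a declaration along these lines (a sketch; the option values are 
assumed from the existing Hadoop layout and are not changed by the patch) 
keeps the generated Java classes where they already are, regardless of the 
protobuf package:

{code}
// The protobuf package scopes the .proto type names (and the symbol prefix
// that protobuf-c derives for the generated C code).
package hadoop.hdfs.datanode;

// The Java output is pinned explicitly, so renaming the protobuf package
// above has no effect on the generated Java classes.
option java_package = "org.apache.hadoop.hdfs.protocol.proto";
option java_outer_classname = "DatanodeProtocolProtos";
{code}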

It does change some of the generated code a bit.  For example, 
{{internal_static_hadoop_hdfs_CommitBlockSynchronizationResponseProto_descriptor}} 
becomes 
{{internal_static_hadoop_hdfs_datanode_CommitBlockSynchronizationResponseProto_descriptor}}. 
But this all seems to be internal to the PB-generated code and doesn't get 
exposed to the outside world.
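
Concretely, the duplicate symbols quoted in the report below come from two 
different .proto files declaring a message with the same name inside the same 
protobuf package; a minimal sketch (message bodies omitted, and the shared 
message name inferred from the duplicated init symbol in the linker output):

{code}
// ClientNamenodeProtocol.proto (before the patch)
package hadoop.hdfs;
message ReportBadBlocksRequestProto {
  // fields omitted
}

// DatanodeProtocol.proto (before the patch)
package hadoop.hdfs;
message ReportBadBlocksRequestProto {
  // fields omitted
}

// protobuf-c derives its C symbol names from the package plus the message
// name, so both generated .pb-c.c files define the same init function and
// the final link fails with "multiple definition".  Giving each protocol
// file its own package makes the generated prefixes distinct.
{code}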

> Protocol buffer support compiles under C, but fails to link due to duplicate 
> symbols
> ------------------------------------------------------------------------------------
>
>                 Key: HDFS-4909
>                 URL: https://issues.apache.org/jira/browse/HDFS-4909
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: datanode, journal-node, namenode
>    Affects Versions: 3.0.0, 2.1.0-beta
>            Reporter: Ralph Castain
>            Assignee: Colin Patrick McCabe
>            Priority: Blocker
>         Attachments: HDFS-4909.001.patch, pcreate.pl
>
>
> The revised protocol buffer support seems to be compiling for me when using 
> the protobuf-c cross-compiler. However, I still cannot construct a library of 
> the results. This may be a Hadoop issue, or could be an issue with the 
> protobuf-c cross-compiler. What I see are a bunch of these when attempting to 
> link the resulting .o files:
> /home/common/hadoop/hadoop-common/foo/obj/DatanodeProtocol.pb-c.o: In 
> function `hadoop_hdfsreport_bad_blocks_request_proto_init':
> DatanodeProtocol.pb-c.c:(.text+0x2bb4): multiple definition of 
> `hadoop_hdfsreport_bad_blocks_request_proto_init'
> /home/common/hadoop/hadoop-common/foo/obj/ClientNamenodeProtocol.pb-c.o:ClientNamenodeProtocol.pb-c.c:(.text+0x277d):
>  first defined here
> From what I can see, this is caused by the
> package hadoop.hdfs;
> line in the .proto files, when combined with the later
> import "hdfs.proto";
> This appears to bring a complete copy of the hdfs.proto file into the source 
> code, which then recompiles it - leading to the duplicate symbols.
> I have attached an updated pcreate.pl script that illustrates the problem. 
> Excluding the following .proto files allows all to be successfully built and 
> linked:
> DatanodeProtocol
> ClientNamenodeProtocol
> QJournalProtocol
> HTH
> Ralph



--
This message was sent by Atlassian JIRA
(v6.2#6252)
