[ https://issues.apache.org/jira/browse/HDFS-4046?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13476468#comment-13476468 ]

Binglin Chang commented on HDFS-4046:
-------------------------------------

bq. We should get more input on the fix and ways to prevent further mistakes like this.
BTW, it seems that the proto files are written in a Java flavor. Many enum names are 
not chosen with cross-language compatibility in mind and may cause problems in the 
future.
For example:

decster:~/projects/hadoop-trunk> grep "SUCCESS" `find . | grep "\.proto$"`
./hadoop-common-project/hadoop-common/src/main/proto/RpcPayloadHeader.proto:  SUCCESS = 0;  // RPC succeeded
./hadoop-hdfs-project/hadoop-hdfs/src/main/proto/datatransfer.proto:    SUCCESS = 0;
./hadoop-hdfs-project/hadoop-hdfs/src/main/proto/datatransfer.proto:  SUCCESS = 0;

Since the proto files have no package (namespace) declarations, which is the case 
right now, these SUCCESS enum values end up redefined in the same scope of the 
generated C/C++ code. In fact, many of the enum names are just too short and 
ambiguous, such as the two below (a minimal sketch of a fix follows the listings):

RpcPayloadHeader.proto:

enum RpcStatusProto {
 SUCCESS = 0;  // RPC succeeded
 ERROR = 1;    // RPC Failed
 FATAL = 2;    // Fatal error - connection is closed
}

datatransfer.proto:
enum Status {
  SUCCESS = 0;
  ERROR = 1;
  ERROR_CHECKSUM = 2;
  ERROR_INVALID = 3;
  ERROR_EXISTS = 4;
  ERROR_ACCESS_TOKEN = 5;
  CHECKSUM_OK = 6;
}
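
To make the collision concrete, here is a minimal sketch; the package and prefixed 
value names are made up for illustration, not taken from the actual files. With no 
package declaration, a top-level enum and its values land in the global namespace of 
the generated C/C++ code, so including both generated headers redefines SUCCESS. A 
package declaration (mapped to a C++ namespace), ideally combined with prefixed 
value names, is one way to avoid the clash:

// RpcPayloadHeader.proto (hypothetical)
package hadoop.common;
enum RpcStatusProto {
  RPC_SUCCESS = 0;  // RPC succeeded
  RPC_ERROR = 1;    // RPC failed
  RPC_FATAL = 2;    // fatal error - connection is closed
}

// datatransfer.proto (hypothetical)
package hadoop.hdfs;
enum Status {
  DT_SUCCESS = 0;
  DT_ERROR = 1;
}

Prefixing the value names matters because proto enum values follow C++ scoping 
rules: two enums in the same package still cannot both define SUCCESS, even when a 
namespace is present.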

> ChecksumTypeProto use NULL as enum value which is illegal in C/C++
> ------------------------------------------------------------------
>
>                 Key: HDFS-4046
>                 URL: https://issues.apache.org/jira/browse/HDFS-4046
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Binglin Chang
>            Assignee: Binglin Chang
>            Priority: Minor
>         Attachments: HDFS-4046-ChecksumType-NULL-and-TestAuditLogs-bug.patch, 
> HDFS-4046-ChecksumType-NULL.patch
>
>
> I tried to write a native HDFS client using the protobuf-based protocol. When I 
> generated C++ code from hdfs.proto, the generated file could not compile, because 
> NULL is an already-defined macro.
> I am thinking of two solutions:
> 1. refactor all DataChecksum.Type.NULL references to NONE, which should be fine 
> for all languages, but this may break compatibility.
> 2. only change the protobuf definition ChecksumTypeProto.NULL to NONE, and use the 
> enum integer value (DataChecksum.Type.id) to convert between ChecksumTypeProto 
> and DataChecksum.Type, making sure the enum integer values match (they currently 
> already match).
> I can make a patch for solution 2.
>  
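
For reference, a rough sketch of what solution 2 could look like on the .proto side; 
the CRC32/CRC32C entries are assumptions that mirror DataChecksum.Type's ids, not a 
quote of the real hdfs.proto:

enum ChecksumTypeProto {
  NONE = 0;    // was NULL; the integer value stays 0, so the wire format is unchanged
  CRC32 = 1;   // assumed to match DataChecksum.Type.CRC32.id
  CRC32C = 2;  // assumed to match DataChecksum.Type.CRC32C.id
}

The Java side would then convert by the shared integer id (DataChecksum.Type.id on 
one side, the generated enum's getNumber() on the other) rather than by name, so 
renaming the protobuf symbol does not affect existing Java code.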
