[ https://issues.apache.org/jira/browse/HADOOP-18487?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17759687#comment-17759687 ]

ASF GitHub Bot commented on HADOOP-18487:
-----------------------------------------

apurtell commented on PR #4996:
URL: https://github.com/apache/hadoop/pull/4996#issuecomment-1696154985

   > we can't cut an unshaded protobuf of some form without RPC not linking, 
so hbase/hive/ozone are in trouble here.
   
   HBase branch-3 and trunk are fine; we use our own shaded protobuf 
everywhere. The major version increment allowed it.
   
   For HBase branch-2, all of the 2.x releases, and Apache Phoenix, the 
com.google.protobuf API classes are part of the RPC side of the Coprocessor 
API, so indeed these would all be in trouble in theory. Internally, even in 
the 2.x releases, we use protobuf 3 for actual RPC. The PB 2.5 classes are 
only required due to interface compatibility constraints, and therefore it's 
on us to ensure the dependencies are imported. We should not depend on Hadoop 
providing an explicit export of PB 2.5 anyway.
   
   When I made a similar change in an internal fork of Hadoop 3, I removed 
the legacy RPC classes, so linking without any PB 2.5 imports at the Hadoop 
level in any scope (compile, test, provided, whatever) was not an issue. I 
couldn't find any users of the legacy RPC classes in any of our downstream 
code. That is a much more isolated ecosystem than the public Hadoop one, so 
YMMV. Switching PB 2.5 to 'provided' would be a good, careful initial step.
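   In Maven terms, the proposed change amounts to something like the sketch 
below. This is illustrative only, not the actual hadoop-common POM; the 
com.google.protobuf:protobuf-java:2.5.0 coordinates are the real artifact, 
but where exactly the scope change lands in the Hadoop build is up to the 
patch:

   ```xml
   <!-- In hadoop-common / hadoop-hdfs: PB 2.5 stays on the compile
        classpath for Hadoop's own build, but is no longer exported
        transitively to downstream consumers. -->
   <dependency>
     <groupId>com.google.protobuf</groupId>
     <artifactId>protobuf-java</artifactId>
     <version>2.5.0</version>
     <scope>provided</scope>
   </dependency>

   <!-- A downstream app that still needs the com.google.protobuf 2.5 API
        (e.g. for a Coprocessor-style interface) must then declare the
        dependency explicitly in its own POM, at default (compile) scope. -->
   <dependency>
     <groupId>com.google.protobuf</groupId>
     <artifactId>protobuf-java</artifactId>
     <version>2.5.0</version>
   </dependency>
   ```

   That matches the point above: downstreams that expose the PB 2.5 API in 
their own interfaces own the import, rather than inheriting it from Hadoop.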




> protobuf-2.5.0 dependencies => provided
> ---------------------------------------
>
>                 Key: HADOOP-18487
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18487
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: build, ipc
>    Affects Versions: 3.3.4
>            Reporter: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>
> Uses of protobuf 2.5 and RpcEngine have been deprecated since 3.3.0 in 
> HADOOP-17046, while still keeping those files around (for a long time...). 
> How about we make the protobuf 2.5.0 export of hadoop-common and 
> hadoop-hdfs *provided*, rather than *compile*?
> That way, if apps want it for their own APIs, they have to explicitly ask 
> for it, but at least our own scans don't break.
> I have no idea what will happen to the rest of the stack at this point; it 
> will be "interesting" to see.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
