Hi David,
> It looks like Hadoop Common updated the PB RPC engine from PB2 to PB3 and
> now the default is PB3. However, both cannot be loaded into the same JVM
> at the same time. Is there a reason for this?
Thanks for raising this issue. I also ran into this while making a
brief attempt to

Hello Gang,
I'm trying to upgrade the Apache Hive project to use Hadoop 3.3.0 so that
Hive can build/run with JDK 9/11. The current Hadoop dependency for Hive
fails to run with JDK 9/11.
However, I ran into an issue: ProtobufRpcEngine2.
It looks like Hadoop Common updated the PB RPC engine from PB2 to PB3 and now
the default is PB3. However, both cannot be loaded into the same JVM at the
same time. Is there a reason for this?
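
For reference, this is roughly how an RPC engine gets picked per protocol in
Hadoop 3.3 (just a sketch; MyProtocolPB is a stand-in for whatever protocol
interface is actually being registered):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.ProtobufRpcEngine2;
import org.apache.hadoop.ipc.RPC;

public class RpcEngineSelectionSketch {
  // Stand-in for a real protocol interface; real ones typically carry
  // annotations such as @ProtocolInfo.
  public interface MyProtocolPB {}

  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // The engine is chosen per protocol through the Configuration, so the
    // older ProtobufRpcEngine and the new ProtobufRpcEngine2 (shaded
    // protobuf 3) are selected independently for different protocols.
    RPC.setProtocolEngine(conf, MyProtocolPB.class, ProtobufRpcEngine2.class);
  }
}
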
Alternatively, does anyone know how the protobuf-generated code in Hadoop was
made to extend/implement the classes from the Hadoop third-party JAR instead of
its own? I can perhaps regenerate the Hive code with the Hadoop third-party
(3P) libraries, and that might fix it.
org.apache.hadoop.thirdparty.protobuf.Gene
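
In case it helps, a quick way to check which protobuf runtime a given generated
class was built against is to print its superclass chain (diagnostic sketch
only; pass the fully qualified name of any generated message class from the
jars in question):

public class ProtoBaseCheck {
  public static void main(String[] args) throws Exception {
    // args[0] is the fully qualified name of a protobuf-generated message class.
    Class<?> c = Class.forName(args[0]);
    // Walk and print the superclass chain up to java.lang.Object.
    while (c != null) {
      System.out.println(c.getName());
      c = c.getSuperclass();
    }
  }
}

If the chain passes through org.apache.hadoop.thirdparty.protobuf.* classes,
the code was generated/compiled against Hadoop's shaded protobuf; if it only
shows com.google.protobuf.* classes, it was not.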

Hello,
For additional context, the unit tests in Hive are using MiniDFSCluster. I
suspect that is what is loading the ProtobufRpcEngine2 class, while the
local code under test uses ProtobufRpcEngine.
Is there a way to load MiniDFSCluster in its own class loader?
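For concreteness, a rough sketch of that kind of isolation (the jar list is
only a placeholder and would need the full set of MiniDFSCluster's
dependencies; the cluster would then have to be driven via reflection, since
classes from the isolated loader are different types from those on the test
classpath):

import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedMiniClusterSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder jar paths; in practice this needs the hadoop-hdfs tests jar,
    // hadoop-hdfs, hadoop-common, and their transitive dependencies.
    URL[] jars = new URL[] {
        new URL("file:/path/to/hadoop-hdfs-tests.jar"),
        new URL("file:/path/to/hadoop-hdfs.jar"),
        new URL("file:/path/to/hadoop-common.jar"),
    };
    // Parent is the platform loader, so Hadoop classes already on the test
    // classpath are not visible here; everything resolves from the jars above.
    try (URLClassLoader isolated =
             new URLClassLoader(jars, ClassLoader.getPlatformClassLoader())) {
      Class<?> miniDfs =
          Class.forName("org.apache.hadoop.hdfs.MiniDFSCluster", true, isolated);
      System.out.println("Loaded " + miniDfs.getName()
          + " via " + miniDfs.getClassLoader());
    }
  }
}
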
Thanks.

On Wed, Dec 9, 2020 at
Hello Gang,
There seems to be a new RpcEngine added as part of Hadoop 3.3. I'm trying to
upgrade Hive to use Hadoop 3.3, and Hive uses the RpcEngine mechanism. However,
the Hive tests are failing, complaining about:
"ReRegistration of rpcKind"
I believe this is because both classes register, sta