[
https://issues.apache.org/jira/browse/HDFS-11026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15862236#comment-15862236
]
Chris Douglas commented on HDFS-11026:
--------------------------------------
bq. The non-deterministic decoding exception needs to be resolved before
integration.
I'm having trouble reproducing it. [~ehiggs], what version of the Oracle JDK
are you using? I tried Oracle 1.8.0_92, 1.8.0_121, and OpenJDK 1.8.0_121; all
throw {{NegativeArraySizeException}}.
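For reference, a minimal self-contained sketch of how this class of failure can arise (this is illustrative only, not the actual {{BlockTokenIdentifier}} code): a legacy fixed-format reader that interprets Protobuf wire bytes as its own length-prefixed payload can read a negative length and fail with {{NegativeArraySizeException}}.
{code:java}
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

// Illustrative only: a legacy reader that expects a 4-byte length prefix
// followed by the payload, applied to bytes it did not write itself
// (e.g. Protobuf wire format). The misread length comes out negative and
// the array allocation throws NegativeArraySizeException.
public class LegacyDecodeSketch {
  public static void main(String[] args) throws IOException {
    // Hypothetical buffer standing in for an encoded token blob.
    byte[] wire = {(byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xf6, 0x0a, 0x01};
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(wire));

    int len = in.readInt();         // misreads 0xfffffff6 as -10
    byte[] payload = new byte[len]; // NegativeArraySizeException
    in.readFully(payload);
  }
}
{code}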
> Convert BlockTokenIdentifier to use Protobuf
> --------------------------------------------
>
> Key: HDFS-11026
> URL: https://issues.apache.org/jira/browse/HDFS-11026
> Project: Hadoop HDFS
> Issue Type: Task
> Components: hdfs, hdfs-client
> Affects Versions: 2.9.0, 3.0.0-alpha1
> Reporter: Ewan Higgs
> Assignee: Ewan Higgs
> Fix For: 3.0.0-alpha3
>
> Attachments: blocktokenidentifier-protobuf.patch,
> HDFS-11026.002.patch, HDFS-11026.003.patch, HDFS-11026.004.patch,
> HDFS-11026.005.patch, HDFS-11026.006.patch
>
>
> {{BlockTokenIdentifier}} currently uses a {{DataInput}}/{{DataOutput}}
> (basically a {{byte[]}}) and manual serialization to get data into and out of
> the encrypted buffer (in {{BlockKeyProto}}). Other TokenIdentifiers (e.g.
> {{ContainerTokenIdentifier}}, {{AMRMTokenIdentifier}}) use Protobuf. The
> {{BlockTokenIdentifier}} should use Protobuf as well so it can be expanded
> more easily and will be consistent with the rest of the system.
> NB: Releasing this will require a version bump, since 2.8.x won't be able
> to decipher {{BlockKeyProto.keyBytes}} written by 2.8.y.
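A minimal sketch of what the Protobuf-backed serialization described in the quoted issue could look like. The message name {{BlockTokenSecretProto}} and its fields are assumptions for illustration, not taken from the attached patches; generated Protobuf messages expose {{newBuilder()}}, {{toByteArray()}} and {{parseFrom(byte[])}}, which {{write}}/{{readFields}} can delegate to instead of the manual field-by-field encoding.
{code:java}
// Sketch only. Assumes a message roughly like the following is added to the
// HDFS .proto files (name and fields are illustrative, not from the patch):
//
//   message BlockTokenSecretProto {
//     optional uint64 expiryDate  = 1;
//     optional uint32 keyId       = 2;
//     optional string userId      = 3;
//     optional string blockPoolId = 4;
//     optional uint64 blockId     = 5;
//   }

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

public class BlockTokenIdentifierSketch {
  private long expiryDate;
  private int keyId;
  private String userId;
  private String blockPoolId;
  private long blockId;

  public void write(DataOutput out) throws IOException {
    byte[] bytes = BlockTokenSecretProto.newBuilder()
        .setExpiryDate(expiryDate)
        .setKeyId(keyId)
        .setUserId(userId)
        .setBlockPoolId(blockPoolId)
        .setBlockId(blockId)
        .build()
        .toByteArray();
    out.writeInt(bytes.length);   // illustrative framing; the actual patch may differ
    out.write(bytes);
  }

  public void readFields(DataInput in) throws IOException {
    byte[] bytes = new byte[in.readInt()];
    in.readFully(bytes);
    BlockTokenSecretProto proto = BlockTokenSecretProto.parseFrom(bytes);
    expiryDate = proto.getExpiryDate();
    keyId = proto.getKeyId();
    userId = proto.getUserId();
    blockPoolId = proto.getBlockPoolId();
    blockId = proto.getBlockId();
  }
}
{code}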