[ https://issues.apache.org/jira/browse/HIVE-11339?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zoltan Haindrich updated HIVE-11339:
------------------------------------
    Status: Patch Available  (was: Open)

I've removed the write(OutputStream) method because it was not declared on any 
upstream interface and was only used from a test; I've updated that test to use 
the public API instead.
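
As a minimal, hypothetical sketch (not the actual patch or test), serializing 
through the public Writable API rather than the removed overload would look 
roughly like this; the class name and setup around the call are assumptions:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.sql.Timestamp;

import org.apache.hadoop.hive.serde2.io.TimestampWritable;

public class TimestampWritableWriteSketch {
    public static void main(String[] args) throws IOException {
        TimestampWritable tw =
            new TimestampWritable(new Timestamp(System.currentTimeMillis()));

        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        // Wrap the raw stream in a DataOutputStream so the standard
        // Writable.write(DataOutput) entry point is used instead of the
        // removed write(OutputStream) overload.
        try (DataOutputStream dos = new DataOutputStream(baos)) {
            tw.write(dos);
        }
        System.out.println("serialized " + baos.size() + " bytes");
    }
}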

> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) 
> makes incorrect cast
> ---------------------------------------------------------------------------------------------
>
>                 Key: HIVE-11339
>                 URL: https://issues.apache.org/jira/browse/HIVE-11339
>             Project: Hive
>          Issue Type: Bug
>          Components: Serializers/Deserializers
>    Affects Versions: 0.14.0
>            Reporter: Arnaud Linz
>            Assignee: Zoltan Haindrich
>              Labels: easyfix, newbie
>         Attachments: HIVE-11339.patch
>
>
> Hi, it's my first Jira and I don't know how to make patches, so I'll explain 
> the issue in the description as it is rather simple.
> I have a problem serializing "DefaultHCatRecord" using Apache Flink when 
> those records include Timestamps because of an incorrect class cast in 
> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out). It 
> is implemented with a cast to OutputStream:
> public void write(DataOutput out) throws IOException {
>     write((OutputStream) out);
>  }
> but nothing guarantees that a DataOutput object is an OutputStream (and it is 
> not the case in Flink); it should instead be implemented with the same code as 
> write(OutputStream):
> {
> checkBytes();
> out.write(currentBytes, offset, getTotalLength());
> }
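
To make the failure mode concrete, here is a minimal sketch (not Flink or Hive 
code) showing that a DataOutput need not be an OutputStream, so the cast in the 
current write(DataOutput) fails at runtime. RandomAccessFile is used only 
because it implements DataOutput without extending OutputStream:

import java.io.DataOutput;
import java.io.File;
import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;

public class DataOutputCastDemo {
    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("cast-demo", ".bin");
        tmp.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(tmp, "rw")) {
            DataOutput out = raf;
            out.writeInt(42);                     // fine: DataOutput API
            OutputStream os = (OutputStream) out; // same cast as the buggy method
            os.flush();
        } catch (ClassCastException e) {
            System.out.println("DataOutput is not necessarily an OutputStream: " + e);
        }
    }
}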



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
