[
https://issues.apache.org/jira/browse/FLINK-1305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14235722#comment-14235722
]
Stephan Ewen commented on FLINK-1305:
-------------------------------------
How does Hadoop handle this? In many cases, Hadoop also creates a new instance
(via a public no-argument constructor) and uses that to read the data...
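For reference, Hadoop's own ReflectionUtils.newInstance does not appear to require a public constructor: it looks up the declared no-argument constructor and calls setAccessible(true), which is why NullWritable's private constructor still works inside Hadoop itself. A simplified sketch of that pattern (class name here is illustrative; Hadoop's constructor caching and Configuration injection are omitted):

{code:java}
import java.lang.reflect.Constructor;

// Simplified sketch of the ReflectionUtils-style instantiation pattern:
// fetch the declared no-argument constructor and make it accessible, so
// even classes with private constructors (e.g. NullWritable) can be
// instantiated. Hadoop additionally caches constructors per class and
// injects the Configuration into Configurable instances.
public final class WritableInstances {

    private WritableInstances() {
    }

    public static <T> T newInstance(Class<T> clazz) {
        try {
            Constructor<T> ctor = clazz.getDeclaredConstructor();
            ctor.setAccessible(true);
            return ctor.newInstance();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException("Could not instantiate " + clazz.getName(), e);
        }
    }
}
{code}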
> Flink's hadoop compatibility layer cannot handle NullWritables
> --------------------------------------------------------------
>
> Key: FLINK-1305
> URL: https://issues.apache.org/jira/browse/FLINK-1305
> Project: Flink
> Issue Type: Bug
> Components: Hadoop Compatibility
> Affects Versions: 0.7.0-incubating
> Reporter: Sebastian Schelter
> Assignee: Robert Metzger
> Priority: Critical
>
> NullWritable is a special object that is commonly used in Hadoop
> applications. NullWritable does not provide a public constructor, only a
> singleton factory method. Therefore, Flink fails when users try to read
> NullWritables from Hadoop SequenceFiles.
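One conceivable workaround on the Flink side (a hypothetical sketch, not necessarily the fix that gets committed for this issue) is to special-case value types that only expose a singleton factory, such as NullWritable, wherever the compatibility layer creates key/value instances. The helper name and placement below are made up for illustration:

{code:java}
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Writable;

// Hypothetical helper: return the NullWritable singleton instead of
// reflectively invoking a public no-argument constructor that does not exist.
public final class HadoopWritableFactory {

    private HadoopWritableFactory() {
    }

    @SuppressWarnings("unchecked")
    public static <T extends Writable> T create(Class<T> writableClass) {
        if (writableClass == NullWritable.class) {
            // NullWritable only offers a singleton accessor.
            return (T) NullWritable.get();
        }
        try {
            // Regular Writables are expected to have a public no-argument constructor.
            return writableClass.newInstance();
        } catch (InstantiationException | IllegalAccessException e) {
            throw new RuntimeException("Could not create Writable of type "
                    + writableClass.getName(), e);
        }
    }
}
{code}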