[ https://issues.apache.org/jira/browse/FLINK-1076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14124697#comment-14124697 ]

ASF GitHub Bot commented on FLINK-1076:
---------------------------------------

Github user fhueske commented on the pull request:

    https://github.com/apache/incubator-flink/pull/37#issuecomment-54731181
  
    @atsikiridis I think we can close this PR for now.
    
    Supporting complete Hadoop jobs requires a bit more work; at a minimum, 
the combiner should work.
    I am waiting for comments on PR #108, which is required to add custom 
combiners to the Hadoop job operation.
    
    Parts of this PR (wrappers for iterators and collectors, dummy reporters, 
etc.) can be added in a new PR which addresses FLINK-1076.


>  Support function-level compatibility for Hadoop's wrapper functions
> ----------------------------------------------------------------------
>
>                 Key: FLINK-1076
>                 URL: https://issues.apache.org/jira/browse/FLINK-1076
>             Project: Flink
>          Issue Type: New Feature
>          Components: Hadoop Compatibility
>    Affects Versions: 0.7-incubating
>            Reporter: Artem Tsikiridis
>            Assignee: Artem Tsikiridis
>              Labels: features
>
> While the Flink wrappers for Hadoop Map and Reduce tasks are implemented in 
> https://github.com/apache/incubator-flink/pull/37, it is currently not 
> possible to use the {{HadoopMapFunction}} and the {{HadoopReduceFunction}} 
> without a {{JobConf}}. It would be useful if we could specify a Hadoop 
> Mapper, Reducer (or Combiner) and use them as separate components in a Flink 
> Job.
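
The adapter idea described above (wrapping a Hadoop-style mapper so it can be used as a stand-alone Flink function, without a JobConf) can be sketched as follows. This is a hypothetical, self-contained illustration: the interfaces below are simplified stand-ins invented for this sketch, not the real Flink {{HadoopMapFunction}} or Hadoop {{Mapper}} APIs.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: these simplified interfaces stand in for
// Hadoop's Mapper and Flink's FlatMapFunction; they are NOT the real APIs.
public class MapperWrapperSketch {

    // Minimal key/value pair, analogous to Flink's Tuple2.
    static final class Pair<K, V> {
        final K key;
        final V value;
        Pair(K key, V value) { this.key = key; this.value = value; }
    }

    // Hadoop-style mapper: emits output pairs through a collector.
    interface HadoopStyleMapper<K1, V1, K2, V2> {
        void map(K1 key, V1 value, List<Pair<K2, V2>> output);
    }

    // Flink-style function: one input record in, zero or more records out.
    interface FlinkStyleFlatMap<IN, OUT> {
        void flatMap(IN record, List<OUT> out);
    }

    // The wrapper: adapts a Hadoop-style mapper into a Flink-style
    // function, with no JobConf required.
    static <K1, V1, K2, V2> FlinkStyleFlatMap<Pair<K1, V1>, Pair<K2, V2>>
            wrap(HadoopStyleMapper<K1, V1, K2, V2> mapper) {
        return (record, out) -> mapper.map(record.key, record.value, out);
    }

    public static void main(String[] args) {
        // A trivial word-count style mapper: emits (token, 1) per token.
        HadoopStyleMapper<Long, String, String, Integer> tokenizer =
            (offset, line, output) -> {
                for (String token : line.split("\\s+")) {
                    output.add(new Pair<>(token, 1));
                }
            };

        FlinkStyleFlatMap<Pair<Long, String>, Pair<String, Integer>> fn =
            wrap(tokenizer);

        List<Pair<String, Integer>> out = new ArrayList<>();
        fn.flatMap(new Pair<>(0L, "hello hadoop hello flink"), out);
        System.out.println(out.size()); // 4 pairs emitted
    }
}
```

The same adapter pattern extends to reducers, where the wrapper would additionally translate between Hadoop's iterator/collector style and Flink's grouped-reduce interface.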



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
