[ https://issues.apache.org/jira/browse/FLINK-4315?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15622515#comment-15622515 ]

ASF GitHub Bot commented on FLINK-4315:
---------------------------------------

Github user fhueske commented on the issue:

    https://github.com/apache/flink/pull/2637
  
    Thanks for the update @kenmy. We are trying to keep the Java and Scala APIs 
as close as possible. Could you convert the Scala `FlinkHadoopEnvironment` into 
a `HadoopInputs` class as well?
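    
    For illustration, a minimal sketch of what the Scala counterpart could look 
like, mirroring the Java `HadoopInputs` helper from this PR. The package, object 
name, and signature below are assumptions, not the final API:
    
    ```scala
    import org.apache.flink.api.scala.hadoop.mapred.HadoopInputFormat
    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.mapred.{FileInputFormat, JobConf}
    
    // Hypothetical Scala mirror of the Java HadoopInputs helper class.
    object HadoopInputs {
    
      // Wraps a Hadoop mapred FileInputFormat so that it can be passed to
      // ExecutionEnvironment.createInput(...).
      def readHadoopFile[K, V](
          mapredInputFormat: FileInputFormat[K, V],
          key: Class[K],
          value: Class[V],
          inputPath: String): HadoopInputFormat[K, V] = {
        val job = new JobConf()
        FileInputFormat.addInputPath(job, new Path(inputPath))
        new HadoopInputFormat[K, V](mapredInputFormat, key, value, job)
      }
    }
    ```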
    
    I also noticed that there are quite a few Hadoop-related tests in the 
`flink-tests` module. I think it would be good to move the tests from the 
`org.apache.flink.test.hadoop` and `org.apache.flink.api.scala.hadoop` packages 
of `flink-tests` to `flink-hadoop-compatibility`. 
    
    In fact, there might be some overlap with the tests that already exist in 
`flink-hadoop-compatibility`. It would be great if you could check for 
overlapping test coverage, so that we can drop the redundant tests.
    
    Thanks for your work,
    Fabian


> Deprecate Hadoop dependent methods in flink-java
> ------------------------------------------------
>
>                 Key: FLINK-4315
>                 URL: https://issues.apache.org/jira/browse/FLINK-4315
>             Project: Flink
>          Issue Type: Task
>          Components: Java API
>            Reporter: Stephan Ewen
>            Assignee: Evgeny Kincharov
>             Fix For: 2.0.0
>
>
> The API projects should be independent of Hadoop, because Hadoop is not an 
> integral part of the Flink stack, and we should have the option to offer 
> Flink without Hadoop dependencies.
> The current batch APIs have a hard dependency on Hadoop, mainly because the 
> API has utility methods like `readHadoopFile(...)`.
> I suggest deprecating those methods and adding helpers in the 
> `flink-hadoop-compatibility` project, as sketched below.
> FLINK-4048 will later remove the deprecated methods.
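
A minimal sketch of the intended migration, shown with the Scala API to match 
the discussion above. The `HadoopInputs` import path is an assumption about 
where the helper will live in `flink-hadoop-compatibility`, and the input path 
is a placeholder:

```scala
import org.apache.flink.api.scala._
// Assumed location of the new helper in flink-hadoop-compatibility:
import org.apache.flink.hadoopcompatibility.scala.HadoopInputs
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapred.TextInputFormat

object HadoopInputsMigration {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Deprecated style: the Hadoop utility method is baked into the core API.
    // val lines = env.readHadoopFile(
    //   new TextInputFormat, classOf[LongWritable], classOf[Text], "hdfs:///input")

    // Replacement style: the helper comes from flink-hadoop-compatibility,
    // so the core batch API no longer needs a Hadoop dependency.
    val lines = env.createInput(HadoopInputs.readHadoopFile(
      new TextInputFormat, classOf[LongWritable], classOf[Text], "hdfs:///input"))

    lines.map(_._2.toString).first(10).print()
  }
}
```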



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
