[ https://issues.apache.org/jira/browse/HADOOP-4760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12672298#action_12672298 ]
Raghu Angadi commented on HADOOP-4760:
--------------------------------------
bq. import statements are reordered and * imports are converted to actual classes
again by save actions. As in this case, the import statements are constantly
switched between actual class names and * between patches.
hmm.. I am pretty sure Eclipse can be configured not to do that (in fact, by
default it may not). If every patch includes a lot of corrections like this, it
will be pretty hard to track and maintain. There might even be constant flips
committed due to minor variations in the Eclipse configurations or JDKs used in
different environments. That is pretty error prone as well. I am -0.5 on these.
I might be biased here, since I make sure my patches are not polluted even by
minor whitespace changes. At least two separate patches would be much better.
Note that it should be fine to fix the code just around the actual code changes.
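
For illustration only (these lines are not taken from any of the attached
patches), the kind of flip a save action can introduce:

{code:java}
// One Eclipse configuration collapses explicit imports into a wildcard:
import java.util.*;

// Another configuration expands the wildcard back into explicit classes,
// so the same file keeps flipping between the two forms across patches:
import java.util.ArrayList;
import java.util.List;
{code}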
> HDFS streams should not throw exceptions when closed twice
> ----------------------------------------------------------
>
> Key: HADOOP-4760
> URL: https://issues.apache.org/jira/browse/HADOOP-4760
> Project: Hadoop Core
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 0.18.4, 0.19.1, 0.20.0, 0.21.0
> Environment: all
> Reporter: Alejandro Abdelnur
> Assignee: Enis Soztutar
> Fix For: 0.20.0
>
> Attachments: closehdfsstream_v1.patch, closehdfsstream_v2.patch
>
>
> When adding an {{InputStream}} via {{addResource(InputStream)}} to a
> {{Configuration}} instance, if the stream is an HDFS stream the
> {{loadResource(..)}} method fails with an {{IOException}} indicating that the
> stream has already been closed.
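
For reference, a minimal sketch of the idempotent-close guard the issue title
asks for; this is not the attached patch, and the wrapper class name is
hypothetical:

{code:java}
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

/** Hypothetical wrapper: makes a second close() a no-op instead of an error. */
class IdempotentCloseInputStream extends FilterInputStream {
  private boolean closed = false;   // set once close() has run

  IdempotentCloseInputStream(InputStream in) {
    super(in);
  }

  @Override
  public synchronized void close() throws IOException {
    if (closed) {
      return;                       // ignore the second close instead of throwing
    }
    closed = true;
    super.close();
  }
}
{code}

The same check, applied inside the HDFS stream implementations themselves,
would keep {{loadResource(..)}} from failing when the stream is closed twice.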