[ https://issues.apache.org/jira/browse/HADOOP-15928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685787#comment-16685787 ]

Pranay Singh commented on HADOOP-15928:
---------------------------------------

[~ste...@apache.org] this problem is not particular to Impala; it is seen every 
time a file is opened (via hdfsOpenFileImpl()) in an S3 environment: an 
error/exception (below) is logged to STDERR, which is unwarranted. The error is 
generated because hdfsOpenFileImpl() calls readDirect() to do a byte-buffer 
read, and that call results in this exception.

Message dumped to STDERR
----------------------------------------- 
UnsupportedOperationException: Byte-buffer read unsupported by input stream
java.lang.UnsupportedOperationException: Byte-buffer read unsupported by input stream
        at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:150)
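
For reference, here is a minimal sketch (not the actual libhdfs code; the method 
name readPossiblyDirect() is illustrative) of how a caller can check for 
ByteBufferReadable support before attempting the direct read, and fall back to a 
plain byte[] read instead of letting the exception surface:

import java.io.IOException;
import java.nio.ByteBuffer;
import org.apache.hadoop.fs.ByteBufferReadable;
import org.apache.hadoop.fs.FSDataInputStream;

public class DirectReadProbe {
  // Use the read(ByteBuffer) path only when the wrapped stream implements
  // ByteBufferReadable; otherwise copy through a heap array so that no
  // UnsupportedOperationException is thrown or logged.
  static int readPossiblyDirect(FSDataInputStream in, ByteBuffer buf)
      throws IOException {
    if (in.getWrappedStream() instanceof ByteBufferReadable) {
      return in.read(buf);                   // direct byte-buffer read
    }
    byte[] tmp = new byte[buf.remaining()];  // fallback copy path
    int n = in.read(tmp, 0, tmp.length);
    if (n > 0) {
      buf.put(tmp, 0, n);
    }
    return n;
  }
}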

Writing a test case will require access to S3, which requires AWS credentials. 
I have verified the fix with a manual test using my own AWS keys (which cannot 
be shared).

> Excessive error logging when using HDFS in S3 environment
> ---------------------------------------------------------
>
>                 Key: HADOOP-15928
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15928
>             Project: Hadoop Common
>          Issue Type: Improvement
>            Reporter: Pranay Singh
>            Assignee: Pranay Singh
>            Priority: Major
>         Attachments: HADOOP-15928.001.patch
>
>
> Problem:
> ------------
> There is excessive error logging when Impala uses HDFS in an S3 environment. 
> The issue is caused by defect HADOOP-14603, "S3A input stream to support 
> ByteBufferReadable".
> The excessive error logging results in defect IMPALA-5256, "ERROR log files 
> can get very large", because the following message is printed repeatedly in 
> the error log:
> UnsupportedOperationException: Byte-buffer read unsupported by input stream
> java.lang.UnsupportedOperationException: Byte-buffer read unsupported by input stream
>         at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:150)
> Root cause
> ----------------
> After investigating the issue, it appears that the above exception is printed 
> because, when a file is opened, hdfsOpenFileImpl() calls readDirect(), which 
> hits this exception.
> Fix:
> ----
> Since the hdfs client does not explicitly initiate the byte-buffer read (it 
> happens implicitly), we should not generate the error log while opening a 
> file.
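
A minimal illustration of the approach described under "Fix" above (this is a 
sketch, not the actual patch; supportsByteBufferRead() is an illustrative name): 
probe the byte-buffer read path once at open time and treat the 
UnsupportedOperationException as an expected, non-error outcome:

import java.io.IOException;
import java.nio.ByteBuffer;
import org.apache.hadoop.fs.FSDataInputStream;

class ByteBufferReadProbe {
  // Probe whether the stream supports read(ByteBuffer). Streams that do not
  // implement ByteBufferReadable (e.g. S3A before HADOOP-14603) throw
  // UnsupportedOperationException; treat that as "not supported" instead of
  // logging it at ERROR level.
  static boolean supportsByteBufferRead(FSDataInputStream in) {
    try {
      in.read(ByteBuffer.allocate(0));  // zero-length probe read
      return true;
    } catch (UnsupportedOperationException e) {
      return false;                     // expected; no error log needed
    } catch (IOException e) {
      return false;
    }
  }
}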


