[
https://issues.apache.org/jira/browse/HADOOP-15928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16687270#comment-16687270
]
Steve Loughran commented on HADOOP-15928:
-----------------------------------------
Move the JIRA to HDFS, ask the team there what they expect.
> libhdfs logs errors when opened FS doesn't support ByteBufferReadable
> ---------------------------------------------------------------------
>
> Key: HADOOP-15928
> URL: https://issues.apache.org/jira/browse/HADOOP-15928
> Project: Hadoop Common
> Issue Type: Improvement
> Components: hdfs-client
> Reporter: Pranay Singh
> Assignee: Pranay Singh
> Priority: Major
> Labels: libhdfs
> Fix For: 3.0.3
>
> Attachments: HADOOP-15928.001.patch
>
>
> Problem:
> ------------
> There is excessive error logging when a file is opened by libhdfs
> (DFSClient/HDFS) in an S3 environment. The issue arises because byte-buffer
> reads are not supported by the S3A input stream; see HADOOP-14603, "S3A input
> stream to support ByteBufferReadable".
> The following message is printed repeatedly to the error log/STDERR:
> --------------------------------------------------------------------------------------------------
> java.lang.UnsupportedOperationException: Byte-buffer read unsupported by input stream
> at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:150)
> Root cause
> ----------------
> After investigating the issue, it appears that the above exception is printed
> because, when a file is opened via hdfsOpenFileImpl(), libhdfs calls
> readDirect(), which hits this exception whenever the underlying stream does
> not implement ByteBufferReadable.
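The delegation that produces this exception can be sketched as follows. This is a standalone model, not the real Hadoop API: the interface and class names mirror org.apache.hadoop.fs.ByteBufferReadable and FSDataInputStream, but everything here is a simplified stand-in.

```java
import java.nio.ByteBuffer;

// Stand-in for org.apache.hadoop.fs.ByteBufferReadable.
interface ByteBufferReadable {
    int read(ByteBuffer buf);
}

// Stand-in for a wrapped stream that does NOT support byte-buffer reads,
// e.g. the S3A input stream before HADOOP-14603.
class PlainStream {
}

// Stand-in for a wrapped stream that does support byte-buffer reads.
class ByteBufferStream extends PlainStream implements ByteBufferReadable {
    public int read(ByteBuffer buf) {
        buf.put((byte) 42);  // pretend we read one byte
        return 1;
    }
}

// Models the delegation in FSDataInputStream.read(ByteBuffer): forward the
// call only when the wrapped stream implements ByteBufferReadable, otherwise
// throw -- this is the path readDirect() hits for S3A.
class FsDataInputStreamModel {
    private final PlainStream wrapped;

    FsDataInputStreamModel(PlainStream in) {
        wrapped = in;
    }

    int read(ByteBuffer buf) {
        if (wrapped instanceof ByteBufferReadable) {
            return ((ByteBufferReadable) wrapped).read(buf);
        }
        throw new UnsupportedOperationException(
            "Byte-buffer read unsupported by input stream");
    }
}
```

Since readDirect() performs exactly this kind of read during open, any filesystem whose stream lacks the interface triggers the exception on every open.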
> Fix:
> ----
> Since the HDFS client is not explicitly initiating the byte-buffer read (it
> happens implicitly as part of opening the file), we should not log an error
> when a file is opened.
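A minimal sketch of the intended behavior (hypothetical names; the actual fix lives in libhdfs's hdfsOpenFileImpl()/readDirect() C code): record once, at open time, whether the stream supports direct byte-buffer reads, and fall back to a plain byte[] read silently instead of surfacing the UnsupportedOperationException in the error log.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;

// Hypothetical model of the proposed fix: probe byte-buffer support once
// when the file is opened, then read without ever logging an error.
class OpenedFileModel {
    private final InputStream in;
    private final boolean directReadSupported;  // determined at open time

    OpenedFileModel(InputStream in, boolean supportsByteBuffer) {
        this.in = in;
        this.directReadSupported = supportsByteBuffer;
    }

    // Reads into buf; silently falls back to a byte[] copy when the
    // underlying stream cannot do direct byte-buffer reads.
    int read(ByteBuffer buf) throws IOException {
        if (!directReadSupported) {
            byte[] tmp = new byte[buf.remaining()];
            int n = in.read(tmp);
            if (n > 0) {
                buf.put(tmp, 0, n);
            }
            return n;
        }
        // Direct byte-buffer path omitted; not modeled in this sketch.
        throw new UnsupportedOperationException("not modeled");
    }
}
```

With the support flag cached at open, the unsupported case becomes an ordinary code path rather than a repeatedly logged exception.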
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]