Hi Sudhir,

Sorry for the late reply; I was on vacation and had limited access to 
the internet.


Usually this error message means that the connection between the HDFS client 
and the server is broken.
1. How did you write the data to HDFS (i.e., with libhdfs3, or with other 
libraries or applications) before accessing it with libhdfs3?
2. What is the format of the data (e.g., Parquet, ORC)?
3. To dig deeper, you might need to go through how HAWQ uses libhdfs3 to access 
data on HDFS, and then check whether there is any difference from your program.


Best regards,
Ruilong Huo

At 2019-02-08 09:36:37, "Sudhir Babu Pothineni" <[email protected]> wrote:
Hi Ruilong,


It’s through my own program.


Thanks
Sudhir

On Feb 7, 2019, at 7:11 PM, Ruilong Huo <[email protected]> wrote:


Hi Sudhir,


Please let us know whether you are accessing HDFS through HAWQ or through your own program.


Best regards,
Ruilong Huo


At 2019-02-08 07:37:47, "Yi JIN" <[email protected]> wrote:
Sudhir, you are welcome ;) I guess some folks are away for the Lunar New Year 
holiday; I am pinging them to get a response to you asap. 


Yi


On Fri, Feb 8, 2019 at 12:52 AM Sudhir Babu Pothineni <[email protected]> 
wrote:

Thanks Yi!

We are using hadoop-2.6.0-cdh5.14.0



On Feb 7, 2019, at 5:21 AM, Yi JIN <[email protected]> wrote:


Hi Sudhir,


I think topics about libhdfs3 belong here. Can you provide the full 
version number of the HDFS you are using? Thanks.


Best
Yi (yjin)


On Thu, Feb 7, 2019 at 7:56 AM Sudhir Babu Pothineni <[email protected]> 
wrote:

Don’t know if this is the right place to post this. I am using libhdfs3 from the 
HAWQ bundle, and I am getting the following error very frequently.


Any suggestions?


20190206 09:05:46:683402 DETAIL - RemoteBlockReader.cpp: 332: HdfsIOException: 
RemoteBlockReader: failed to read Block: [block pool ID: 
BP-408958698-192.168.93.200-1519838163456 block ID 1074022852_282029] from 
Datanode: hostXXX.XXX.com(192.168.93.204).
20190206 09:05:46:683411 DETAIL - 
@Hdfs::Internal::RemoteBlockReader::read(char*, int)
20190206 09:05:46:683420 DETAIL - 
@Hdfs::Internal::InputStreamImpl::readOneBlock(char*, int, bool)
20190206 09:05:46:683427 DETAIL - 
@Hdfs::Internal::InputStreamImpl::readInternal(char*, int)
20190206 09:05:46:683435 DETAIL - @Hdfs::Internal::InputStreamImpl::read(char*, 
int)
20190206 09:05:46:683441 DETAIL - @hdfsRead


.....
20190206 09:05:46:683878 DETAIL - Caused by
20190206 09:05:46:683886 DETAIL - TcpSocket.cpp: 69: HdfsNetworkException: Read 
3040 bytes failed from "192.168.93.204:50010": (errno: 104) Connection reset by 
peer
20190206 09:05:46:683894 DETAIL - @Hdfs::Internal::TcpSocketImpl::read(char*, 
int)
20190206 09:05:46:683901 DETAIL - 
@Hdfs::Internal::TcpSocketImpl::readFully(char*, int, int)
20190206 09:05:46:683908 DETAIL - 
@Hdfs::Internal::RemoteBlockReader::readNextPacket()
20190206 09:05:46:683916 DETAIL - 
@Hdfs::Internal::RemoteBlockReader::read(char*, int)
20190206 09:05:46:683923 DETAIL - 
@Hdfs::Internal::InputStreamImpl::readOneBlock(char*, int, bool)
20190206 09:05:46:683931 DETAIL - 
@Hdfs::Internal::InputStreamImpl::readInternal(char*, int)
20190206 09:05:46:683938 DETAIL - @Hdfs::Internal::InputStreamImpl::read(char*, 
int)
20190206 09:05:46:683945 DETAIL - @hdfsRead
.....
, retry read again from another Datanode.
