[ https://issues.apache.org/jira/browse/HDFS-15101?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran resolved HDFS-15101.
-----------------------------------
    Resolution: Cannot Reproduce

> Only few files[5] created with empty content out of 10000 files ingested to hdfs
> --------------------------------------------------------------------------------
>
>                 Key: HDFS-15101
>                 URL: https://issues.apache.org/jira/browse/HDFS-15101
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: hdfs
>         Environment: Production
>            Reporter: Chandrashekar S
>            Priority: Major
>
> When ingesting files through Spark Streaming, we are finding that a few files are
> created empty, while 99.9% of the files are created properly with their contents.
> In the YARN logs we found that an interruption is raised at each failure to write
> contents to a file.
>  
> 20/01/08 16:43:16 INFO DFSClient: Exception in createBlockOutputStream
> java.io.InterruptedIOException: Interrupted while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.136.184.59:51154 remote=/10.136.184.59:1019]. 75000 millis timeout left.
>  at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:342)
>  at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>  at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>  at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>  at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
>  at java.io.FilterInputStream.read(FilterInputStream.java:83)
>  at java.io.FilterInputStream.read(FilterInputStream.java:83)
>  at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2390)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1455)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1374)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:552)
> 20/01/08 16:43:16 INFO BlockManager: Removing RDD 6
> 20/01/08 16:43:16 INFO DFSClient: Abandoning BP-383742638-10.136.184.33-1429219667936:blk_2355453392_1283401139
> 20/01/08 16:43:16 WARN Client: interrupted waiting to send rpc request to server
> java.lang.InterruptedException
>  at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
>  at java.util.concurrent.FutureTask.get(FutureTask.java:191)
>  at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1094)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1457)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1398)
>  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
>  at com.sun.proxy.$Proxy17.abandonBlock(Unknown Source)
>  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:436)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
>  at com.sun.proxy.$Proxy18.abandonBlock(Unknown Source)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1378)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:552)
> 20/01/08 16:43:16 WARN DFSClient: DataStreamer Exception
> java.io.IOException: java.lang.InterruptedException
>  at org.apache.hadoop.ipc.Client.call(Client.java:1463)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1398)
>  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
>  at com.sun.proxy.$Proxy17.abandonBlock(Unknown Source)
>  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:436)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
>  at com.sun.proxy.$Proxy18.abandonBlock(Unknown Source)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1378)
>  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:552)
> Caused by: java.lang.InterruptedException
>  at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
>  at java.util.concurrent.FutureTask.get(FutureTask.java:191)
>  at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1094)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1457)
>  ... 14 more
>  
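
For anyone who hits the same symptom: the trace above shows the DataStreamer being interrupted while setting up a block output stream, after which the file can end up closed with zero bytes. Below is a minimal, illustrative sketch (not from the report) of a client-side guard that writes, then verifies the visible file length and retries once if the file came out empty. The class and method names (VerifiedHdfsWrite, writeAndVerify) and the single-retry policy are assumptions made for illustration; only the standard org.apache.hadoop.fs API is used.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Illustrative helper (hypothetical): write a payload, then confirm the file's
// visible length, retrying once if the write was cut short (e.g. by an interrupt).
public class VerifiedHdfsWrite {

    public static void writeAndVerify(FileSystem fs, Path target, byte[] payload)
            throws IOException {
        for (int attempt = 0; attempt < 2; attempt++) {
            try (FSDataOutputStream out = fs.create(target, true /* overwrite */)) {
                out.write(payload);
                out.hflush(); // push the bytes to the datanode pipeline before close
            }
            long visibleLength = fs.getFileStatus(target).getLen();
            if (visibleLength == payload.length) {
                return; // contents made it to HDFS
            }
            // Empty or short file: the write was probably interrupted; try once more.
        }
        throw new IOException("File " + target + " is still empty/short after retry");
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            writeAndVerify(fs, new Path("/tmp/verified-write.txt"),
                    "sample payload".getBytes(StandardCharsets.UTF_8));
        }
    }
}

In a Spark Streaming job the same check can run inside the task that writes each file, so an interrupted task either leaves a complete file behind or fails visibly, rather than leaving a zero-length file that is only noticed later.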



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

