[jira] [Updated] (HDFS-5637) try to refeatchToken while local read InvalidToken occurred

2014-02-03 Thread Kihwal Lee (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-5637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kihwal Lee updated HDFS-5637:
-

Fix Version/s: 0.23.11

> try to refeatchToken while local read InvalidToken occurred
> ---
>
> Key: HDFS-5637
> URL: https://issues.apache.org/jira/browse/HDFS-5637
> Project: Hadoop HDFS
>  Issue Type: Improvement
>  Components: hdfs-client, security
>Affects Versions: 2.0.5-alpha, 2.2.0
>Reporter: Liang Xie
>Assignee: Liang Xie
> Fix For: 0.23.11, 2.3.0
>
> Attachments: HDFS-5637-v2.txt, HDFS-5637.txt
>
>
> We observed several warning logs like the one below from region server nodes:
> 2013-12-05,13:22:26,042 WARN org.apache.hadoop.hdfs.DFSClient: Failed to connect to /10.2.201.110:11402 for block, add to deadNodes and continue.
> org.apache.hadoop.security.token.SecretManager$InvalidToken: Block token with block_token_identifier (expiryDate=1386060141977, keyId=-333530248, userId=hbase_srv, blockPoolId=BP-1310313570-10.101.10.66-1373527541386, blockId=-190217754078101701, access modes=[READ]) is expired.
> at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:280)
> at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.checkAccess(BlockPoolTokenSecretManager.java:88)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.checkBlockToken(DataNode.java:1082)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.getBlockLocalPathInfo(DataNode.java:1033)
> at org.apache.hadoop.hdfs.protocolPB.ClientDatanodeProtocolServerSideTranslatorPB.getBlockLocalPathInfo(ClientDatanodeProtocolServerSideTranslatorPB.java:112)
> at org.apache.hadoop.hdfs.protocol.proto.ClientDatanodeProtocolProtos$ClientDatanodeProtocolService$2.callBlockingMethod(ClientDatanodeProtocolProtos.java:5104)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
> org.apache.hadoop.security.token.SecretManager$InvalidToken: Block token with block_token_identifier (expiryDate=1386060141977, keyId=-333530248, userId=hbase_srv, blockPoolId=BP-1310313570-10.101.10.66-1373527541386, blockId=-190217754078101701, access modes=[READ]) is expired.
> at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:280)
> at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.checkAccess(BlockPoolTokenSecretManager.java:88)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.checkBlockToken(DataNode.java:1082)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.getBlockLocalPathInfo(DataNode.java:1033)
> at org.apache.hadoop.hdfs.protocolPB.ClientDatanodeProtocolServerSideTranslatorPB.getBlockLocalPathInfo(ClientDatanodeProtocolServerSideTranslatorPB.java:112)
> at org.apache.hadoop.hdfs.protocol.proto.ClientDatanodeProtocolProtos$ClientDatanodeProtocolService$2.callBlockingMethod(ClientDatanodeProtocolProtos.java:5104)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
> at org.apac
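
For context, the improvement tracked here is to have the HDFS client, when a local (short-circuit) read fails with SecretManager$InvalidToken (for example an expired block token as in the log above), refetch the block token and retry the read instead of adding the DataNode to deadNodes and falling back to remote reads. The following is only a minimal, hypothetical sketch of that retry pattern; the names LocalReader, TokenRefetcher, InvalidTokenException, and readWithTokenRetry are illustrative stand-ins, not the actual DFSInputStream code or the attached patch.

/*
 * Illustrative sketch only. The real change lives in the HDFS client read
 * path; these interfaces are hypothetical stand-ins used to show the
 * "refetch token and retry" idea instead of marking the local node dead.
 */
import java.io.IOException;

public class LocalReadRetrySketch {

  /** Hypothetical stand-in for the local (short-circuit) block read path. */
  interface LocalReader {
    int read(byte[] buf) throws IOException;
  }

  /** Hypothetical stand-in for refreshing block locations/tokens from the NameNode. */
  interface TokenRefetcher {
    void refetchBlockToken() throws IOException;
  }

  /** Hypothetical exception mirroring SecretManager.InvalidToken. */
  static class InvalidTokenException extends IOException {
    InvalidTokenException(String msg) { super(msg); }
  }

  /**
   * Attempt a local read; on an expired/invalid block token, refetch the
   * token once and retry, rather than adding the local DataNode to deadNodes.
   */
  static int readWithTokenRetry(LocalReader reader, TokenRefetcher refetcher,
                                byte[] buf) throws IOException {
    boolean refetched = false;
    while (true) {
      try {
        return reader.read(buf);
      } catch (InvalidTokenException e) {
        if (refetched) {
          throw e;                        // second failure: give up and surface the error
        }
        refetcher.refetchBlockToken();    // ask the NameNode for fresh block info and token
        refetched = true;                 // retry the local read exactly once
      }
    }
  }
}

The point of the sketch is that an expired block token is treated as a recoverable condition and retried once after refreshing block information, rather than being treated as a dead node.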

[jira] [Updated] (HDFS-5637) try to refeatchToken while local read InvalidToken occurred

2013-12-11 Thread Junping Du (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-5637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Junping Du updated HDFS-5637:
-

   Resolution: Fixed
Fix Version/s: 2.4.0
   Status: Resolved  (was: Patch Available)

I have committed this to trunk and branch-2. Thanks Liang for the patch, and Stack and Colin for the review!

> try to refeatchToken while local read InvalidToken occurred
> ---
>
> Key: HDFS-5637
> URL: https://issues.apache.org/jira/browse/HDFS-5637
> Project: Hadoop HDFS
>  Issue Type: Improvement
>  Components: hdfs-client, security
>Affects Versions: 2.0.5-alpha, 2.2.0
>Reporter: Liang Xie
>Assignee: Liang Xie
> Fix For: 2.4.0
>
> Attachments: HDFS-5637-v2.txt, HDFS-5637.txt
>
>

[jira] [Updated] (HDFS-5637) try to refeatchToken while local read InvalidToken occurred

2013-12-09 Thread Liang Xie (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-5637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Liang Xie updated HDFS-5637:


Attachment: HDFS-5637-v2.txt

> try to refeatchToken while local read InvalidToken occurred
> ---
>
> Key: HDFS-5637
> URL: https://issues.apache.org/jira/browse/HDFS-5637
> Project: Hadoop HDFS
>  Issue Type: Improvement
>  Components: hdfs-client, security
>Affects Versions: 2.0.5-alpha, 2.2.0
>Reporter: Liang Xie
>Assignee: Liang Xie
> Attachments: HDFS-5637-v2.txt, HDFS-5637.txt
>
>

[jira] [Updated] (HDFS-5637) try to refeatchToken while local read InvalidToken occurred

2013-12-05 Thread Liang Xie (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-5637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Liang Xie updated HDFS-5637:


Attachment: HDFS-5637.txt

> try to refeatchToken while local read InvalidToken occurred
> ---
>
> Key: HDFS-5637
> URL: https://issues.apache.org/jira/browse/HDFS-5637
> Project: Hadoop HDFS
>  Issue Type: Improvement
>  Components: hdfs-client, security
>Affects Versions: 2.0.5-alpha, 2.2.0
>Reporter: Liang Xie
>Assignee: Liang Xie
> Attachments: HDFS-5637.txt
>
>

[jira] [Updated] (HDFS-5637) try to refeatchToken while local read InvalidToken occurred

2013-12-05 Thread Liang Xie (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-5637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Liang Xie updated HDFS-5637:


Status: Patch Available  (was: Open)

> try to refeatchToken while local read InvalidToken occurred
> ---
>
> Key: HDFS-5637
> URL: https://issues.apache.org/jira/browse/HDFS-5637
> Project: Hadoop HDFS
>  Issue Type: Improvement
>  Components: hdfs-client, security
>Affects Versions: 2.2.0, 2.0.5-alpha
>Reporter: Liang Xie
>Assignee: Liang Xie
> Attachments: HDFS-5637.txt
>
>