[ https://issues.apache.org/jira/browse/HADOOP-1536?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Nigel Daley updated HADOOP-1536:
--------------------------------
Status: Patch Available (was: Open)
The tests no longer fail with this patch. Moving to patch available.
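For reference, the two failures in the output below come from the lock calls in the libhdfs test driver, which go through JNI to FileSystem.lock/release on the Java side. The following is only a minimal sketch of that kind of call sequence, not the actual hdfs_test.c source: it assumes the hdfsLock/hdfsReleaseLock wrappers declared in hdfs.h at the time, and the connection arguments and path are illustrative values.

    /* Illustrative sketch only -- not the actual hdfs_test.c source.
     * Assumes the hdfsLock/hdfsReleaseLock wrappers declared in hdfs.h
     * of this era, returning 0 on success. */
    #include <stdio.h>
    #include "hdfs.h"

    int main(void) {
        /* Connect to the filesystem named by fs.default.name. */
        hdfsFS fs = hdfsConnect("default", 0);
        if (!fs) {
            fprintf(stderr, "hdfsConnect: Failed!\n");
            return 1;
        }

        const char *path = "/tmp/testfile.txt";  /* illustrative path */

        /* Obtain a shared lock; maps to FileSystem.lock(path, shared). */
        if (hdfsLock(fs, path, 1) != 0) {
            fprintf(stderr, "hdfsLock: Failed!\n");
        }

        /* Release it again; maps to FileSystem.release(path). */
        if (hdfsReleaseLock(fs, path) != 0) {
            fprintf(stderr, "hdfsReleaseLock: Failed!\n");
        }

        hdfsDisconnect(fs);
        return 0;
    }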
> libhdfs tests failing
> ---------------------
>
> Key: HADOOP-1536
> URL: https://issues.apache.org/jira/browse/HADOOP-1536
> Project: Hadoop
> Issue Type: Bug
> Components: libhdfs
> Reporter: Nigel Daley
> Assignee: dhruba borthakur
> Priority: Blocker
> Fix For: 0.14.0
>
> Attachments: libhdfs.patch
>
>
> Starting today, 2 libhdfs tests are failing on Linux when I run
> "ant -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.native=yes package-libhdfs tar test-core test-libhdfs"
> [exec] Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: Failure when trying to obtain lock on /tmp/.testfile.txt.crc
> [exec] at org.apache.hadoop.dfs.NameNode.obtainLock(NameNode.java:441)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [exec] at java.lang.reflect.Method.invoke(Method.java:585)
> [exec] at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:340)
> [exec] at org.apache.hadoop.ipc.Server$Handler.run(Server.java:566)
> [exec] at org.apache.hadoop.ipc.Client.call(Client.java:470)
> [exec] at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:165)
> [exec] at org.apache.hadoop.dfs.$Proxy0.obtainLock(Unknown Source)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [exec] at java.lang.reflect.Method.invoke(Method.java:585)
> [exec] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> [exec] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> [exec] at org.apache.hadoop.dfs.$Proxy0.obtainLock(Unknown Source)
> [exec] at org.apache.hadoop.dfs.DFSClient.lock(DFSClient.java:478)
> [exec] at org.apache.hadoop.dfs.DistributedFileSystem$RawDistributedFileSystem.lock(DistributedFileSystem.java:195)
> [exec] at org.apache.hadoop.fs.ChecksumFileSystem.lock(ChecksumFileSystem.java:548)
> [exec] Call to org.apache.fs.FileSystem::lock failed!
> [exec] hdfsLock: Failed!
> [exec] Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: Failure when trying to release lock on /tmp/.testfile.txt.crc
> [exec] at org.apache.hadoop.dfs.NameNode.releaseLock(NameNode.java:453)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [exec] at java.lang.reflect.Method.invoke(Method.java:585)
> [exec] at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:340)
> [exec] at org.apache.hadoop.ipc.Server$Handler.run(Server.java:566)
> [exec] at org.apache.hadoop.ipc.Client.call(Client.java:470)
> [exec] at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:165)
> [exec] at org.apache.hadoop.dfs.$Proxy0.releaseLock(Unknown Source)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [exec] at java.lang.reflect.Method.invoke(Method.java:585)
> [exec] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> [exec] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> [exec] at org.apache.hadoop.dfs.$Proxy0.releaseLock(Unknown Source)
> [exec] at org.apache.hadoop.dfs.DFSClient.release(DFSClient.java:498)
> [exec] at org.apache.hadoop.dfs.DistributedFileSystem$RawDistributedFileSystem.release(DistributedFileSystem.java:200)
> [exec] at org.apache.hadoop.fs.ChecksumFileSystem.release(ChecksumFileSystem.java:561)
> [exec] Call to org.apache.hadoop.fs.FileSystem::release failed!
> [exec] hdfsReleaseLock: Failed!
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.