Vanilla Hadoop 0.20.203.0 on Ubuntu with JDK 1.6.0. I don't know exactly how many failures there were, but there were a lot. Just from watching the output of ant, I'd guess that about a quarter of the unit tests were failing.
I'm rerunning with "ant clean test" right now. Here's a snippet from the end of the org.apache.hadoop.cli.TestCLI log file:

2011-09-08 14:56:24,413 WARN datanode.DataNode (DataNode.java:makeInstance(1475)) - Invalid directory in dfs.data.dir: Incorrect permission for /atlas/spock/bmcneill/hadoop-deployment/hadoop/hadoop-0.20.203.0/build/test/data/dfs/data/data1, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-09-08 14:56:24,430 WARN datanode.DataNode (DataNode.java:makeInstance(1475)) - Invalid directory in dfs.data.dir: Incorrect permission for /atlas/spock/bmcneill/hadoop-deployment/hadoop/hadoop-0.20.203.0/build/test/data/dfs/data/data2, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-09-08 14:56:24,430 ERROR datanode.DataNode (DataNode.java:makeInstance(1481)) - All directories in dfs.data.dir are invalid.

Maybe there's some umask weirdness? I'm not sure if all the errors are of this nature.

PS. Today my replies to this list keep getting bounced as spam. I'm not sure why.
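PPS. On the umask theory above: a umask of 002 leaves newly created directories group-writable (rwxrwxr-x), which is exactly what the DataNode is rejecting, while 022 gives the rwxr-xr-x it expects. I haven't verified that this is what's going on here, but I'm planning to try something like the following before the next run:

    umask             # if this prints 0002, that would explain the rwxrwxr-x dirs
    umask 022         # new directories then come out rwxr-xr-x
    ant clean test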
