I just noticed today that I could not run any test that starts a MiniDFSCluster.
The exception I got was this:
java.lang.NullPointerException
        at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:422)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:280)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:350)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:519)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:475)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:462)
In the logs I had:
2011-10-27 14:17:48,238 WARN [main] datanode.DataNode(1540): Invalid directory in dfs.data.dir: Incorrect permission for /home/lars/dev/hbase-trunk/target/test-data/8f8d2437-1d9a-42fa-b7c3-c154d8e559f3/dfscluster_557b48bc-9c8e-4a47-b74e-4c0167710237/dfs/data/data1, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-10-27 14:17:48,260 WARN [main] datanode.DataNode(1540): Invalid directory in dfs.data.dir: Incorrect permission for /home/lars/dev/hbase-trunk/target/test-data/8f8d2437-1d9a-42fa-b7c3-c154d8e559f3/dfscluster_557b48bc-9c8e-4a47-b74e-4c0167710237/dfs/data/data2, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-10-27 14:17:48,261 ERROR [main] datanode.DataNode(1546): All directories in dfs.data.dir are invalid.
And indeed I see this in
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(...):
FsPermission dataDirPermission =
    new FsPermission(conf.get(DATA_DIR_PERMISSION_KEY,
                              DEFAULT_DATA_DIR_PERMISSION));
for (String dir : dataDirs) {
  try {
    DiskChecker.checkDir(localFS, new Path(dir), dataDirPermission);
    dirs.add(new File(dir));
  } catch (IOException e) {
    LOG.warn("Invalid directory in " + DATA_DIR_KEY + ": "
        + e.getMessage());
  }
}
(where DEFAULT_DATA_DIR_PERMISSION is 755)
The default umask on my machine is 0002, so new directories are created with mode 0777 & ~0002 = 0775 (rwxrwxr-x) instead of the expected 0755 (rwxr-xr-x). That would seem to explain the discrepancy.
Changing my umask to 0022 fixed the problem!
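For anyone curious, the effect is easy to reproduce outside of Hadoop. The paths below are just for illustration:

```shell
# Quick demonstration of how the process umask determines the mode bits
# of freshly created directories (mkdir asks for 0777, umask subtracts).
rm -rf /tmp/umask-demo
( umask 0002; mkdir -p /tmp/umask-demo/data-0002 )
( umask 0022; mkdir -p /tmp/umask-demo/data-0022 )
stat -c '%A %n' /tmp/umask-demo/data-0002   # drwxrwxr-x  (what DiskChecker rejected)
stat -c '%A %n' /tmp/umask-demo/data-0022   # drwxr-xr-x  (the expected 755)
```

Since the forked test JVM inherits the umask from whatever shell launched it, any directory MiniDFSCluster creates under target/test-data picks up the same group-write bit.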
I cannot be the only one seeing this. This is just a heads-up for anyone who runs into it; I wasted over an hour on this.
I assume this is due to the switch to Hadoop 0.20.205.
As I am fairly ignorant about Maven: is there a way to set the default umask automatically for the test processes?
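I don't know of a Maven or Surefire setting for the umask itself, since the forked JVM simply inherits it from the launching shell. But given that the DataNode snippet above reads the permission via conf.get(DATA_DIR_PERMISSION_KEY, ...), a possible workaround (an assumption on my part, not something I have verified against 0.20.205) would be to override that property in a test-scoped hdfs-site.xml so that 775 is accepted:

```xml
<!-- hdfs-site.xml on the test classpath; the key name here is my reading
     of DATA_DIR_PERMISSION_KEY in the DataNode source -->
<property>
  <name>dfs.datanode.data.dir.perm</name>
  <value>775</value>
</property>
```

If the key is right, the same override should also work programmatically on the Configuration passed to HBaseTestingUtility before starting the mini cluster.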
-- Lars