Hi: I've mounted my own ext4 disk on /mnt/sdb and chmodded it to 777.
However, when starting the data node with /etc/init.d/hadoop-hdfs-datanode start, I get the following error in my logs (bottom of this message).

*** *** *** *** *** *** *** *** *** *** *** *** *** *** ***

What is the EPERM error caused by, and how can I reproduce it? I'm assuming that, since the directory permissions are recursively set to 777, there shouldn't be any way this error could occur, unless at some point along the way the directory permissions are being changed by hdfs to the wrong thing.

*** *** *** *** *** *** *** *** *** *** *** *** *** *** ***

2013-07-06 15:54:13,968 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /mnt/sdb/hadoop-hdfs/cache/hdfs/dfs/data :
EPERM: Operation not permitted
        at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:605)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:439)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:138)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:154)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getDataDirsFromURIs(DataNode.java:1659)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1638)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1575)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1598)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1751)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1772)

--
Jay Vyas
http://jayunit100.blogspot.com
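For what it's worth, the EPERM from NativeIO.chmod can be reproduced outside of Hadoop: chmod(2) requires that the caller *own* the file (or have root privilege); the file's mode bits are irrelevant, so even a 777 directory tree will fail chmod if it is owned by a different user (e.g. the tree is root-owned while the datanode runs as hdfs). A minimal sketch, assuming a standard Linux box where /etc/passwd exists and is root-owned:

```python
import errno
import os
import tempfile

def try_chmod(path, mode):
    """Return None if chmod succeeds, else the errno it failed with."""
    try:
        os.chmod(path, mode)
        return None
    except PermissionError as e:
        return e.errno

# A file we own: chmod always succeeds, even starting from mode 000.
# The mode bits on the file play no role in whether chmod is allowed.
fd, own_file = tempfile.mkstemp()
os.close(fd)
os.chmod(own_file, 0o000)
owner_result = try_chmod(own_file, 0o777)   # None: the owner may chmod
os.remove(own_file)

# A root-owned file: chmod fails with EPERM for a non-owner, no matter
# how permissive the file's mode is. This mirrors what NativeIO.chmod
# hits if the dfs.datanode.data.dir tree is owned by root while the
# datanode process runs as the hdfs user.
if os.geteuid() != 0:
    non_owner_result = try_chmod("/etc/passwd", 0o644)
else:
    non_owner_result = errno.EPERM          # running as root: skip demo
```

So the thing to check is ownership, not permissions: the chmod 777 being recursive doesn't help if the directories under /mnt/sdb were created by root and never chowned to the hdfs user.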