-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/18364/#review35172
-----------------------------------------------------------
Ship it!

Ship It!

- Dmitro Lisnichenko


On Feb. 21, 2014, 6:28 p.m., Vitalyi Brodetskyi wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/18364/
> -----------------------------------------------------------
> 
> (Updated Feb. 21, 2014, 6:28 p.m.)
> 
> 
> Review request for Ambari, Dmitro Lisnichenko and Sid Wagle.
> 
> 
> Bugs: AMBARI-4787
>     https://issues.apache.org/jira/browse/AMBARI-4787
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> The directory containing the file defined by dfs.domain.socket.path must grant the execute (+x) permission to other users.
> 
> <property>
>   <name>dfs.domain.socket.path</name>
>   <value>/var/lib/hadoop-hdfs/dn_socket</value>
> </property>
> 
> Currently, in an Ambari-installed cluster, /var/lib/hadoop-hdfs does not grant the +x permission to other users:
> {code}
> [root@ambari-sec-1392876050-hdfs-re-8 ~]# stat /var/lib/hadoop-hdfs/
>   File: `/var/lib/hadoop-hdfs/'
>   Size: 4096      Blocks: 8          IO Block: 4096   directory
> Device: 803h/2051d  Inode: 1182008     Links: 3
> Access: (0750/drwxr-x---)  Uid: ( 1005/    hdfs)   Gid: (  500/  hadoop)
> Access: 2014-02-18 18:10:35.000000000 -0800
> Modify: 2014-02-20 07:50:55.274766162 -0800
> Change: 2014-02-20 07:50:55.274766162 -0800
> {code}
> 
> Because of this, hadoop commands print the following WARN messages:
> {code}
> 2014-02-18 05:54:32,734|beaver.machine|INFO|RUNNING: /usr/bin/hdfs dfs -tail /user/hrt_qa/hdfsRegressionData/smallFiles/smallRDFile755
> 2014-02-18 05:54:35,528|beaver.machine|INFO|14/02/18 05:54:35 WARN hdfs.BlockReaderLocal: error creating DomainSocket
> 2014-02-18 05:54:35,528|beaver.machine|INFO|java.net.ConnectException: connect(2) error: Permission denied when trying to connect to '/var/lib/hadoop-hdfs/dn_socket'
> 2014-02-18 05:54:35,528|beaver.machine|INFO|at org.apache.hadoop.net.unix.DomainSocket.connect0(Native Method)
> 2014-02-18 05:54:35,529|beaver.machine|INFO|at org.apache.hadoop.net.unix.DomainSocket.connect(DomainSocket.java:250)
> 2014-02-18 05:54:35,529|beaver.machine|INFO|at org.apache.hadoop.hdfs.DomainSocketFactory.createSocket(DomainSocketFactory.java:158)
> 2014-02-18 05:54:35,529|beaver.machine|INFO|at org.apache.hadoop.hdfs.BlockReaderFactory.nextDomainPeer(BlockReaderFactory.java:691)
> 2014-02-18 05:54:35,529|beaver.machine|INFO|at org.apache.hadoop.hdfs.BlockReaderFactory.createShortCircuitReplicaInfo(BlockReaderFactory.java:439)
> 2014-02-18 05:54:35,529|beaver.machine|INFO|at org.apache.hadoop.hdfs.client.ShortCircuitCache.create(ShortCircuitCache.java:669)
> {code}
> 
> The expected permissions on this location are as follows:
> {code}
> [root@ambari-sec-1392876050-yarn-10 ~]# stat /var/lib/hadoop-hdfs/
>   File: `/var/lib/hadoop-hdfs/'
>   Size: 4096      Blocks: 8          IO Block: 4096   directory
> Device: 803h/2051d  Inode: 1181767     Links: 3
> Access: (0751/drwxr-x--x)  Uid: ( 1005/    hdfs)   Gid: (  500/  hadoop)
> Access: 2014-02-20 18:00:06.586040913 -0800
> Modify: 2014-02-20 07:06:28.267889888 -0800
> Change: 2014-02-20 17:59:56.629052410 -0800
> {code}
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/stacks/HDP/1.3.2/services/HDFS/package/scripts/hdfs_datanode.py b033185 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HDFS/package/scripts/hdfs_datanode.py f7d9f15 
>   ambari-server/src/test/python/stacks/1.3.2/HDFS/test_datanode.py 42b9fe0 
>   ambari-server/src/test/python/stacks/2.0.6/HDFS/test_datanode.py 838c53a 
> 
> Diff: https://reviews.apache.org/r/18364/diff/
> 
> 
> Testing
> -------
> 
> ----------------------------------------------------------------------
> Ran 186 tests in 1.209s
> 
> OK
> ----------------------------------------------------------------------
> Total run: 500
> Total errors: 0
> Total failures: 0
> OK
> 
> Process finished with exit code 0
> 
> 
> Thanks,
> 
> Vitalyi Brodetskyi
> 
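
For anyone following along, here is a minimal standalone sketch of the idea behind the fix, not the actual patch: it just ensures the socket directory from the description above carries mode 0751 so "other" users get the execute bit. The function name and the main block are illustrative only; the path and mode come from the quoted description.

{code}
import os
import stat

# Directory that backs dfs.domain.socket.path (from the description above).
SOCKET_DIR = "/var/lib/hadoop-hdfs"
# Expected mode: 0751 (drwxr-x--x) so that "other" users can traverse the
# directory and reach the dn_socket file inside it.
EXPECTED_MODE = 0o751

def ensure_socket_dir_mode(path=SOCKET_DIR, mode=EXPECTED_MODE):
    """Create the directory if needed and make sure it has the expected mode."""
    if not os.path.isdir(path):
        os.makedirs(path)
    current = stat.S_IMODE(os.stat(path).st_mode)
    if current != mode:
        os.chmod(path, mode)

if __name__ == "__main__":
    ensure_socket_dir_mode()
    # Should print 0o751 after the call above.
    print(oct(stat.S_IMODE(os.stat(SOCKET_DIR).st_mode)))
{code}

In the patch itself the same effect is achieved through the stack scripts listed in the Diffs section, which manage the directory's owner, group, and mode during DataNode configuration.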
