[ 
https://issues.apache.org/jira/browse/BIGTOP-1307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013420#comment-14013420
 ] 

Martin Bukatovic commented on BIGTOP-1307:
------------------------------------------

{quote}
Ok. But what about certain dfsadmin tests that require to be executed under 
file system super user?
{quote}

Exactly. Some test cases would fail if not executed under the HDFS superuser 
account (hdfs).

So from this perspective, it makes sense to switch the shell to the hdfs user 
like this (code from TestCLI):

{noformat}
Shell shHDFS = new Shell("/bin/bash", "hdfs")
{noformat}

but when we look deeper, we see that:

 * the shell object is used only for creating and removing the testcli 
directory ({{/tmp/testcli_TIMESTAMP}}) in HDFS, and for nothing else
 * all test cases (as described in {{testHDFSConf.xml}}) are executed directly 
via the Hadoop API ({{org.apache.hadoop.fs.FsShell}}), so every test case runs 
under the user who runs the test itself
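
The second point can be illustrated without any Hadoop dependency: a command 
invoked in-process (the way CLITestHelper invokes FsShell) always runs under 
the JVM's own user, and a sudo-wrapped setup shell cannot change that. A 
minimal sketch (class and method names are my own, purely illustrative):

```java
// Minimal, Hadoop-free sketch: anything invoked in-process (as CLITestHelper
// does with org.apache.hadoop.fs.FsShell) runs under the JVM's own user.
// A separate sudo-wrapped Shell object cannot change that identity.
public class InProcessUser {

    // Stand-in for a test command executed via an in-process API call.
    static String runTestCommand() {
        return System.getProperty("user.name");
    }

    public static void main(String[] args) {
        // No matter which user a setup shell switched to via sudo, the
        // in-process command still reports the user running the JVM:
        System.out.println("test command ran as: " + runTestCommand());
    }
}
```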

With this in mind, I don't think that switching from the hdfs user to the hdfs 
user (via sudo) is necessary. Moreover, I consider this unnecessary switching 
a little misleading.

> Some TestCLI cases fail with 'No such file or directory'
> --------------------------------------------------------
>
>                 Key: BIGTOP-1307
>                 URL: https://issues.apache.org/jira/browse/BIGTOP-1307
>             Project: Bigtop
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 0.8.0
>         Environment: HDP 2.0.6
>            Reporter: Martin Bukatovic
>            Assignee: Martin Bukatovic
>            Priority: Critical
>              Labels: test
>         Attachments: 
> 0001-BIGTOP-1307.-Do-not-switch-user-for-shell-object.patch, filter-cases.sh, 
> testcli.nosuchfile-cases.log
>
>
> I observe weird results for the XML-defined test cases of the TestCLI bigtop 
> test: 136 test cases failed because of a 'No such file or directory' error.
> To show what the problem is, see test case #1:
> {noformat}
> 14/05/15 16:50:40 INFO cli.CLITestHelper: 
> -------------------------------------------
> 14/05/15 16:50:40 INFO cli.CLITestHelper:                     Test ID: [1]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:            Test Description: [ls: 
> file using absolute path]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:
> 14/05/15 16:50:40 INFO cli.CLITestHelper:               Test Commands: [-fs 
> hdfs://dhcp-lab-203.local:8020 -touchz /tmp/testcli_1400165386646/file1]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:               Test Commands: [-fs 
> hdfs://dhcp-lab-203.local:8020 -ls /tmp/testcli_1400165386646/file1]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:
> 14/05/15 16:50:40 INFO cli.CLITestHelper:            Cleanup Commands: [-fs 
> hdfs://dhcp-lab-203.local:8020 -rm /tmp/testcli_1400165386646/file1]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:
> 14/05/15 16:50:40 INFO cli.CLITestHelper:                  Comparator: 
> [TokenComparator]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:          Comparision result:   
> [fail]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:             Expected output:   
> [Found 1 items]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:               Actual output:   [ls: 
> `/tmp/testcli_1400165386646/file1': No such file or directory
> ]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:                  Comparator: 
> [RegexpComparator]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:          Comparision result:   
> [fail]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:             Expected output:   
> [^-rw-r--r--( )*1( )*[a-z]*( )*hdfs( )*0( )*[0-9]{4,}-[0-9]{2,}-[0-9]{2,} 
> [0-9]{2,}:[0-9]{2,}( )*/tmp/testcli_1400165386646/file1]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:               Actual output:   [ls: 
> `/tmp/testcli_1400165386646/file1': No such file or directory
> ]
> 14/05/15 16:50:40 INFO cli.CLITestHelper:
> 14/05/15 16:50:40 INFO cli.CLITestHelper: 
> -------------------------------------------
> {noformat}
> The results look as if there were something wrong with hadoop/hdfs.
> Nevertheless, when I checked this particular case manually, it worked just 
> fine:
> {noformat}
> [bigtop@dhcp-lab-203 testcli]$ hadoop fs -mkdir /tmp/testcli_1400165386646
> [bigtop@dhcp-lab-203 testcli]$ hadoop fs -fs hdfs://dhcp-lab-203.local:8020 
> -touchz /tmp/testcli_1400165386646/file1
> [bigtop@dhcp-lab-203 testcli]$ hadoop fs -fs hdfs://dhcp-lab-203.local:8020 
> -ls /tmp/testcli_1400165386646/file1
> Found 1 items
> -rw-r--r--   3 bigtop hdfs          0 2014-05-15 17:08 
> /tmp/testcli_1400165386646/file1
> [bigtop@dhcp-lab-203 testcli]$ hadoop fs -fs hdfs://dhcp-lab-203.local:8020 
> -rm /tmp/testcli_1400165386646/file1
> 14/05/15 17:08:27 INFO fs.TrashPolicyDefault: Namenode trash configuration: 
> Deletion interval = 21600000 minutes, Emptier interval = 0 minutes.
> Moved: 'hdfs://dhcp-lab-203.local:8020/tmp/testcli_1400165386646/file1' to 
> trash at: hdfs://dhcp-lab-203.local:8020/user/bigtop/.Trash/Current
> [bigtop@dhcp-lab-203 testcli]$
> {noformat}
> I manually checked 5 other cases with the same result: when the test case is
> done manually, it works without any problems.
> Moreover, I reran all TestCLI cases 5 times, and the set of failed cases
> was always the same.
> Has anybody seen similar behaviour? I executed the TestCLI cases via a wrapper
> which sets the system classpath instead of the Maven-defined environment. Could
> this have caused the issue, or is it more likely that the problem is in the
> bigtop tests? Also feel free to propose a way to debug this further.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
