dombizita commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r708367878



##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+*** Settings ***
+Documentation       HttpFS gateway test with curl commands
+Library             Process
+Library             String
+Library             BuiltIn
+Resource            operations.robot
+
+*** Variables ***
+${URL}                         http://httpfs:14000/webhdfs/v1/
+${USERNAME}                    hdfs
+
+*** Test Cases ***
+Create volume
+    ${volume} =     Execute curl command    vol1    MKDIRS      -X PUT
+    Should contain  ${volume}   true
+
+Create first bucket
+    ${bucket} =     Execute curl command    vol1/buck1          MKDIRS      -X PUT
+    Should contain  ${bucket}   true
+
+Create second bucket
+    ${bucket} =     Execute curl command    vol1/buck2          MKDIRS      -X PUT
+    Should contain  ${bucket}   true
+
+Create local testfile
+    Create file       testfile
+
+Create testfile
+    ${file} =       Execute create file command     vol1/buck1/testfile     testfile
+    Should contain     ${file}     http://httpfs:14000/webhdfs/v1/vol1/buck1/testfile
+
+Read file
+    ${file} =       Execute curl command    vol1/buck1/testfile     OPEN    -L
+    Should contain     ${file}     Hello world!
+
+Delete bucket
+    ${bucket} =     Execute curl command    vol1/buck2          DELETE      -X DELETE
+    Should contain  ${bucket}   true
+
+Get status of bucket
+    ${status} =     Execute curl command    vol1/buck1          GETFILESTATUS      ${EMPTY}
+    Should contain  ${status}   FileStatus  DIRECTORY
+
+Get status of file
+    ${status} =     Execute curl command    vol1/buck1/testfile          GETFILESTATUS      ${EMPTY}
+    Should contain  ${status}   FileStatus  FILE    13
+
+List bucket
+    ${list} =       Execute curl command    vol1/buck1          LISTSTATUS      ${EMPTY}
+    Should contain  ${list}     FileStatus  testfile    FILE    13
+
+List file
+    ${list} =       Execute curl command    vol1/buck1/testfile          LISTSTATUS      ${EMPTY}
+    Should contain  ${list}     FileStatus  FILE    13
+
+List directory iteratively

Review comment:
       I moved this test case before the deletion of buck2 and kept buck1 as the start entry. In that case it should list only buck2, because, as I understood from the documentation example (https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#Iteratively_List_a_Directory), the start entry itself should not appear in the result. Unfortunately it doesn't work properly: both buckets were listed. Because of this missing functionality I commented out the test case. Thank you for your comment!
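For reference, the iterative-listing call discussed above looks roughly like the sketch below. The host, port, volume, and bucket names mirror this test suite's variables; `LISTSTATUS_BATCH` and `startAfter` are the operation and parameter shown in the linked WebHDFS documentation. This only builds and prints the request URLs; uncomment the curl line to run it against a live HttpFS gateway.

```shell
# Base URL matching the suite's ${URL} variable (assumption: same compose setup).
BASE="http://httpfs:14000/webhdfs/v1"

# First page: list /vol1 in batches.
REQ1="${BASE}/vol1?op=LISTSTATUS_BATCH&user.name=hdfs"

# Follow-up page: resume after buck1; per the WebHDFS docs the
# startAfter entry itself should be excluded from this batch.
REQ2="${BASE}/vol1?op=LISTSTATUS_BATCH&startAfter=buck1&user.name=hdfs"

echo "$REQ1"
echo "$REQ2"
# curl -i "$REQ2"   # uncomment against a running HttpFS gateway
```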




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


