Hello Joao,

Thanks for the review and suggestions.
On Fri, Apr 7, 2017 at 9:08 PM, Joao Pedro De Almeida Pereira <jdealmeidapere...@pivotal.io> wrote:

> Hello Hackers,
> We just looked into this patch and have some questions about it.
>
> - Where is the JSON file created? Do we need to call it with some special
> argument in order for it to be created?

This file is created under the /regression/ directory (runtests.py, line 401). No special argument is needed.

> - There is a print statement in the line 275 of runtests.py is it supposed
> to be there?

Removed.

> - The function name addSuccess does not match the styling of the code, it
> should be add_success

Done.

> Suggestions:
> - The definition of the class_name (run_tests.py, line 229) variable looks
> the same as in the if statements below and could be extracted into a
> function to avoid repeating the same code.

Done.

> - In the same function when we are updating error/failure/skip test
> results the code looks pretty similar and can also be extracted into a
> function

Done.

@Dave, please find the attached patch with the necessary code changes. I have also added missing scenario names to some test cases. Thanks.

> Thanks,
> Joao & Sarah
>
> On Fri, Apr 7, 2017 at 10:15 AM, Navnath Gadakh
> <navnath.gad...@enterprisedb.com> wrote:
>
>> Hi Dave,
>> <Ignore previous email>
>>
>> Please find the revised patch for the test result enhancement, which
>> includes code to write passed test cases into the JSON file along with
>> skipped and failed ones.
>>
>> On Tue, Apr 4, 2017 at 11:30 AM, Navnath Gadakh
>> <navnath.gad...@enterprisedb.com> wrote:
>>
>>> Hi Dave,
>>>
>>> Please find the revised patch for the test result enhancement.
>>>
>>> What's in the patch:
>>> 1. The test result summary is stored in a JSON file.
>>> 2. Removed some redundant code from regression/test_utils.py
>>> 3. Added the scenario names for feature tests.
>>> 4.
>>> To print test scenario names in failed and skipped test cases, I
>>> override the apply_scenario() function in regression/test_utils.py
>>>
>>> On Mon, Apr 3, 2017 at 12:32 PM, Navnath Gadakh
>>> <navnath.gad...@enterprisedb.com> wrote:
>>>
>>>> Hi Akshay,
>>>>
>>>> Please find the revised patch for the test result enhancement.
>>>>
>>>> What's in the patch:
>>>> 1. The test result summary is stored in a JSON file.
>>>> 2. Removed some redundant code from regression/test_utils.py
>>>> 3. Added the scenario names for feature tests.
>>>> 4. To print test scenario names in failed and skipped test cases, I
>>>> override the apply_scenario() function in regression/test_utils.py
>>>>
>>>> On Fri, Mar 31, 2017 at 6:16 PM, Akshay Joshi
>>>> <akshay.jo...@enterprisedb.com> wrote:
>>>>
>>>>> Hi Navnath
>>>>>
>>>>> I have run the updated patch. It is working fine with Python 2.7, but I
>>>>> am facing the following error with Python 3.5; can you please look into it:
>>>>>
>>>>> ======================================================================
>>>>> ERROR: runTest (pgadmin.feature_tests.connect_to_server_feature_test.
>>>>> ConnectsToServerFeatureTest)
>>>>> Test database connection which can be created from the UI
>>>>> ----------------------------------------------------------------------
>>>>> Traceback (most recent call last):
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/base_feature_test.py", line 33, in setUp
>>>>>     self.page.reset_layout()
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 33, in reset_layout
>>>>>     self.click_modal_ok()
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 38, in click_modal_ok
>>>>>     self.click_element(self.find_by_xpath("//button[contains(.,'OK')]"))
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 71, in find_by_xpath
>>>>>     return self.wait_for_element(lambda driver: driver.find_element_by_xpath(xpath))
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 128, in wait_for_element
>>>>>     return self._wait_for("element to exist", element_if_it_exists)
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 162, in _wait_for
>>>>>     "Timed out waiting for " + waiting_for_message)
>>>>>   File "/Users/akshay/Development/Workspace/lib/python3.5/site-packages/selenium/webdriver/support/wait.py", line 80, in until
>>>>>     raise TimeoutException(message, screen, stacktrace)
>>>>> selenium.common.exceptions.TimeoutException: Message: Timed out
>>>>> waiting for element to exist
>>>>>
>>>>> ======================================================================
>>>>> ERROR: runTest (pgadmin.feature_tests.table_ddl_feature_test.
>>>>> TableDdlFeatureTest)
>>>>> Test scenarios for acceptance tests
>>>>> ----------------------------------------------------------------------
>>>>> Traceback (most recent call last):
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/base_feature_test.py", line 33, in setUp
>>>>>     self.page.reset_layout()
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 31, in reset_layout
>>>>>     self.click_element(self.find_by_partial_link_text("File"))
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 90, in click_element
>>>>>     return self._wait_for("clicking the element not to throw an exception", click_succeeded)
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/feature_utils/pgadmin_page.py", line 162, in _wait_for
>>>>>     "Timed out waiting for " + waiting_for_message)
>>>>>   File "/Users/akshay/Development/Workspace/lib/python3.5/site-packages/selenium/webdriver/support/wait.py", line 80, in until
>>>>>     raise TimeoutException(message, screen, stacktrace)
>>>>> selenium.common.exceptions.TimeoutException: Message: Timed out
>>>>> waiting for clicking the element not to throw an exception
>>>>>
>>>>> ----------------------------------------------------------------------
>>>>> Ran 153 tests in 45.493s
>>>>>
>>>>> FAILED (errors=2, skipped=16)
>>>>>
>>>>> ======================================================================
>>>>> Test Result Summary
>>>>> ======================================================================
>>>>>
>>>>> Traceback (most recent call last):
>>>>>   File "runtests.py", line 354, in <module>
>>>>>     skipped_cases)
>>>>>   File "/Users/akshay/Development/pgadmin4/web/regression/python_test_utils/test_utils.py", line 442, in
>>>>> get_scenario_name
>>>>>     key, value = case_name_dict.items()[0]
>>>>> TypeError: 'dict_items' object does not support indexing
>>>>> Resolved.
>>>>
>>>> Thanks!
>>>>
>>>>> On Thu, Mar 30, 2017 at 8:04 PM, Navnath Gadakh
>>>>> <navnath.gad...@enterprisedb.com> wrote:
>>>>>
>>>>>> Hi Dave,
>>>>>>
>>>>>> Please find the revised patch for the test result enhancement.
>>>>>>
>>>>>> What's in the patch:
>>>>>> 1. The test result summary is stored in a JSON file.
>>>>>> 2. Removed some redundant code from regression/test_utils.py
>>>>>> 3. Added the scenario names for feature tests.
>>>>>> 4. To print test scenario names in failed and skipped test cases, I
>>>>>> override the apply_scenario() function in regression/test_utils.py
>>>>>>
>>>>>> I have also attached the sample JSON file with the test result as per
>>>>>> your suggestions.
>>>>>>
>>>>>> Thanks!
>>>>>>
>>>>>> On Wed, Mar 29, 2017 at 6:03 PM, Dave Page <dp...@pgadmin.org> wrote:
>>>>>>
>>>>>>> On Wed, Mar 29, 2017 at 4:12 AM, Navnath Gadakh
>>>>>>> <navnath.gad...@enterprisedb.com> wrote:
>>>>>>> > Hi,
>>>>>>> >
>>>>>>> > On Mon, Mar 27, 2017 at 5:37 PM, Dave Page <dp...@pgadmin.org> wrote:
>>>>>>> >>
>>>>>>> >> Hi
>>>>>>> >>
>>>>>>> >> On Mon, Mar 27, 2017 at 12:15 AM, Navnath Gadakh
>>>>>>> >> <navnath.gad...@enterprisedb.com> wrote:
>>>>>>> >> > Hello Dave,
>>>>>>> >> >
>>>>>>> >> > On Fri, Mar 24, 2017 at 9:10 PM, Dave Page <dp...@pgadmin.org> wrote:
>>>>>>> >> >>
>>>>>>> >> >> Hi
>>>>>>> >> >>
>>>>>>> >> >> On Fri, Mar 24, 2017 at 3:13 PM, Navnath Gadakh
>>>>>>> >> >> <navnath.gad...@enterprisedb.com> wrote:
>>>>>>> >> >> >
>>>>>>> >> >> >> When running with the patch:
>>>>>>> >> >> >>
>>>>>>> >> >> >> 1) The browser isn't closed, and the script never exits -
>>>>>>> >> >> >> it just sits indefinitely at:
>>>>>>> >> >> >>
>>>>>>> >> >> >> =====
>>>>>>> >> >> >> Please check output in file:
>>>>>>> >> >> >> /Users/dpage/git/pgadmin4/web/regression/regression.log
>>>>>>> >> >> >>
>>>>>>> >> >> >> make: *** [check] Error 1
>>>>>>> >> >> >> =====
>>>>>>> >> >> >>
>>>>>>> >> >> >> without returning to a shell prompt. The browser exits
>>>>>>> >> >> >> when I hit Ctrl+C.
>>>>>>> >> >>
>>>>>>> >> >> The above is still a problem. In fact, not only do I have to
>>>>>>> >> >> hit Ctrl+C, but then the browser prompts me to check I really
>>>>>>> >> >> do want to exit.
>>>>>>> >> >>
>>>>>>> >> >> There's also another problem that just showed up. I got the
>>>>>>> >> >> following failure on PG 9.4 (due to a known intermittent bug
>>>>>>> >> >> that Ashesh and Tira@Pivotal are working on). Note how it's
>>>>>>> >> >> not reported in the summary (or the JSON output):
>>>>>>> >> >
>>>>>>> >> > I found the issue: in the feature tests we need to add a
>>>>>>> >> > scenario name for each test case. The purpose of this patch is
>>>>>>> >> > to print the failed/skipped test class name with the scenario
>>>>>>> >> > name, like:
>>>>>>> >> >
>>>>>>> >> > 152 tests passed
>>>>>>> >> >
>>>>>>> >> > 1 test failed:
>>>>>>> >> > LoginRoleGetTestCase (Check Role Node)
>>>>>>> >> >
>>>>>>> >> > 16 tests skipped:
>>>>>>> >> > SynonymGetTestCase (Fetch synonym Node URL)
>>>>>>> >> >
>>>>>>> >> > But our in-built test framework does not provide that scenario
>>>>>>> >> > name with failed/skipped test cases; that's why I override the
>>>>>>> >> > apply_scenario() function:
>>>>>>> >> >
>>>>>>> >> > def apply_scenario(scenario, test):
>>>>>>> >> >     name, parameters = scenario
>>>>>>> >> >     parameters["scenario_name"] = name
>>>>>>> >> >
>>>>>>> >> > While printing the result, I check if 'scenario_name' is in the
>>>>>>> >> > test, as we need to print the scenario name in the test summary
>>>>>>> >> > as well as in the JSON file.
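[Editor's note: the override quoted above can be fleshed out into a runnable sketch. The cloning behaviour mimics what the testscenarios helper does when applying scenario parameters; this is an illustration based on the thread, not the patch's actual code.]

```python
import copy

# Sketch of the apply_scenario() override discussed above: apply the
# scenario's parameters to a clone of the test, and additionally record
# the scenario name so the result summary can print lines like
# "LoginRoleGetTestCase (Check Role Node)".
def apply_scenario(scenario, test):
    name, parameters = scenario
    # Keep the name so failed/skipped cases can be labelled with it.
    parameters["scenario_name"] = name
    new_test = copy.copy(test)
    for key, value in parameters.items():
        setattr(new_test, key, value)
    return new_test
```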
>>>>>>> >> >
>>>>>>> >> > I can do it without the scenario name, but to make it easier
>>>>>>> >> > to see which test scenario failed, it is better to add a
>>>>>>> >> > scenario name to each test case.
>>>>>>> >>
>>>>>>> >> OK.
>>>>>>> >>
>>>>>>> >> > This is how test cases look when printed on the console.
>>>>>>> >> >
>>>>>>> >> > API:
>>>>>>> >> >
>>>>>>> >> > runTest
>>>>>>> >> > (pgadmin.browser.server_groups.servers.databases.schemas.types.tests.test_types_put.TypesUpdateTestCase)
>>>>>>> >> > Update type under schema node ... ok
>>>>>>> >> >
>>>>>>> >> > Feature tests:
>>>>>>> >> >
>>>>>>> >> > runTest
>>>>>>> >> > (pgadmin.utils.tests.test_versioned_template_loader.TestVersionedTemplateLoader)
>>>>>>> >> > ... ok
>>>>>>> >> >
>>>>>>> >> > No scenario name in feature tests.
>>>>>>> >>
>>>>>>> >> OK, is that easy to fix while you're at it?
>>>>>>> >
>>>>>>> > I have two solutions:
>>>>>>> >
>>>>>>> > 1. A little hack to skip the scenario/test name if it does not
>>>>>>> > exist, but that's not the best idea.
>>>>>>> >
>>>>>>> > 2. The owner of the feature tests should add a scenario/test name
>>>>>>> > to each feature test. That way the summary also tells us which
>>>>>>> > scenario is failing or being skipped.
>>>>>>> > This is the ideal, long-term solution, and I prefer it.
>>>>>>>
>>>>>>> Agreed - and as there are only 2 feature tests, you should be able
>>>>>>> to fix them up pretty quickly :-p
>>>>>>>
>>>>>>> Once code is in the repo, it's "ours", meaning the entire
>>>>>>> community's. I wouldn't expect us to ping all issues back to
>>>>>>> Pivotal - we're one team on this.
>>>>>>>
>>>>>>> Thanks!
>>>>>>>
>>>>>>> --
>>>>>>> Dave Page
>>>>>>> Blog: http://pgsnake.blogspot.com
>>>>>>> Twitter: @pgsnake
>>>>>>>
>>>>>>> EnterpriseDB UK: http://www.enterprisedb.com
>>>>>>> The Enterprise PostgreSQL Company
>>>>>
>>>>> --
>>>>> Akshay Joshi
>>>>> Principal Software Engineer
>>>>>
>>>>> Phone: +91 20-3058-9517
>>>>> Mobile: +91 976-788-8246

--
Regards,
Navnath Gadakh

EnterpriseDB Corporation
The Enterprise PostgreSQL Company
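[Editor's note: the Python 3.5 TypeError quoted earlier ("'dict_items' object does not support indexing") occurs because dict.items() returns a view object in Python 3 rather than a list. A minimal fix that works on both Python 2 and 3 is to wrap the view in list(); the dict contents below are illustrative, only the variable name comes from the traceback.]

```python
# case_name_dict maps a test class name to its scenario name, as in the
# traceback above; the entry here is just an example.
case_name_dict = {"LoginRoleGetTestCase": "Check Role Node"}

# Python 2 only:   key, value = case_name_dict.items()[0]
# Python 2 and 3:  wrap the items view in list() before indexing.
key, value = list(case_name_dict.items())[0]
```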
tests_result_enhancement_v5.patch
Description: Binary data
--
Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgadmin-hackers