Hi Aleks - firstly, many thanks for your excellent work! The list of improvements you’ve put forward is very impressive and will be a huge help in speeding things up and improving the developer experience for Fineract. Great job!
On this testing one specifically, I think we should be careful to ensure the tests pass consistently before merging this. There’s been quite a lot of work to reduce the “flakiness” of the existing tests by fixing the ordering, etc., and I think we are in a much better position now than a year ago - typically the integration tests will now pass without too much pain.

Perhaps the best way forward would be to have this running in parallel for a while - i.e. incorporate the new way of running tests as a separate workflow whilst keeping the existing one running as well. We could then compare the failure rates to make sure the new workflow is at least as stable before decommissioning the existing one.

I think the same applies to performance… I hope this will improve testing performance, but currently (with the ‘old method’) the integration tests typically complete in around 1 hour 15 minutes… so if this now takes up to 2 hours, we should see if there are further ways to improve.

Thanks again for all your great work!

Cheers,
Petri

> On 4 Jan 2022, at 17:35, Aleksandar Vidakovic <[email protected]> wrote:
>
> Hi everyone,
>
> ... on GitHub Actions the tests ran in just under 2 hours, but with 63
> failures (I removed the "--fail-fast" flag for now to get an idea of how
> long this takes overall).
>
> Some of the tests seem to depend implicitly on another test to provide
> some data, so that the other test needs to run first. Some other tests do
> not seem to be thread safe.
>
> A big chunk of the current failures are quite trivial (all of the image,
> import and document management tests). They fail because they expect some
> local files to be available (read: in the Docker container they run in),
> but those files are extracted from the test sources outside the
> container... not a big surprise that these files are not visible to the
> Fineract instance. Easy fix.
>
> And then finally, the 2FA and OAuth2 tests are failing because I thought I
> could run them together with the other tests... when I execute them
> separately, things work just fine... so I'll probably just move them back
> to their current separate modules.
>
> A quick count shows that 44 of the currently failing tests are easy to
> fix, which brings the success rate closer to 99%.
>
> After this work is done we should migrate those tests as fast as possible
> to Cucumber/BDD tests that don't need a full-blown Fineract instance.
> That type of testing should be much faster.
>
> Just FYI.
>
> Cheers,
>
> Aleks
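P.S. The "compare the failure rates" step could be sketched roughly like this - a minimal, hypothetical example only: the log file names and the one-result-per-line format are assumptions for illustration, not anything the actual workflows produce today.

```shell
# Hypothetical sketch: tally failures per workflow from simple
# run-result logs (one "success"/"failure" entry per line).
# Sample data, purely for illustration:
printf 'success\nfailure\nsuccess\nsuccess\n' > old_workflow_runs.txt
printf 'success\nsuccess\nfailure\nfailure\n' > new_workflow_runs.txt

for f in old_workflow_runs.txt new_workflow_runs.txt; do
  total=$(grep -c '' "$f")          # total recorded runs (line count)
  fails=$(grep -c '^failure$' "$f") # runs that ended in failure
  echo "$f: $fails/$total runs failed"
done
# Prints:
#   old_workflow_runs.txt: 1/4 runs failed
#   new_workflow_runs.txt: 2/4 runs failed
```

In practice the inputs would come from the CI system rather than hand-written files, but the comparison itself stays this simple.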
