On Thu, Jan 11, 2018 at 8:31 AM, Peter kovacs <[email protected]> wrote:
> No you have not.
> Look at the numbers. We have a team of 2 + myself to build test cases and
> a test concept.
>
> Without automatisation of user based test this will be very slow.
>
> So yes please, go through the technical test we have and if you could
> setup a list in one wiki this is wonderful. I would very much like to end
> up with one big test scenario. Not devise technical and user focus for
> blaming reasons. This is not necessary.
>

I will certainly do what I can to retrieve these reports from Googletest
and make them readable by everyone.

> I will sign on QA soon and try help to work on creating test cases.
>

The old test cases are still available on AOO's Testlink. See:

http://aootesting.adfinis-sygroup.org/

But navigating around this can be a bit confusing. I actually was an admin
on this at one point, but today my old account did not work and I needed
to reapply and was given "guest" access. What this means is that we don't
seem to have an "admin" to give folks permissions to upload new cases to
run, or to do any other admin-type functions on this. No idea who the
super-admin is for this -- maybe none?

In any case, the old test cases might be a good starting point, in
addition to the information on how to run BVT tests etc. as described on:

https://wiki.openoffice.org/wiki/QA/Testlink
and
https://wiki.openoffice.org/wiki/QA

> Btw is it possible to get tooling for this?
>

If you mean tooling for dealing with the output from the builds, I think
we can do quite a lot within the build scripts, but I am not sure about
porting info to a host outside the build host's environment. We could try
and see what happens.

> I mean of apache will
>
> All the best
> Peter
>
> Am 10. Januar 2018 17:43:00 MEZ schrieb Kay Schenk <[email protected]>:
> >On Tue, Jan 9, 2018 at 1:46 PM, Andrea Pescetti <[email protected]>
> >wrote:
> >
> >> Kay Schenk wrote:
> >>
> >>> I know we have a number of Google tests incorporated currently.
> >>> I don't know much about how to construct these, but I'd like to help
> >>> as well.
> >>
> >> Sure, this would be very nice to have!
> >>
> >> But the discussion here (especially on the QA list) is mostly focused
> >> on manual testing by humans. So writing simple testcases down for the
> >> typical use of Writer, Calc and so on and then coordinating to execute
> >> them and share results.
> >>
> >> Let's keep the discussion about automated testing for the dev list;
> >> indeed Damjan already enabled some them in trunk and it would be good
> >> to have more. But this is a matter for developers, while QA volunteers
> >> will stay focused on the manual part.
> >
> >OK. I may have misconstrued Peter's initial comments about keeping the
> >"testing plan" on both the QA and DEV lists. In any case, I have found
> >the XML output from gtest buried in my build output, and will get back
> >to DEV on how we can use this.
> >
> >>> Any thoughts on this? I would be happy to investigate.
> >>
> >> It would be wonderful to have automated coverage for the 4.1.4
> >> regressions to start with. This way we can guarantee that those bugs
> >> won't occur again in next releases.
> >>
> >> Regards,
> >> Andrea.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe, e-mail: [email protected]
> >> For additional commands, e-mail: [email protected]
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
> --
> ----------------------------------------------------------------------
> MzK
>
> "Ring out the false, ring in the true."
>   -- poem "In Memoriam", Alfred Lord Tennyson
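As a follow-up on making the Googletest reports "readable by everyone": Googletest can write an XML report when a test binary is run with `--gtest_output=xml:report.xml`, and a short script can boil that down to a plain summary. The sketch below assumes the standard gtest XML shape; the sample report and the test/suite names in it are purely illustrative, not taken from an actual AOO build:

```python
import xml.etree.ElementTree as ET

def summarize_gtest_xml(xml_text):
    """Summarize a Google Test XML report: (total, failures, failed test names)."""
    root = ET.fromstring(xml_text)  # the <testsuites> root element
    total = int(root.get("tests", "0"))
    failures = int(root.get("failures", "0"))
    # A failing <testcase> contains one or more <failure> child elements.
    failed = [
        "%s.%s" % (suite.get("name"), case.get("name"))
        for suite in root.iter("testsuite")
        for case in suite.iter("testcase")
        if case.find("failure") is not None
    ]
    return total, failures, failed

# Illustrative report in the shape gtest emits with --gtest_output=xml:
SAMPLE = """<?xml version="1.0"?>
<testsuites tests="2" failures="1" name="AllTests">
  <testsuite name="CalcTests" tests="2" failures="1">
    <testcase name="SumWorks" status="run" classname="CalcTests"/>
    <testcase name="DivByZero" status="run" classname="CalcTests">
      <failure message="expected exception"/>
    </testcase>
  </testsuite>
</testsuites>"""

total, failures, failed = summarize_gtest_xml(SAMPLE)
print("%d of %d tests failed: %s" % (failures, total, ", ".join(failed)))
```

Something like this could run at the end of the build scripts and post the summary to the wiki or the list, which might be one answer to the "tooling" question above.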
