Thanks, Brian.

The testcases haven't been a high priority for me (*1), and I'd
[mentally] pushed the testcase revisions to the next cycle, given we've
already missed testing week for kinetic.

I'd been trying to focus on promoting the Testing Week and responding
to anything I see on social media, Discord, etc. for kinetic; alas, not
a lot has happened there.

My focus for the little time remaining in the kinetic cycle is
testing, which in my opinion looks good for Lubuntu (*2).

Thanks again

Chris g.


*1  I actually like what we're [Lubuntu; Leó & myself mostly] doing
now, though I acknowledge it's far from ideal for occasional testers
or newbies.

*2
https://discourse.lubuntu.me/t/lubuntu-kinetic-kudu-22-10-issue-tracker/3589
  where the only remaining issues will be documented in our release
notes


On 10/12/22, Brian Murray <br...@ubuntu.com> wrote:
> On Thu, Sep 01, 2022 at 03:24:52PM -0700, Brian Murray wrote:
>> On Mon, Aug 29, 2022 at 11:17:55AM +1000, Chris Guiver wrote:
>> > Lubuntu's QA is mostly handled on
>> > https://phab.lubuntu.me/w/release-team/testing-checklist/ with a note
>> > at the top that doc is supposed to be unnecessary after
>> > https://phab.lubuntu.me/T56 which pushes us (Lubuntu) back to
>> > iso.qa.ubuntu.com...
>> >
>> > fyi: most install tests are done by Leó (Leokolb) & myself; and
>> > personally I like our checklist, as I can choose the oldest test
>> > performed & redo/update it, instead of a fresh page for each daily...
>> >
>> > We have some 'updated' checklists, created long ago, whose review
>> > was never completed.
>> >
>> > 1.   Is there a guide to the formatting? I.e. I wanted more than one
>> > line at the top so created two <em> lines (emphasis?), but is there
>> > somewhere where what I-can-do | I-cannot-do is defined?  I gather the
>> > labels are DL = description list, DT = description term within DL,
>> > DD = description & EM = emphasis/strong.
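
[Editorial aside: a minimal sketch of how those four tags might fit
together in a test case body. The layout here is an assumption for
illustration, not copied from an actual iso.qa.ubuntu.com test case.]

```html
<!-- Hypothetical test case body, assuming the tags named above are allowed -->
<em>Read the release notes before testing.</em>
<em>Back up your data before installing.</em>
<dl>
  <dt>Boot the ISO</dt>
    <dd>The live session starts without errors</dd>
  <dt>Start the installer</dt>
    <dd>The installer opens and displays the welcome page</dd>
</dl>
```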
>>
>> As far as I know there is not a guide to the formatting, and looking at
>> the admin portion of the iso.qa.ubuntu.com site, the test cases section
>> says "some html is allowed", which isn't terribly helpful. However, I
>> tried multiple <em> lines and that did work. I am also happy to try
>> other experiments as necessary.
>>
>> > 2.   Tests end up MANDATORY or OPTIONAL, where is that set?
>>
>> That is set in the admin portion of the site when creating a "testsuite"
>> of test cases.
>>
>> > 3.   Is there a tool where I can view the created testcase in somewhat
>> > PREVIEW state (without codes) so I can re-read & hopefully detect
>> > errors?
>>
>> Not at this point in time but I tried copying and pasting a test case
>> into the body section at
>> https://www.w3schools.com/html/tryit.asp?filename=tryhtml_intro and it
>> worked okay. However, the numbered list and bullet points did not
>> appear. So maybe that isn't useful.
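
[Editorial aside: as a stopgap for question 3, a small local preview
sketch. It wraps a raw test case fragment in a complete page and writes
it to disk so it can be opened in a browser; the fragment, file name,
and helper are hypothetical conveniences, not an official tool.]

```python
# Wrap a test case HTML fragment in a minimal full page for local preview.
import pathlib
import webbrowser

# Example fragment in the style under discussion (dl/dt/dd and em tags).
FRAGMENT = """
<em>Read the release notes before testing.</em>
<dl>
  <dt>Boot the ISO</dt>
  <dd>The live session starts without errors</dd>
</dl>
"""

def preview(fragment, path="preview.html", open_browser=False):
    """Write the fragment into a complete HTML page and return the page text."""
    page = f"<!DOCTYPE html>\n<html><body>\n{fragment}\n</body></html>\n"
    out = pathlib.Path(path)
    out.write_text(page, encoding="utf-8")
    if open_browser:  # off by default so the sketch also runs headless
        webbrowser.open(out.resolve().as_uri())
    return page

print(preview(FRAGMENT))
```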
>>
>> > 4.   Is there a guide for reviewers I can read?
>> >
>> > Walter (wxl) originally created the list in our QA checklist; the
>> > issue is we now have lots (I created a guide to understanding them
>> > here:
>> > https://discourse.lubuntu.me/t/testing-checklist-understanding-the-testcases/2743),
>> > and I sure don't want them all mandatory. I also consider the FOUR
>> > BIOS installs as roughly equivalent (variations of encryption,
>> > internet & swap), our FOUR EFI installs roughly the same, as well as
>> > the FOUR Secure-uEFI ones. To counter this I've a testcase [script]
>> > that attempts to get a tester to select one & run it (so one BIOS
>> > can be mandatory, one uEFI mandatory, one Secure-uEFI...), but that's
>> > also more complex than the four testcases each script replaced.
>>
>> It's not clear to me exactly what you are asking for here. However, I'd
>> much rather there be separate test cases for each different scenario
>> than have people choose an installation type and enter which one they
>> chose in a comment. The latter would make it harder for people to know
>> which ones are left to test and for the release team to know which tests
>> are outstanding. I imagine you were trying to reduce duplicating the
>> same test cases with minor variations (install type) but I think my next
>> comments address that.
>>
>> > Any advice or direction would be appreciated.
>>
>> I do want to mention some changes that Dave made to the Raspberry Pi
>> test cases that are quite useful though. In the definitions folder[1] of
>> ubuntu-manual-tests there is a pi_desktop_cases.xml file which contains
>> a series of tests and then has multiple test case ids which reference
>> those tests. A script[2] is then run to generate the test case files
>> which are put into the iso.qa.ubuntu.com site. This reduces the amount
>> of duplicate information and the need to update multiple test case
>> files: e.g. if something changes in the installer, you can update it in
>> the xml file and generate new test cases instead of having to edit each
>> test case. If you are going to be working on adding new test cases, I'd
>> strongly suggest starting with this new format.
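
[Editorial aside: the expansion step described above can be sketched
roughly as follows. The element names ("test", "testcase", "include")
and the sample ids are assumptions for illustration; the real schema
and generator live in ubuntu-manual-tests, not here.]

```python
# Hypothetical sketch: expand shared test definitions into per-testcase
# bodies, so one edit to a definition updates every case that includes it.
import xml.etree.ElementTree as ET

DEFINITIONS = """
<definitions>
  <test id="boot">
    <dl><dt>Boot the image</dt><dd>The live session starts</dd></dl>
  </test>
  <test id="install">
    <dl><dt>Run the installer</dt><dd>Installation completes</dd></dl>
  </test>
  <testcase id="example-case">
    <include ref="boot"/>
    <include ref="install"/>
  </testcase>
</definitions>
"""

def expand(xml_text):
    """Return {testcase id: concatenated HTML of its referenced tests}."""
    root = ET.fromstring(xml_text)
    tests = {t.get("id"): t for t in root.findall("test")}
    cases = {}
    for case in root.findall("testcase"):
        parts = []
        for inc in case.findall("include"):
            # Serialize the body of each referenced <test> block.
            for child in tests[inc.get("ref")]:
                parts.append(ET.tostring(child, encoding="unicode").strip())
        cases[case.get("id")] = "\n".join(parts)
    return cases

print(expand(DEFINITIONS)["example-case"])
```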
>
> I've worked on converting some of the desktop test cases to use the xml
> template and that can be found in this file:
>
> https://git.launchpad.net/ubuntu-manual-tests/tree/definitions/basic_installation.xml
>
> The end of the file contains the actual test case ids and which tests
> they include:
>
> https://git.launchpad.net/ubuntu-manual-tests/tree/definitions/basic_installation.xml#n196
>
> I hope that makes things a little more clear.
>
> Cheers,
> --
> Brian Murray
>

-- 
Ubuntu-quality mailing list
Ubuntu-quality@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-quality
