Hello Sebastien. Great questions; let me give you some answers!
> - do you have any idea of what applications you would like to see tested?
Yes, I am targeting the default installed desktop applications.
However, this is in no way limited to the default applications.
Ultimately it would be awesome to see testcases for all of the "popular"
applications people like to use on Ubuntu.
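To give an idea of the format, a manual testcase in the checkbox
application tests looks roughly like this (the application and steps
here are only an illustration I made up, not an actual testcase from
the suite):

    PURPOSE:
        Check that gedit can create and save a text file
    STEPS:
        1. Open gedit
        2. Type a few lines of text
        3. Save the file to your home directory and close gedit
        4. Re-open the file in gedit
    VERIFICATION:
        Does the file open and contain the text you typed?

The tester follows the steps and then marks the testcase as passed or
failed based on the verification question.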
> - who will run those tests and when?
These tests will be run during the beta1 and beta2 cycles. That's this
week, and the last week in March for beta2. The tests will be run by
"normal" users; ranging from our regular set of testers (awesome work
you guys do, thank you!) who do ISO and SRU testing, to those folks who
just want to try out the new release and are more casual testers/users.
> - who will deal with the feedback, when and in which way?
Right now I plan on gathering the feedback and publishing it for
everyone to see. All of the results will be on the results tracker in
Launchpad, so anyone can see them as they are submitted. I plan to share
aggregate details on my blog, and hopefully on a public-facing website
showing the top testers, etc. In addition, I am happy to share
additional details with anyone upon request :-)

As far as your concerns about sorting through all the testing and
getting good bug reports, etc., your concerns are valid. This approach
doesn't scale to thousands of tests and users, but for the moment the
volume will be low enough to allow manual processing. Even if thousands
of cases are submitted, we can look at the aggregate data easily enough
and get useful information out of it. The testers themselves will be
encouraged to submit bug reports, but the aggregated data will be
collected by me and shared with the upstreams involved to ensure any
anomalies are researched and potential bugs filed if found.
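To give a rough idea of what I mean by looking at the aggregate data,
here is a small Python sketch. It assumes the results can be exported
as a simple CSV of (testcase, result) rows; that export format is an
assumption on my part for illustration, not something the tracker
provides today.

    import csv
    from collections import Counter, defaultdict

    # Tally results per testcase from a hypothetical CSV export with two
    # columns: testcase name and result ("pass" or "fail").
    counts = defaultdict(Counter)
    with open('results.csv') as f:
        for testcase, result in csv.reader(f):
            counts[testcase][result] += 1

    # Show the testcases with the most failures first.
    for testcase, tally in sorted(counts.items(),
                                  key=lambda item: item[1]['fail'],
                                  reverse=True):
        print(testcase, dict(tally))

Even something this simple is enough to spot the testcases that are
failing most often and hand that list to the relevant upstreams.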

Finally, I will add that this entire testing loop is still a work in
progress. We want to refine this loop and the tools we use as we go into
the next cycle. Expect to see some sessions at UDS around this topic. I
hope to get some feedback and ideas we can use to help shape the Q cycle
testing.

Thanks!

Nicholas

On 02/24/2012 05:24 AM, Sebastien Bacher wrote:
> On 24/02/2012 01:33, Nicholas Skaggs wrote:
>> As part of the Precise cycle, the Ubuntu QA team has been looking to
>> increase manual application testing. As part of this, I have extended
>> checkbox to serve up manual tests to testers to test Ubuntu
>> applications post-installation. We need your help! If you're an
>> application developer who wants testing on your application, I would
>> like your testcases included in the checkbox application tests for
>> beta1. 
> Hey,
>
> Thanks for that, it's always great to have people looking at improving
> the desktop ;-)
>
> I have some questions though:
>
> - do you have any idea of what applications you would like to see tested?
> - who will run those tests and when?
> - who will deal with the feedback, when and in which way?
>
> Having things tested is great, but I think we should figure out how we
> deal with the feedback before we start doing a lot of testing this way.
>
> I've been working a bit with unity-checkbox to help Didier in the
> previous Unity update round, and dealing with the information collected
> is quite some work. It's useful for Unity, where we are upstream and
> have the resources to deal with the issues raised; I'm less sure we can
> do a useful job of it on the applications with our current structure
> and "workforce"... we don't have the people to do upstream work, and to
> be fair we already know about quite a few issues that we should fix and
> haven't yet, through bug reports.
>
> Could you picture how you would see the feedback loop working? Would
> the QA team read those reports and turn issues into bugs for those
> which are not already known? Or...?
>
> Cheers,
> Sebastien Bacher
>
>

-- 
ubuntu-desktop mailing list
ubuntu-desktop@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-desktop
