Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-05 Thread Nino Novak
On Monday 05 March 2012, 10:41:06 Petr Mladek wrote:

 No, we already have support for translating the test cases. I am sure
 that we will be able to do it even more cleanly in the future.

Is it correct that the Litmus UI is not localized yet? (Is it localizable at 
all?)

Is there (or will there be) a way for a tester to tag test cases, so 
that tests can be grouped deliberately? (I'm dreaming of an individual set 
of test cases that I'm sort of subscribed to: that way, great coverage could 
be achieved if there are a few people and everybody subscribes to a different 
- individual - set.)

Thanks,
Nino
___
List Name: Libreoffice-qa mailing list
Mail address: Libreoffice-qa@lists.freedesktop.org
Change settings: http://lists.freedesktop.org/mailman/listinfo/libreoffice-qa
Problems? http://www.libreoffice.org/get-help/mailing-lists/how-to-unsubscribe/
Posting guidelines + more: http://wiki.documentfoundation.org/Netiquette
List archive: http://lists.freedesktop.org/archives/libreoffice-qa/


Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-05 Thread Cor Nouws

Petr Mladek wrote (05-03-12 10:41)


I agree that reviewing strings is important. I am just trying to explain that
we could do many more tests if we separate functional and translation
tests.


I hope it also makes Litmus testing more attractive when people 
(realise they) can choose tests in the areas that are most 
interesting to them.


Also, I expect there are people more interested in checking translations 
and others more interested in features. And of course people interested 
in both.


In the description, promotion, explanation on the wiki and such, it 
might help to state e.g. "Using a function frequently? Then take the 
chance to test it in our latest release!"


--
 - Cor
 - http://nl.libreoffice.org



Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-05 Thread Nicholas Skaggs
I'm chiming in here just to mention that the Ubuntu QA community and
the Ubuntu One QA folks are looking at using the new tool written by
Mozilla called Case Conductor. It's the successor to Litmus. How this
will fit into our workflow and testing needs in the future is still TBD.
If your group is interested, I'd like to include you in these
discussions and prototyping.

Nicholas

On 03/02/2012 03:13 PM, Michael Meeks wrote:
 On Fri, 2012-03-02 at 19:21 +0100, Petr Mladek wrote:
 Name: Translation check of creating a new database
 ...
   * the database wizard opens: all strings in the dialog box and
 window are correctly localized to your own language.
   So - this looks pretty odd to me :-) This string cropping problem is
 a constant annoyance, and yet it seems (to me) that we can verify at
 compile time that there is little-to-no string cropping (assuming a UI
 font with a sane width).

   Surely, as we translate each dialog, we can (at least on Linux with
 freetype etc.) calculate the size of each string, and check that it
 doesn't overlap with any other strings in the dialog.

   Surely we can also check that we have 100% translation by other means -
 checking the .po files etc.

  Of course, some sanity check is good too, but if we could automate this
 - would it save a lot of time?

   All the best,

   Michael.




Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Petr Mladek
Sophie Gautier wrote on Fri, 02. 03. 2012 at 14:33 +0100:
 - are the tests on Litmus good enough or should I write more?

Great point. We have very few test cases in Litmus (fewer than 50),
and their quality is debatable :-/

For example, I see test cases:

+ create empty Writer document
+ create empty Calc document

I do not think that we need manual tests for these. This basic operation
is part of every more complex test. In addition, exactly this is tested
within a few seconds by the smoketest.


Another bunch of tests sounds like:

+ Translation check of creating a new database
+ Translation check when creating a table in a database
+ Translation check for Formula Editor


Of course, we need to check that the application is translated but we
can't check every dialog manually. Instead of the above particular
dialogs, we should check that different elements are localized, for
example:

+ File/New menu - because it consists of optional components
  that are added from XML registry files
+ main menu and one submenu
+ a dialog with tabs, check boxes, combo boxes, itemized lists,
  and other elements
+ help - because it uses a different technology than the other
  dialogs
+ KDE/GNOME save dialogs - because they use a different technology
  as well
+ extensions - because the translation is done in a slightly
  different way

If one submenu is localized, the other submenus should be localized as
well if the strings are in pootle.


IMHO, we need to discuss what test cases make sense and create
reasonable test cases first.

We are still looking for an experienced QA guy who could step in, teach
people and drive this forward.


Best Regards,
Petr


Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Petr Mladek
Petr Mladek wrote on Fri, 02. 03. 2012 at 16:03 +0100:
 IMHO, we need to discuss what test cases make sense and create
 reasonable test cases first.

BTW: My (draft) thoughts about what a good test case looks like can be
found at
http://wiki.documentfoundation.org/QA/Testing/Test_Case#Good_Test

Feel free to update the wiki page. It is not mine. I just entered the
initial content ;-)


Best Regards,
Petr


Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Sophie Gautier

Hi Petr,

On 02/03/2012 16:03, Petr Mladek wrote:

Sophie Gautier wrote on Fri, 02. 03. 2012 at 14:33 +0100:

- are the tests on Litmus good enough or should I write more?


Great point. We have very few test cases in Litmus (fewer than 50),
and their quality is debatable :-/

For example, I see test cases:

+ create empty Writer document
+ create empty Calc document

I do not think that we need manual tests for these. This basic operation
is part of every more complex test. In addition, exactly this is tested
within a few seconds by the smoketest.


Another bunch of tests sounds like:

+ Translation check of creating a new database
+ Translation check when creating a table in a database
+ Translation check for Formula Editor


Well, I don't think you get what Litmus was made for. It was for 
community testing at large, so very easy and short tests to spark 
interest in testing. It should also have helped localizers test their 
version, just as we did in the past, and it worked well. Some spent only 
30 minutes, others more than 3 hours, because the online tests were only 
the very basis of larger tests with a set of documents. So it's more 
about the life of a team than just a basic test. Unfortunately, we don't 
have the right tool here and no money to develop what could suit our 
needs. Mozilla was developing a tool, but it's not done yet either.



Of course, we need to check that the application is translated but we
can't check every dialog manually.


We had that in the past with the VCLTesttool.

Instead of the above particular dialogs, we should check that different
elements are localized, for example:

+ File/New menu - because it consists of optional components
   that are added from XML registry files
+ main menu and one submenu
+ a dialog with tabs, check boxes, combo boxes, itemized lists,
   and other elements
+ help - because it uses a different technology than the other
  dialogs
+ KDE/GNOME save dialogs - because they use a different technology
  as well
+ extensions - because the translation is done in a slightly
  different way

If one submenu is localized, the other submenus should be localized as
well if the strings are in pootle.


It's not only about localization (though it's good for CTL and CJK), but 
also about the design of the dialog, which must allow the whole string to 
be seen so that we can adapt the dialog or the l10n. It's not about seeing 
whether it works; it's about the quality of the l10n and of the design.



IMHO, we need to discuss what test cases make sense and create
reasonable test cases first.

We are still looking for an experienced QA guy who could step in, teach
people and drive this forward.


So let's wait for that guy.

Kind regards
Sophie
--
Founding member of The Document Foundation

Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Sophie Gautier

Petr,

First, I don't take anything personally in your mail. I disagree with you, 
but it's nothing personal :)


On 02/03/2012 17:26, Petr Mladek wrote:

Sophie Gautier wrote on Fri, 02. 03. 2012 at 16:20 +0100:

Another bunch of tests sounds like:

+ Translation check of creating a new database
+ Translation check when creating a table in a database
+ Translation check for Formula Editor


Well, I don't think you get what Litmus was made for. It was for
community testing at large, so very easy and short tests to spark
interest in testing. It should also have helped localizers test their
version, just as we did in the past, and it worked well. Some spent only
30 minutes, others more than 3 hours, because the online tests were only
the very basis of larger tests with a set of documents. So it's more
about the life of a team than just a basic test. Unfortunately, we don't
have the right tool here and no money to develop what could suit our
needs. Mozilla was developing a tool, but it's not done yet either.


I appreciate that you want to teach people using Litmus. Though, I am
afraid that you did not get my point.


I don't want to teach them using Litmus; I want them to get interested, 
have fun, and not feel harassed by the task.




Please read the above-mentioned test cases. One test describes how to get
into one dialog and asks you to check that all strings are translated.
Another test case describes how to reach another dialog where the
strings need to be checked.

The check for the translation is a secondary purpose of the test; the first 
purpose is to check basic functionality such as Save As, Open, 
Copy, Paste... etc.


IMHO, there are hundreds or thousands of dialogs. We do not want
a test case for every single dialog. We do not have enough people who
could create, translate, and process all such test cases.


We are testing functionality and, at the same time, checking for 
basic i18n conversion (numbers, accented characters, dates, size of 
the strings...)


Also, I am not sure whether it would be effective to use Litmus for this type
of testing. It might take a few seconds to check that all strings are in a
given language. It might take much longer to enter your result in
Litmus and select another test case.


Litmus should be an entry point into QA for the community at large, 
i.e. no language barrier, no technical barrier, and a team behind it to 
guide you further into more complex testing. Unfortunately, it's not a 
tool adapted to our needs.


IMHO, we could do a much better job here. If we have strings translated in
pootle and the build works correctly, all translated strings are used.
In other words, if you have translations for 1000 dialogs in pootle, it
is enough to QA only 1 dialog. The strings are extracted from pootle by
a script and applied to the sources by another tool. If one string is used,
the others are used as well[*].


As I said, I'm not speaking about translation. The contents of the test 
may confuse you when they speak about localization, but it's only a 
secondary purpose of the test, a *while you are here*: please check that 
the dialog has the right special characters in your language.


You might say that you need to check the layout of the strings, i.e. that
they are not cropped. Well, we do not need to check all strings here. It
might be enough to check only the strings that look risky (where the
translation is much longer than the original string).


No, it's not enough, because most of the time, the team doing the 
translation is one person only, so you can't remember where and when the 
translation is longer than the original, and for some languages it's 
always true.


You might say that we should check the quality of the translation, I mean
whether the translation makes sense in the context of the given dialog. Well,
this is not mentioned in the current test case. Also, I am not sure it is
worth the effort. We do not change all strings in every release,
so we do not need to check all translations.


When you see the number of strings relative to the number of people doing 
translation, having a proofreading of the dialogs during QA is not a 
luxury ;) But I agree, as said, it's not the first aim of the tests.




Of course, we need to check that the application is translated but we
can't check every dialog manually.


We had that in the past with the VCLTesttool.


Hmm, how did the VCLTesttool help here? Did it check that a string was
localized? Did it check whether a translation was cropped or confusing?


It took a snapshot of each dialog, menu, submenu, etc. When you want to 
reach a certain level of quality for your version, it was very useful 
because you were sure that everything had been checked. I don't say that you 
should run it on each version, but I did it on each major OOo version.



Instead of the above particular dialogs, we should check that different
elements are localized, for example:

+ File/New menu - because it consists of optional components
that are

Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Sophie Gautier

Hi Petr,
On 02/03/2012 19:21, Petr Mladek wrote:

Sophie Gautier wrote on Fri, 02. 03. 2012 at 18:02 +0100:

Petr,

First, I don't take anything personally in your mail. I disagree with you,
but it's nothing personal :)


I hope that we could learn from each other :-)


yes :)



On 02/03/2012 17:26, Petr Mladek wrote:

I appreciate that you want to teach people using Litmus. Though, I am
afraid that you did not get my point.


I don't want to teach them using Litmus; I want them to get interested,
have fun, and not feel harassed by the task.


Sure. This is my target as well. The translation checks looked boring 
at first glance.


QA has a great potential to get boring ;)



We are testing functionality and, at the same time, checking for
basic i18n conversion (numbers, accented characters, dates, size of
the strings...)


Ok, so one example of the current test:

--- cut ---
Name: Translation check of creating a new database

Steps to Perform:

   * Open a new database file (New → Database) and check [Create a
     database], then click on the Next button.
   * Check [Yes, I want the wizard to register the database] and
     [Open the database for edition] and click on Finish.
   * Enter a name for the database (using special characters in your
     language) in the dialog box and click OK.

Expected Results:

   * The database wizard opens: all strings in the dialog box and
     window are correctly localized to your own language.
--- cut ---

Ok, it checks translation and functionality.

Do we really need to check the functionality in all 100 localizations?


It's only checked in 5 or 6 languages, even fewer if you look at the poll 
I ran on the l10n list.



IMHO, if the database opens in English, it opens in all localizations.
We do not need to force 100 people to spend time on this functional
test.

Do we need to check the translation even when the strings were not changed
between releases?


Yes, because the number of strings in the database is really big and you 
need more than two eyes to check the quality.


=> I strongly suggest separating translation and functional checks. It
is very ineffective to test them together.


You spare some resources; most of the time, tests are done by people in 
their native language. Do you want to run them only in English?


Thanks to Rimas, we can mark test cases as language-dependent and
language-independent, so we have great support for this separation.

Yes, but again, this won't change much about the translation of the test 
cases; testers will need to run them in their language.



Litmus should be an entry point into QA for the community at large,
i.e. no language barrier, no technical barrier, and a team behind it to
guide you further into more complex testing. Unfortunately, it's not a
tool adapted to our needs.


I agree with you. I am just saying that many of the current test cases sound
crazy as they are and might point people in the wrong direction.


yes, this is why Litmus is not adapted.




As I said, I'm not speaking about translation. The contents of the test
may confuse you when they speak about localization, but it's only a
secondary purpose of the test, a *while you are here*: please check that
the dialog has the right special characters in your language.


Yes, it is confusing because they mix the translation and functional
tests. All I want to say is that it is not effective and we should not
go this way.


OK, let's try without checking for the translation; we can remove the 
specific directions about language in the test.




No, it's not enough, because most of the time, the team doing the
translation is one person only, so you can't remember where and when the
translation is longer than the original, and for some languages it's
always true.


We could use some scripting here. Andras is interested in the
translation stuff. I wonder if he has time and could help here.
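
For illustration, a minimal sketch of what such a script could look like
(assuming Python with the polib package; the length ratio and minimum
length below are arbitrary placeholders, not an agreed threshold):

#!/usr/bin/env python3
# Sketch: flag translations that are much longer than the source string,
# as candidates for a manual layout check. Assumes the "polib" package.
import sys
import polib

RATIO = 1.5    # flag msgstr more than 1.5x the length of msgid (assumed)
MIN_LEN = 10   # skip very short strings, where the ratio is too noisy

def risky_entries(po_path, ratio=RATIO, min_len=MIN_LEN):
    po = polib.pofile(po_path)
    for entry in po.translated_entries():
        if len(entry.msgid) >= min_len and \
           len(entry.msgstr) > ratio * len(entry.msgid):
            yield entry

if __name__ == "__main__":
    for path in sys.argv[1:]:
        for entry in risky_entries(path):
            print(f"{path}: {entry.msgid!r} -> {entry.msgstr!r}")

Run over one language's .po files, it would print a short list of strings
worth eyeballing instead of requiring a dialog-by-dialog pass.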



You might say that we should check the quality of the translation, I mean
whether the translation makes sense in the context of the given dialog. Well,
this is not mentioned in the current test case. Also, I am not sure it is
worth the effort. We do not change all strings in every release,
so we do not need to check all translations.


When you see the number of strings relative to the number of people doing
translation, having a proofreading of the dialogs during QA is not a
luxury ;) But I agree, as said, it's not the first aim of the tests.


Sure. On the other hand, checking 1000 dialogs because you changed only
20 of them is no luxury either.


agreed




We had that in the past with the VCLTesttool.


Hmm, how did the VCLTesttool help here? Did it check that a string was
localized? Did it check whether a translation was cropped or confusing?


It took a snapshot of each dialog, menu, submenu, etc. When you want to
reach a certain level of quality for your version, it was very useful
because you were sure that everything had been checked. I don't say that you
should run it on each version, but I

Re: [Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?

2012-03-02 Thread Michael Meeks

On Fri, 2012-03-02 at 19:21 +0100, Petr Mladek wrote:
 Name: Translation check of creating a new database
...
   * the database wizard opens: all strings in the dialog box and
 window are correctly localized to your own language.

So - this looks pretty odd to me :-) This string cropping problem is
a constant annoyance, and yet it seems (to me) that we can verify at
compile time that there is little-to-no string cropping (assuming a UI
font with a sane width).

Surely, as we translate each dialog, we can (at least on Linux with
freetype etc.) calculate the size of each string, and check that it
doesn't overlap with any other strings in the dialog.

Surely we can also check that we have 100% translation by other means -
checking the .po files etc.

Of course, some sanity check is good too, but if we could automate this
- would it save a lot of time?
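
A rough sketch of the kind of automated check meant here, assuming Python
with polib (to read the .po files) and Pillow's FreeType support (to
measure rendered string widths). The font path, size and width margin are
placeholders, and without the real dialog geometry it can only flag
translations that render much wider than the English source, not actual
overlaps:

#!/usr/bin/env python3
# Sketch: report how complete a .po file is and flag strings whose
# rendered width greatly exceeds the English source, as cropping
# candidates. Assumes the "polib" and "Pillow" packages.
import sys
import polib
from PIL import ImageFont

FONT_PATH = "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"  # assumed UI font
FONT_SIZE = 11
MARGIN = 1.3   # allow translations up to 30% wider than the source (assumed)

def check_po(path):
    font = ImageFont.truetype(FONT_PATH, FONT_SIZE)
    po = polib.pofile(path)
    print(f"{path}: {po.percent_translated()}% translated")
    for entry in po.translated_entries():
        src_w = font.getlength(entry.msgid)
        dst_w = font.getlength(entry.msgstr)
        if src_w and dst_w > MARGIN * src_w:
            print(f"  possible cropping: {entry.msgid!r} -> {entry.msgstr!r}")

if __name__ == "__main__":
    for po_path in sys.argv[1:]:
        check_po(po_path)

A real compile-time check would also need the widget sizes from the dialog
resources, but even this crude pass answers "is it 100% translated?" and
"which strings are suspiciously wide?" without a manual click-through.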

All the best,

Michael.

-- 
michael.me...@suse.com  , Pseudo Engineer, itinerant idiot
