Hi Thomas,
Thomas Roswall wrote:
1) the scenario-concept
I understand the idea that it should be a sort of meaningful grouping of
tests. But in e.g. calc - formatting, we have almost only tests that are
already tested in calc1+calc2. That means extra work if the three scenarios
are not assigned to the same tester, and extra work for testers makes the
process slower.
Each scenario is a subset of all available test cases. Scenarios need
not be distinct.
The calc1 - calc2 - writer1 - writer2 ... scenarios are test sets
grouped by application. At the moment they are almost complete sets for
those applications (meaning they group all currently available test
cases for one application).
I assign those scenarios so that most of the test cases are assigned to
a tester (e.g. for "Beta" testing or localization testing).
The release-sanity scenario is a reduced set of all those tests. I use
this scenario for (final) RC tests.
The formatting scenario is focused on special functionality (formatting)
across all applications. This is only useful if you do a special
"formatting" test.
All these three "classes" of scenarios should be used separately.
This should be put on the wiki in a special section "TCM Concepts -
scenarios". Feel free to take my text (if it is understandable to you)
and put it on the wiki.
2) typos/errors in the testcases
On the wiki page it is not clear what to do when a test case is not very
clear or is insufficient for your own language. I have a few tests that
fail if you follow the case word by word (e.g. test case 109274), but
that certainly pass in my opinion. And some tests are not quite clear,
e.g. the drag and drop between OOo apps (109298): text and fields can be
copied with drag and drop, while objects can only be copied with
copy-paste. Given that copy-paste works, I am not sure whether the test
is passed or not.
For typos in your language the answer is easy: any SQE / MGR can fix
those typos. (This is what should be written in the "SQE - Translating
test cases" section.)
If there are errors in the English description (or it is unclear), you
may ask here on the list or file an issue. We do not have a special
process in place for that, so I cannot tell you how to document it.
Third problem - a test fails because it does not meet the expected
result, but passes in the context of your daily work. (Or vice versa -
the test meets the expected result but there are other visible errors.)
I'd like to see an option "passed with errors" for that. We would need
to go through the TCM wishes again and work on implementing them.
For the moment, the only solution I see is to set passed / failed
case by case and leave a comment.
André