Hi Bert,
let me try to clarify a couple of things.
We are talking about two different processes:
- linguistic review
- l10n testing
Linguistic review is performed on translated files, and its main aim is to
catch errors such as grammar and spelling mistakes, terminology
inconsistencies, translation errors, etc. Such a review is always
performed by native speakers because they are the only ones able to
guarantee language quality. The errors found during review are mostly
corrected by the translators, and so that they know in which files to
correct the mistakes we use a spreadsheet, which we call a
*Q[uality]A[ssurance] form* (similar to the ones Alpha provided you when
you performed linguistic review of the SW messages) - see the attachment
for an example of the QA matrix we used for Italian.
L10n testing is performed on built products, following test cases. L10n
testing is usually performed by engineers or people with more technical
skills, who need not necessarily be native speakers but do need to
understand the language. During l10n testing you check for things like
corrupted characters, corrupted or truncated messages (where the
translation is too long), main menu mnemonics that are not set
correctly, and any other localization-relevant issues... For l10n
testing, we recommend using TCM, since it contains the most up-to-date
test cases and helps carry out the testing in a more systematic way. All
issues found are filed in issuezilla and follow a precise bug handling
process. Note that TCM only has test cases for the UI, not for the Help.
There are no test cases for the Help because for the content you usually
perform a linguistic review....
I was able to provide you with builds in which the UI contains most of
the fixes carried out so far. On this build, you can perform l10n testing.
As for the Help, the main problem was that our technical writers carried
out these changes in the source, and the changes were reflected in the
translated languages only recently ... and the volume was much larger
than expected. Alpha reacted promptly and is really doing its best to
have this large volume translated in such a short time....
I hear your frustration and I am very sorry about this. How can I help?
Rafaella
Bert Meersma wrote On 10/17/05 19:18,:
Hi Rafaella,
Comments inline.
Rafaella Braconi wrote:
Hi Bert,
Bert Meersma wrote On 10/15/05 20:12,:
Hi Rafaella, and all others,
We did a quick scan of the latest version of the help. Unfortunately
we found a lot of pages that are still in English.
Yes, most fixes we integrated into the build refer to the UI. For the
Help, Alpha is translating a quite large volume of Help files which
they will deliver to us next week. Only after Alpha's delivery, I
will again be able to provide you with a build, and this time it
should contain full translated Help.
Since Alpha is still translating a large portion of the files, it
doesn't make much sense to me to start reviewing now. We would
encounter a lot of issues that have probably already been solved by Alpha.
It also wouldn't make sense for Ian to start fixing issues in the
current build when there will be an update from Alpha.
<snip>
Until now, there are 3 people (including me) who volunteered to
review the help. With only these 3 persons it would take quite a
while before the work is finished. Is there anything you can do
about this?
How much time have you scheduled for review? If I can make a
suggestion, I would not review all of the Help.... You could decide, for
example, to perform a 10% spot review across the Help files. Usually
a 10% review finds the most common mistakes (which, for example,
need to be fixed across the whole Help, or at least in that
particular module), such as grammar, spelling, and terminology issues,
and to catch the most common ones it is not necessary to perform a
100% review of all files.
We didn't really schedule time for it, but the way it looks now it
would take months. Reviewing only 10% would also not be satisfactory.
But I'm confident that once Alpha has finished translating and we get
a build with a fully translated set of help files, we could review
it in a couple of weeks (days perhaps).
I was under the impression that Alpha had already finished and
that the build we got had the fully translated help files. Apparently
this is not the case, so that's what all this is about.
Also, in order to track all the valuable feedback you are gathering
during review, I would encourage you to use the QA form spreadsheet
we successfully used with other N-L Communities that performed review.
Can you tell me where this spreadsheet is? And what the procedure is
in using this spreadsheet? And another thing is that I had a
discussion with Petr and Ian about the way we should create issues. I
suggested the use of a spreadsheet in the first place, but it was
dismissed. So do Petr and Ian know about this QA spreadsheet?
Let me know if this is something you would like to try and I can
provide the QA form templates, etc.
If we're going to wait for Alpha to finish the translation, I think we
have some time to look into this. But I must say that I'm getting a
little confused by all the systems and procedures (or the lack
thereof). What is the relation between TCM and the QA forms you're
referring to here? Say we were going to use TCM: would we still need
to use the QA forms?
Kind regards,
Bert Meersma
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]