Tora: See comments below:
-----Original Message-----
>From: Tora <[EMAIL PROTECTED]>
>Sent: Dec 15, 2006 8:01 PM
>To: [email protected]
>Subject: Re: [qa-dev] writing an easy QA howto
>
> ==============================================================
> * UI Freeze: October 5th 2006 (New !! all new strings need ...
> * Feature_freeze: October 12th 2006 (New !! New features ...
> * Translation update 2.1 start: October 12th
> * Translation update 2.1 delivery: October 26th, 2006
> * code_freeze: November 2nd
> * l10n CWS builds available: November 2nd
> * TCM l10n testing on CWS builds: November 2nd - 8th

This is very quick. It also does not allow time for the non-Sun builds to be tested and approved.

> * pre release candidate (en_US only): November 9th 2006

Why only en_US? This build should go out to all NLC teams for testing, and it should be the START of the test cycle. Milestone builds do find problems that DO NOT exist in the en_US version.

> * Last_cws_integration for fixes: November 16th 2006

Problems can and will be found in the RCs. This date should fall about three days before the actual release of RC1, to catch CWSs that exist for Priority 1 items. NO RC SHOULD GO OUT UNTIL ALL PRIORITY ONE ITEMS ARE EITHER FIXED OR MITIGATED (that is, downgraded, or given a workaround that ordinary users will accept until a fix is found).

> * release candidate for all languages: November 23rd 2006
> * release candidate 2 for all languages: December 1st 2006

This worked this time. However, allowances should be made for possible follow-on RCs.

> * Product release: December 12th, 2006

This should be one week after all testing is completed on the non-Sun builds and after it is determined that no new Priority 1 issues exist. This date slipped one week with this release.

> ==============================================================
>
>In the list, the local test is positioned at the very end of the
>process, from release candidate to product release.

Actually, local testing should start as soon as the L10N release is made.
>Many tests had been done before the local test. Each test was done
>from a different perspective. So, now we have a question.
>
>===
>What, and for what purpose, should we test on our local builds before releasing them?
>===
>
>The question could be divided into several smaller questions:
> - What should we mainly test during the local test period, and why?

Simple answer: anything that can be localized should be tested. That means complete translation into the appropriate language of all Help content, menu items, dialog entries, etc. These should be examined for completeness and accuracy. Of course there are going to be disagreements, but this needs to be done for many of the ISO languages.

> - What should we not test at the moment, which would consume time
>   and delay the release process?

Any functionality that is known to be broken should not be retested to see if it has been fixed; just work through the areas that should be localized. Any area missing localization should be investigated and completed. The release process should not be delayed by localization issues; only program functionality issues should delay the release of the program. Some issues will arise because localization was not applied properly and causes the program to stop functioning correctly.

> - (For the next step) How can we eliminate redundancy of test efforts
>   among local communities? The primary functionalities of OpenOffice.org
>   are language independent and identical in every local version.
>   Theoretically, local communities could share test results on the
>   primary functionalities.

This should be true. If I fully test OpenOffice.org on the Mac Intel platform for en_US (the base platform), there should be no need to test the German language version for those same functionalities, only for localization issues. However, THIS IS NOT TRUE.
Due to the manner in which OpenOffice.org is built, problems can arise in any of the localized builds, so each one has to be completely tested, both for functionality issues and for localization issues. Only after an NLC test team has judged that a localized version has no issues that would cause ordinary users to stop using the program, and that ordinary use will not cause data loss, can that version be classified as ready for release to the general public as a final release. This is not to say that issues needing further work, or localization issues, no longer exist. We cannot put out perfect software; we do not have the resources.

>One thing that we could consider:
> The primary and language independent functionalities had been tested
> by the development team and QA team before our local test. What had not
> been tested so far? What is different among local environments?
> That would be the keyboard, input method, font rendering, date-time-currency
> formats, printing to locally purchased printers, interoperability
> with competitors' software tuned and tweaked for the local market,
> and so on.

As I stated, functionality still needs to be tested. There can be, and are, issues unique to a localized build that will not exist in any other build. This has happened in the past and will continue to happen for the near future (the OpenOffice.org 2.x versions, for the time being).

> We should test language dependent functionalities sometime during
> the entire development process. They obviously should not be
> tested only at release time; that is too late and too time consuming. But
> if nobody has tested them at all, some of them should be tested
> before giving the OK to our local builds.

I agree. Localization and functionality tests should start with the L10N releases and continue up to and past release. Issues need to be found by us, those who choose to test OpenOffice.org, and not by ordinary users.
This means the number and length of test cycles will increase. The net result is a much better product for our users.

Given my long experience testing software, these are my thoughts on the QA process:

1. All milestone builds should be run through the complete QA process. Smoketests are not enough to stop regressions. Automated tests are not enough to find functionality and localization issues.

2. Code freeze means exactly that. No new functionality will be incorporated unless:
   a. An issue exists for the code to be added that addresses a functional flaw in the product.
   b. The inclusion of the new code will not cause regression issues.

3. L10N releases are not only there to address language issues. They should be reasonably free from defects and represent the final release, minus full localization. Localization should be the PRIMARY purpose of these releases, but not the only one.

4. Release candidates MUST be put through FULL functional testing. This means using the product as it would be used in a full production environment. If a new use for the product is found, or a possible misuse case is found, it must be fully documented and added both to automated testing (if applicable) and to manual TCM testing. An issue WILL BE created in the Issue Tracker to track the problem and to document its existence. The issue can be closed immediately if it is a misuse case and the product successfully rejects the improper input; if it does not, the issue must be worked within the existing issue system.

5. NO FINAL RELEASE should have unaddressed issues! If this means delaying the release by one or two weeks, so be it. Remember, we are a VOLUNTEER effort with limited support from Sun Microsystems. There are many releases that are produced outside of Sun's control.
These releases need to be run through the QA process just as Sun's builds are, since they may still have unresolved problems, or problems that were not detected in Sun's limited test environment.

All of this is in the interest of giving our users, regardless of their native language, the best product we can give them.

James McKenzie
Mac OS X QA Test Coordinator and Mac OS X QA Test Lead, American English

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
