On Fri, 17 Sep 2010 19:13:33 +0000, Dan Nessett wrote:

> On Fri, 17 Sep 2010 18:40:53 +0000, Dan Nessett wrote:
> 
>> I have been tasked to evaluate whether we can use the parserTests db
>> code for the selenium framework. I just looked it over and have serious
>> reservations. I would appreciate any comments on the following
>> analysis.
>> 
>> The environment for selenium tests is different from that for
>> parserTests. It is envisioned that multiple concurrent test runs could
>> share the same MW code base. Consequently, each test run must:
>> 
>> + Use a db that, if written to, will not destroy the data of other
>> test wikis.
>> + Switch in new images and math directories so that any writes do not
>> interfere with other tests.
>> + Maintain the integrity of the cache.
>> 
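
To make those isolation requirements concrete, these are roughly the 
settings a test run would have to override (the globals are the standard 
ones; the run id and values are purely illustrative):

  # Hypothetical per-run overrides; 'testrun42' stands in for a real run id.
  $wgDBname          = 'wikidb_testrun42';          # private copy of the db
  $wgUploadDirectory = "$IP/images_testrun42";      # private images directory
  $wgMathDirectory   = "$wgUploadDirectory/math";   # private math directory
  $wgCachePrefix     = 'testrun42';                 # keep cache keys separate
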
>> Note that tests would *never* run on a production wiki (it might be
>> possible to do so if they did no writes, but safety considerations
>> suggest they should always run against test data, not production
>> data). In fact, production wikis should always retain the setting
>> $wgEnableSelenium = false, to ensure selenium tests are disabled.
>> 
>> Given this background, consider the following (and feel free to comment
>> on it):
>> 
>> parserTests temporary table code:
>> 
>> A fixed set of tables is specified in the code. parserTests creates
>> temporary tables with the same names but a different static prefix,
>> and those temporary tables are used for the duration of the
>> parserTests run.
>> 
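
In other words, the approach amounts to roughly the following (a 
simplified, MySQL-flavored sketch; the real code differs in detail, and 
the table list and prefix here are only illustrative):

  # Simplified sketch of the parserTests approach.
  $tables = array( 'page', 'revision', 'text', 'image' ); # fixed list in the code
  $dbw = wfGetDB( DB_MASTER );
  foreach ( $tables as $tbl ) {
      $dbw->query( "CREATE TEMPORARY TABLE parsertest_$tbl LIKE $tbl" );
  }
  $wgDBprefix = 'parsertest_';  # subsequent access goes to the prefixed copies
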
>> Problems using this approach for selenium tests:
>> 
>> + Selenium tests on extensions may require extension-specific tables,
>> whose names cannot be enumerated in advance in the core code.
>> 
>> + Concurrent test runs of parserTests are not supported: the temporary
>> tables have fixed names, so writes by parallel test runs would
>> interfere with one another.
>> 
>> + Cleanup after aborted runs requires dropping the leftover (fossil)
>> tables. But if a previous run tested an extension with
>> extension-specific tables, a test of some other functionality has no
>> way to figure out which tables to drop.
>> 
>> For these reasons, I don't think we can reuse the parserTests code.
>> However, I am open to arguments to the contrary.
> 
> After reflection, here are some other problems.
> 
> + Some tests assume certain data already exists in the db. For
> example, the PagedTiffHandler tests assume the image Multipage.tiff is
> already loaded, which requires an entry in the image table. You could
> handle this by cloning the existing image table, but that runs into
> the next problem:
> 
> + Some tests assume certain data is *not* in the db. PagedTiffHandler
> has tests that upload images, and those images cannot already be in
> the image table. So you can't simply clone the image table either.
> 
> All of this suggests to me that a better strategy is:
> 
> + When the test run begins, clone a db associated with the test suite.
> 
> + Switch the wiki to use this db and return a cookie or some other state
> information that identifies this test run configuration.
> 
> + When the test suite runs, each wiki access supplies this state so the
> wiki code can switch in the correct db.
> 
> + Cleanup of test runs requires removing the cloned db.
> 
> + To handle aborted runs, there needs to be a mechanism to time out
> cloned dbs and the state associated with the test run.

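To make that strategy concrete, the switching step might look roughly 
like the following. This is only a sketch: the cookie name, run-id 
format, and db naming convention are placeholders, and in a real 
implementation the logic would presumably live behind the hook discussed 
below rather than sit inline in LocalSettings.

  # Sketch only -- names and conventions are placeholders.
  if ( $wgEnableSelenium && isset( $_COOKIE['selenium_run_id'] ) ) {
      # Sanitize the run id so it is safe to embed in a db name.
      $seleniumRunId = preg_replace( '/[^A-Za-z0-9_]/', '',
          $_COOKIE['selenium_run_id'] );

      # Point the wiki at the db cloned for this run ...
      $wgDBname = "wikidb_selenium_$seleniumRunId";

      # ... plus the per-run images/math/cache overrides sketched earlier.
  }
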
Regardless of how we implement the persistent storage for managing test 
runs, there needs to be a way to trigger its use. To minimize the 
changes to core, we need a hook that runs after LocalSettings (and, by 
implication, DefaultSettings) has been processed, but before any wiki 
state is accessed (e.g., before accessing the db, the images directory, 
or any cached data). I have looked at the existing hooks, but so far 
have not found one that appears suitable.

So, either we need to identify an appropriate existing hook, or we need 
to add a hook that meets the requirements.
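
For concreteness, what I have in mind is something like the following 
(the hook name, its placement, and the handler are purely illustrative; 
the real hook would take whatever arguments the framework turns out to 
need):

  # In Setup.php, after DefaultSettings/LocalSettings have been processed
  # but before the db, upload directories, or caches are touched:
  wfRunHooks( 'SeleniumSetupTestResources' );

  # The selenium framework then registers a handler that switches in the
  # per-run db, directories, and cache prefix:
  $wgHooks['SeleniumSetupTestResources'][] = 'SeleniumTestConfig::switchResources';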

-- 
Dan Nessett


