On Thu, 23 Sep 2010 18:10:37 -0700, Brion Vibber wrote:

> On Thu, Sep 23, 2010 at 4:03 PM, Dan Nessett <[email protected]> wrote:
> 
>> Thinking about this a bit, we seem to have come full circle. If we use
>> a URL per regression test run, then we need to multiplex wiki
>> resources. When you set up a wiki family, the resources are permanent.
>> But, for a test run, you need to set them up, use them and then reclaim
>> them. The resources are the db, the images directory, cache data, etc.
>>
>>
> Computers can delete files as well as create them. Drop the database and
> remove the uploads directory, now it's gone *poof magic*.
> 
> Nothing has to be "multiplexed" or "locked": you create it before you
> start using it, and if you don't need it you can remove it when you're
> done with it. Nothing else will try to use it because nothing else has
> the same test & test run ID -- other tests will use *their* dedicated
> wikis. There is no overlap.
> 
> This is trivial given the idea of "make one dedicated wiki for each test
> run" and a basic understanding of how bulk MediaWiki installations
> operate. If you're trying to help plan how to run automated regression
> testing for MediaWiki *without* having done your basic research on how
> the system works, I strongly recommend you stop and do that first before
> continuing.
> 
> -- brion
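The create-then-drop pattern described above might be sketched as follows. This is only an illustration: the runs directory, the `created` marker file, and the database naming are all hypothetical, not part of any MediaWiki convention.

```python
import os
import shutil
import time

# Hypothetical layout: each test run gets its own directory (holding the
# uploads/images dir) and, in a real setup, its own database named after
# the run ID. RUNS_ROOT is an assumed location, not a MediaWiki default.
RUNS_ROOT = "/tmp/testwiki-runs"

def create_run(run_id):
    """Set up the per-run resources before the test run starts."""
    run_dir = os.path.join(RUNS_ROOT, run_id)
    os.makedirs(os.path.join(run_dir, "images"))
    # Record a creation timestamp so orphans can be reclaimed later.
    with open(os.path.join(run_dir, "created"), "w") as f:
        f.write(str(time.time()))
    # A real harness would also CREATE DATABASE testwiki_<run_id> here.
    return run_dir

def destroy_run(run_id):
    """Reclaim the per-run resources when the run finishes normally."""
    shutil.rmtree(os.path.join(RUNS_ROOT, run_id))
    # ...and DROP DATABASE testwiki_<run_id>.
```

Because each run uses only its own run ID, two concurrent runs never touch the same files or database, which is the "no overlap" property described above.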

I appreciate your recent help, so I am going to ignore the tone of your 
last message and focus on the issues. While a test run can set up, use, 
and then delete the temporary resources it needs (the db, images 
directory, cache entries, etc.), you haven't answered the question I 
posed: if the test run ends abnormally, it will not delete those 
resources. There has to be a way to garbage collect orphaned dbs, images 
directories and cache entries.

I can think of a number of ways to do this. A cron job could periodically 
sweep the locations where those resources reside and reclaim orphans. 
This would require marking them with a lifetime after which they would be 
retired. Alternatively, each time a new test run begins, it could sweep 
for orphaned resources (again using a lifetime marker).
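The sweep-with-lifetime-marker idea could look roughly like this, whether run from cron or at the start of each test run. It is a sketch under stated assumptions: each run directory is assumed to contain a `created` marker file holding its creation timestamp, and the database cleanup is only indicated in a comment.

```python
import os
import shutil
import time

def sweep_orphans(root, ttl_seconds, now=None):
    """Reclaim test-run directories whose lifetime has expired.

    Assumes each test run wrote a 'created' marker file containing its
    creation timestamp; runs older than ttl_seconds are treated as
    orphans and removed. Returns the names of the removed directories.
    """
    now = time.time() if now is None else now
    removed = []
    for name in os.listdir(root):
        run_dir = os.path.join(root, name)
        marker = os.path.join(run_dir, "created")
        if not os.path.isdir(run_dir) or not os.path.isfile(marker):
            continue  # not a test-run directory this sweeper manages
        with open(marker) as f:
            created = float(f.read().strip())
        if now - created > ttl_seconds:
            shutil.rmtree(run_dir)  # images directory, cache files, etc.
            # A real sweeper would also DROP the run's orphaned database
            # and purge any cache entries keyed by the run ID here.
            removed.append(name)
    return removed
```

A run that is still inside its lifetime is left alone, so the TTL must be chosen longer than the longest legitimate test run.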

With regard to locking: it is required for the fixed-URL scheme I 
originally described, not for multiplexing a common code base over 
multiple wikis, which is what the wiki family mechanism implements.

My personal view is we should start out simple (as you originally 
suggested) with a set of fixed URLs that are used serially by test runs. 
Implementing this is probably the easiest option and would allow us to 
get something up and running quickly. This approach doesn't require 
significant development, although it does require a way to control access 
to the URLs so test runs don't step on each other. Once we have some 
experience, we can identify any shortcomings of this approach and 
incrementally improve it.
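One simple way to serialize test runs over a fixed URL is an advisory file lock per wiki; a second run blocks until the first releases it. A minimal sketch, assuming a POSIX host and a hypothetical lock-file path:

```python
import fcntl
from contextlib import contextmanager

@contextmanager
def wiki_lock(lock_path):
    """Hold an exclusive advisory lock while a test run uses a fixed wiki.

    flock() blocks until any other holder releases the lock, so test
    runs sharing the same fixed URL proceed strictly one at a time and
    cannot step on each other's db or images directory.
    """
    with open(lock_path, "w") as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            yield
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

Usage would be something like `with wiki_lock("/var/lock/testwiki1.lock"): run_tests()`. One caveat worth noting: flock is released automatically when the process dies, so a crashed run does not wedge the URL, but it also does not clean up its temporary resources, which is exactly the garbage-collection problem above.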

-- 
Dan Nessett


_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
