Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Mark (Markie)
AFAIK there are hands in Amsterdam that can be called upon to do stuff as necessary in the centre, like any other hosting customer, but the need is not quite at the same level as Tampa, due to the size, the servers there, etc. Seoul no longer operates, so this is not an issue. Regards, Mark. On Tue, Feb 24,

Re: [Wikitech-l] Front-end performance optimization

2009-02-25 Thread Sergey Chernyshev
On Tue, Feb 24, 2009 at 7:31 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Feb 24, 2009 at 7:17 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: How do we go about doing this? Can it be tied into the Usability project (

Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Marco Schuster
2009/2/25 John Doe phoenixoverr...@gmail.com: I'd recommend either 10m or 10% of the database, whichever is larger, for new dumps to screen out a majority of the deletions. What are your thoughts on this process, Brion (and the rest of the tech team)? Another idea: If $revision is
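
A minimal sketch of that max(10m, 10%) heuristic, in Python. The message is truncated, so both the unit of the 10m figure and exactly what the threshold gates are assumptions; this sketch reads it as a minimum growth before a new dump run, and all names are illustrative:

from __future__ import annotations

def new_dump_threshold(db_size: int) -> int:
    # Per the proposal: 10 million or 10% of the database,
    # whichever is larger. The unit (revisions vs. bytes) is assumed.
    return max(10_000_000, db_size // 10)

def should_start_new_dump(size_at_last_dump: int, current_size: int) -> bool:
    # Only kick off a new dump once growth since the last run crosses
    # the threshold, screening out most churn from deletions.
    return current_size - size_at_last_dump >= new_dump_threshold(size_at_last_dump)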

Re: [Wikitech-l] Front-end performance optimization

2009-02-25 Thread Michael Dale
Sergey Chernyshev wrote: Yes, of course - I checked it out and that's why I quoted it in my original email. My brief overview made me feel that it wasn't enough. I just didn't want this to be only in the context of localization, as performance is more related to overall user experience than to

Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Platonides
Marco Schuster wrote: Another idea: If $revision is deleted/oversighted/whatever-how-made-invisible, then find out the block ID for the dump so that only this specific block needs to be re-created in the next dump run. Or, better: do not re-create the dump block, but only remove the offending
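
A minimal sketch of that block-repair idea in Python, assuming a hypothetical sorted index of each dump block's first revision ID; nothing here reflects MediaWiki's actual dump layout, and all names are illustrative:

from bisect import bisect_right

def find_block(block_start_revs: list[int], rev_id: int) -> int:
    # block_start_revs holds the first revision ID of each dump block,
    # sorted ascending; return the index of the block containing rev_id.
    return bisect_right(block_start_revs, rev_id) - 1

def mark_dirty(dirty_blocks: set[int], block_start_revs: list[int], hidden_rev: int) -> None:
    # Record that the block holding the now-hidden revision must be
    # rebuilt (or have the offending revision stripped) on the next
    # dump run, instead of regenerating the entire dump.
    dirty_blocks.add(find_block(block_start_revs, hidden_rev))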