On May 13, 2009, at 7:13, Daniel Kinzler <[email protected]> wrote:

>> Now, to be even more useful, database dumps should be produced on
>> *regular* intervals.  That way, we can compare various measures
>> such as article growth, link counts or usage of certain words,
>> without having to introduce the exact dump time in the count.
>
> On a related note: I noticed that the meta-info dumps like
> stub-meta-history.xml.gz etc. appear to be generated from the full
> history dump - and thus fail if the full history dump fails, and get
> delayed if the full history dump gets delayed.

Quite the opposite; the full history dump is generated from the stub skeleton.

-- brion
>
>
> There are a lot of things that can be done with the meta-info alone,
> and it seems that dump should be easy and fast to generate. So I
> propose to generate it from the database directly, instead of making
> it depend on the full history dump, which is slow and the most likely
> to break.
>
> -- daniel
>
> _______________________________________________
> Wikitech-l mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


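Daniel's proposal above - generating the stub (meta-only) dump straight from the database rather than deriving it from the full-history dump - could be sketched roughly as below. This is a simplified illustration, not the actual dump code: the table and column names mirror MediaWiki's `page` and `revision` schema, but the in-memory SQLite database and the `stub_dump` helper are hypothetical stand-ins for the real wiki database and dump tooling.

```python
# Hypothetical sketch: emit stub-history XML (titles and revision
# metadata only, no revision text) directly from database queries,
# so it does not depend on the slow full-history dump.
import sqlite3
from xml.sax.saxutils import escape

def stub_dump(conn):
    """Yield stub-dump XML fragments, one page at a time."""
    cur = conn.cursor()
    for page_id, title in cur.execute(
            "SELECT page_id, page_title FROM page ORDER BY page_id"):
        yield f"<page>\n  <title>{escape(title)}</title>\n  <id>{page_id}</id>"
        rev_cur = conn.cursor()
        for rev_id, ts in rev_cur.execute(
                "SELECT rev_id, rev_timestamp FROM revision "
                "WHERE rev_page = ? ORDER BY rev_id", (page_id,)):
            # Revision metadata only - the <text> element is omitted,
            # which is what makes this a "stub" dump.
            yield (f"  <revision>\n    <id>{rev_id}</id>\n"
                   f"    <timestamp>{escape(ts)}</timestamp>\n  </revision>")
        yield "</page>"

# Demo against an in-memory database with a toy version of the schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page (page_id INTEGER, page_title TEXT);
    CREATE TABLE revision (rev_id INTEGER, rev_page INTEGER,
                           rev_timestamp TEXT);
    INSERT INTO page VALUES (1, 'Main_Page');
    INSERT INTO revision VALUES (10, 1, '2009-05-13T07:13:00Z');
""")
xml = "\n".join(stub_dump(conn))
print(xml)
```

Because the generator streams one page at a time, a tool like this could walk even a large wiki's tables with constant memory, which is part of why a database-direct stub dump should be fast and unlikely to break.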