Thanks Joshua. I intend to try two approaches. The first is to use 
xml2sql and then fill in the rest of the tables from the individual SQL 
dumps of those tables that are already provided. The second is to use 
MWDumper and then import the rest of the tables from the same SQL dumps, 
to see whether there are any differences between the two.
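
Roughly, the plan looks like this (a sketch only; the database name, user, 
and dump filenames below are placeholders, and the exact xml2sql/MWDumper 
options should be double-checked against their documentation):

  # Approach 1: xml2sql writes tab-separated files (page.txt,
  # revision.txt, text.txt) that mysqlimport loads by table name.
  xml2sql enwiki-pages-articles.xml
  mysqlimport -u wikiuser -p --local wikidb page.txt revision.txt text.txt

  # Approach 2: MWDumper streams SQL straight into MySQL.
  java -jar mwdumper.jar --format=sql:1.5 enwiki-pages-articles.xml.bz2 \
    | mysql -u wikiuser -p wikidb

  # Either way, the remaining tables come from the per-table SQL dumps.
  gunzip -c enwiki-latest-categorylinks.sql.gz | mysql -u wikiuser -p wikidb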

Joshua C. Lerner wrote:
>> Thanks for making this attempt. Let me know if your rebuildall.php has 
>> memory issues.
> 
> Seems fine - steady at 2.2% of memory available.
> 
>> This is really getting confusing for me – because there are so many ways – 
>> all of which guaranteed to work – that work, and the one that is recommended 
>> – does not seem to work.
> 
> I think you mean "all of which are *not* guaranteed to work".
> 
>> I would try out your approach too – but it would take time as I only have 
>> one computer to spare.
> 
> If you want, I can just send you a database dump, either now or after
> rebuildall.php finishes. Right now it's refreshing the links table, but
> it has only reached page_id 34,100 out of over 2 million pages. It'll
> be running for days.
> 
> Joshua

Thanks for posting your experience with rebuildall.php. If I cannot 
manage to get this to work, I think I can live with the bad syntax 
that I get.
Thanks again,
O. O.



