Are you using importDump.php or the web interface, i.e. Special:Import? 
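If the web importer is the problem, note that Special:Import is subject to upload-size and PHP execution limits, while importDump.php run from the shell is not. A rough sketch (the paths and filename here are assumptions for illustration):

```shell
# Run from the wiki's maintenance/ directory; CLI PHP has no
# web-request timeout or upload-size cap.
cd /var/www/mediawiki/maintenance

# Import a previously exported XML dump:
php importDump.php /tmp/wiki-full.xml

# Rebuild recent changes and statistics after a large import:
php rebuildrecentchanges.php
```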

Tom

On Aug 9, 2013, at 12:13 PM, "John W. Foster" <[email protected]> wrote:

> So far I've tried dumpBackup.php, and that only gets part of it. It
> has been suggested that it's a PHP script timeout issue, and that's
> possible. It is a large site with over 5000 articles, so the dump will be
> large. I would appreciate any tips on how to do this. I've also looked
> at XCloner, as another poster suggested, and it does not appear to provide
> the functionality I need. It does fine on existing hard pages
> in the /html directory, but it does not seem to be able to pull pages from
> a MediaWiki installation and place them into an .xml file for importing.
> Even something that could break up a backup so that it gets everything
> would help. Dumping the database (MySQL) directly will not work, as that
> is part of the issue: the existing database is somewhat filled with old,
> no-longer-relevant tables that slow it way down. 
> Any tips please?
> Thanks
> John
> 
> 
> _______________________________________________
> MediaWiki-l mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
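For the partial-dump problem described above, running dumpBackup.php from the shell avoids the web-request timeout entirely, and the dump can be split into chunks by page ID so no single run grows too large. A sketch, assuming a typical install path (the ID ranges are illustrative):

```shell
cd /var/www/mediawiki/maintenance

# --current exports only the latest revision of each page,
# which keeps the files much smaller than --full (all history).
php dumpBackup.php --current --start=1    --end=2500 > /tmp/wiki-part1.xml
php dumpBackup.php --current --start=2501 --end=5000 > /tmp/wiki-part2.xml
```

Each part can then be fed to importDump.php (or Special:Import, if small enough) on the new wiki in order.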
