Hi John

dumpBackup.php isn't a full wiki backup; it only exports page text as XML, not images, users, or configuration. 

Sounds like the script is timing out. Did you look at the dump with an XML tool 
to see whether all the articles made it into the file? It could also be a 
timeout on the import side; most hosts enforce a PHP max execution time. 
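A quick way to check the dump without a full XML tool (a sketch; "dump.xml" is a placeholder for your file name, and it relies on MediaWiki writing each <page> element on its own line, which dumpBackup.php does):

```shell
# Count the <page> elements in the dump, then compare the number
# to the article count the wiki reports on Special:Statistics.
grep -c "<page>" dump.xml
```

If the count is well short of the wiki's article count, the dump itself is incomplete and the problem is on the export side, not the import.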

Instead, just do a command-line MySQL dump of the database on your local 
machine, then a command-line SQL restore on the host. That makes backups and 
moving the db easy. 
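A minimal sketch of that, assuming the database is named wikidb with user wikiuser (both placeholders; substitute your own names, and note the target database must already exist on the host):

```shell
# On the local machine: dump the whole wiki database to one SQL file.
mysqldump -u wikiuser -p wikidb > wikidb.sql

# Copy wikidb.sql to the host (e.g. with scp), then restore it there:
mysql -u wikiuser -p wikidb < wikidb.sql
```

Unlike the XML dump, this carries everything in the database: pages, users, logs, and so on. Site files such as images and LocalSettings.php still need to be copied separately.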

If you don't have command-line access, there are other ways to run a full 
backup and move the db. The standalone XCloner v3.3 will back up the website 
files and the db together, or either one on its own. 

Tom

On Aug 6, 2013, at 10:08 PM, "John W. Foster" <[email protected]> wrote:

> I've used this to dump what I hoped would be a complete backup of my
> locally hosted MediaWiki. The purpose was to import the .xml file
> into a new working server. The import script did the job; however,
> the dumped .xml file did not contain all the articles. It uploaded 476
> articles from a site that contains 5391. Just wondering why. I've done it 3
> times with no different results.
> john
> 
> 
> _______________________________________________
> MediaWiki-l mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
