On 15/06/12 19:44, Eric K wrote:
> For most of us, our sites are very important to us and we want to make sure 
> they continue existing. One way is to make sure that data backups exist in 
> multiple places.
> What's the best way to have a publicly downloadable backup, which is simply 
> the present text in all the pages, and perhaps the uploaded files too? I 
> wouldn't want the full database backup because it contains usernames and 
> passwords in the user tables. Additionally it could also be small in size (so 
> earlier versions dont have to be included) and it could be generated by 
> demand so it would be automatic.
> I've seen the XML backup option and I wonder if any of you are doing that 
> with some specific options or doing something else.
> Eric

Yes, XML dumps are what you want.
They don't contain user passwords, deleted pages, etc.
You can choose whether to provide the full history or only the present text.
Generating them on demand... not so much: a dump can take a long time,
depending on the size of your wiki. Your best bet is to set up a nightly
cron job that creates the dump.
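For example, a crontab entry along these lines would do it (paths are
illustrative, adjust them to your install; dumpBackup.php is MediaWiki's
standard maintenance script for XML dumps):

```shell
# Run nightly at 03:00: dump only the current revision of each page
# (--current), compress it, and drop it somewhere publicly downloadable.
0 3 * * * php /var/www/wiki/maintenance/dumpBackup.php --current \
  | gzip > /var/www/wiki/public/pages-current.xml.gz
```

Use --full instead of --current if you ever want the complete revision
history in the public dump.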

PS: Remember that you still need to back up the non-public data yourself.


_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
