How about a script that Googles site:www.myurl.com and then walks every page and copies it? ;)
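A rough sketch of that idea, except asking the wiki itself rather than Google: the standard MediaWiki API can list every title (`list=allpages`), and Special:Export can then return the pages as importable XML. The URLs below are placeholders for the real wiki, and the API paths assume a default install; treat this as a starting point, not a finished tool.

```python
# Sketch: enumerate every page via the MediaWiki API, then build
# Special:Export URLs that return importable XML for each batch.
# WIKI_API and EXPORT_URL are placeholder URLs -- adjust for the real wiki.
import json
import urllib.parse
import urllib.request

WIKI_API = "http://www.myurl.com/api.php"      # assumption: default API path
EXPORT_URL = "http://www.myurl.com/index.php"  # Special:Export entry point

def export_url(titles):
    """Build a Special:Export URL for a batch of page titles."""
    params = {
        "title": "Special:Export",
        "pages": "\n".join(titles),  # Special:Export takes newline-separated titles
        "curonly": "1",              # latest revision of each page only
    }
    return EXPORT_URL + "?" + urllib.parse.urlencode(params)

def all_titles():
    """Yield every page title on the wiki, following API continuation."""
    params = {"action": "query", "list": "allpages",
              "aplimit": "500", "format": "json"}
    while True:
        with urllib.request.urlopen(
                WIKI_API + "?" + urllib.parse.urlencode(params)) as resp:
            data = json.load(resp)
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:  # older wikis use "query-continue" instead
            break
        params.update(data["continue"])

if __name__ == "__main__":
    titles = list(all_titles())
    print(export_url(titles[:50]))  # export the first batch of 50 pages
```

The resulting XML can be fed to importDump.php (or Special:Import) on the new server, which sidesteps the broken database entirely.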
-----Original Message-----
From: John Foster <[email protected]>
To: mediawiki-list <[email protected]>
Cc: MediaWiki announcements and site admin list 
<[email protected]>
Sent: Mon, Oct 21, 2013 4:58 pm
Subject: Re: [MediaWiki-l] Mediawiki article export


On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote: 
> John W. Foster wrote:
> > Is there any way to export ALL the articles and/or pages from a very slow
> > but working MediaWiki? I want to move them to a much faster, upgraded
> > MediaWiki server.
> > I have tried the dumpBackup script in /maintenance, but that didn't get
> > all the pages, only some, and I don't know why. Any tips are appreciated.
> > Thanks
> > john
> >
> If it's the same version of MediaWiki you can always try dumping the
> database directly and importing it into MySQL on the new server. I'm not
> sure, but you might have to create the exact file structure as well....
> 
Thanks.
I am aware of that solution, and in fact it is my preferred method for
moving a wiki. However, the reason this MediaWiki is slow is a totally
messed-up MySQL database, and I don't know how to fix it. I have tried
for over a year, as the wiki has thousands of pages/articles. Therefore
I don't want to carry the table structure of this database over into the
new, properly functioning wiki.
Anything else, maybe?
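For reference, the XML dump/import route via the maintenance scripts looks roughly like this. The paths are placeholders, and this assumes shell access on both servers; since a dump without an explicit mode flag can behave unexpectedly, the flags are spelled out here.

```shell
# On the old (slow) wiki: dump every page as XML.
# --full includes all revisions; --current would keep only the latest.
php maintenance/dumpBackup.php --full > pages.xml

# On the new wiki: import the XML, then rebuild derived tables.
php maintenance/importDump.php pages.xml
php maintenance/rebuildrecentchanges.php
```

Because this goes through the XML layer rather than copying tables, none of the old database's structure comes along for the ride.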


_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l