I'll try to weigh in with a bit of useful information, but it probably  
won't help that much.

You'll need a fairly powerful machine to host even just the current  
revisions of the wiki. Expect the database alone to take tens to  
hundreds of gigabytes for Wikipedia, even when using only the current  
versions.

There are instructions for loading the data; you can find them by  
googling "wikipedia dump".
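For what it's worth, MediaWiki ships with an import script. A rough sketch of the process, assuming a working MediaWiki install with its database already configured (the dump URL and filename here are illustrative; substitute whichever dump you actually need):

```shell
# Fetch the current pages-articles dump for the wiki you want
# (this pattern is for English Wiktionary; adjust as needed).
wget https://dumps.wikimedia.org/enwiktionary/latest/enwiktionary-latest-pages-articles.xml.bz2

# Stream the decompressed XML into MediaWiki's import script.
# Run this from the root of your MediaWiki installation.
bunzip2 -c enwiktionary-latest-pages-articles.xml.bz2 | php maintenance/importDump.php

# Rebuild derived tables after a bulk import.
php maintenance/rebuildrecentchanges.php
```

On a dump of any size, expect this to run for a long time; people often use faster bulk loaders instead, but the pipeline above is the simplest starting point.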

Several others have asked for more information about your goal, and  
I'm going to echo that. The mechanics of hosting this kind of data  
(this volume of data, really) depend heavily on the task at hand.

Data used for academic research would be handled differently than  
data backing a live website, for example.

Nobody likes to be told they can't do something, or to get a bunch of  
useless responses to a request for help. Very sincerely, though,  
unless you find enough information on the dump instruction pages to  
point you in the right direction and are able to ask more specific  
questions, you are in over your head. Your solution at that point  
would be to hire somebody.
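
On the OpenSearch part of the question: once a MediaWiki instance is running, it already exposes an OpenSearch endpoint through api.php, so there is usually nothing extra to set up. A small sketch of querying it, assuming a hypothetical wiki at localhost (the sample response below is illustrative, not real API output; the endpoint returns a four-element JSON array of query, titles, descriptions, and URLs):

```python
import json
from urllib.parse import urlencode

# Hypothetical local wiki; point this at your own api.php.
BASE = "http://localhost/w/api.php"

def opensearch_url(term, limit=10):
    """Build an OpenSearch query URL for a MediaWiki instance."""
    params = {
        "action": "opensearch",
        "search": term,
        "limit": limit,
        "format": "json",
    }
    return BASE + "?" + urlencode(params)

print(opensearch_url("dog"))

# The endpoint responds with: [query, [titles], [descriptions], [urls]].
# Illustrative sample only -- fetch opensearch_url(...) to get a real one.
sample = ('["dog", ["dog", "dogma"], ["", ""], '
          '["http://localhost/wiki/dog", "http://localhost/wiki/dogma"]]')

query, titles, descriptions, urls = json.loads(sample)
print(titles)  # ['dog', 'dogma']
```

Any OpenSearch-aware client (browser search boxes, for instance) can consume that same endpoint directly.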

Sent from my phone,
Jeff

On Jan 27, 2009, at 12:34 PM, Stephen Dunn <swd...@yahoo.com> wrote:

> Hi Folks:
>
> I am a newbie so I apologize if I am asking basic questions. How  
> would I go about hosting wiktionary, allowing search queries via the  
> web using opensearch? I am having trouble finding info on how to set  
> this up. Any assistance is greatly appreciated.
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
