On Tue, Dec 2, 2008 at 9:58 PM, Prasanta Baruah
<[EMAIL PROTECTED]> wrote:
>> We are using the whole dump given by Wikipedia regularly. Plus we are
>>
>
> Downloading Wikipedia regularly? It changes very rapidly.
>
> In Gnoware, if the article is not found locally it goes to the internet,
> downloads it, and stores it in the database for future reference.

The code is fresh out of the oven, so it's still at a fairly early stage.
We do not download the dump regularly; a similar fetch-and-store feature
would help keep it updated.
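The "fetch once, cache locally" behaviour described above could be sketched
roughly as below. This is only an illustration, not Gnoware's actual code;
the table name, schema, and function names are all my own assumptions.

```python
import sqlite3

def make_cache():
    # Hypothetical local store; Gnoware's real schema may differ.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE articles (title TEXT PRIMARY KEY, body TEXT)")
    return db

def get_article(db, title, fetch):
    # Try the local database first.
    row = db.execute(
        "SELECT body FROM articles WHERE title = ?", (title,)
    ).fetchone()
    if row:
        return row[0]
    # Not found locally: fetch (e.g. from Wikipedia) and cache the result
    # so the next lookup is served from the database.
    body = fetch(title)
    db.execute("INSERT INTO articles VALUES (?, ?)", (title, body))
    return body
```

With a scheme like this, the network is only hit on the first request for a
given article; repeat lookups come straight from the database.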

>> doing it for Indian Local Languages as well. And planning to provide
>>
>
> There was also localization for 5 languages, with an interface.

Interesting. How much space does the whole setup take? And if we
include the whole of English Wikipedia?

Kind Regards
Nandeep

_______________________________________________
ilugd mailinglist -- ilugd@lists.linux-delhi.org
http://frodo.hserus.net/mailman/listinfo/ilugd
Archives at: http://news.gmane.org/gmane.user-groups.linux.delhi 
http://www.mail-archive.com/ilugd@lists.linux-delhi.org/
