Hello Lazarus-List,

Friday, January 15, 2010, 8:32:09 PM, you wrote:
>> Well, no, only the text information (the InnoDB table is not needed) and
>> the fields required to build the URL pointing to each page. An export of
>> those tables, even a partial one for testing, is enough.

VS> Maybe you can use:
VS> http://wiki.lazarus.freepascal.org/Special:Export

That's fine for most purposes, but not for this case, as a "search engine" that searches across 20-100 pages is not a great challenge. The problem appears when too many pages share the same terms, which happens in the Laz/fpc wiki: words like "component" appear almost everywhere, so a search like "create new component" will place the right article at position 100-200, while a page like the FAQ shows up at first position almost always, whatever you search for, maybe because its creation date is one of the oldest :-? and it contains almost every searchable term. (A small scoring sketch illustrating this follows at the end of this message.)

As the dump seems not to be possible, no problem at all; I'll try to perform the experiment with other texts. Maybe I can find something like a dump from some other database with many articles, or a bunch of large PDFs about the same topic, and if it works more or less as expected (I'm not sure about this) I'll integrate it into an empty wiki and populate it with 10 articles to verify that it works.

Thank you anyway to everybody.

--
Best regards,
 JoshyFun

--
_______________________________________________
Lazarus mailing list
[email protected]
http://lists.lazarus.freepascal.org/mailman/listinfo/lazarus
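For illustration, here is a minimal Free Pascal sketch of the ranking problem described above and the usual fix, TF-IDF weighting. The three page texts and the query are invented for the example; this is not the wiki's real search code. Raw hit counting lets a page stuffed with a ubiquitous word like "component" win, while weighting each term by how many pages contain it pushes the actual article to the top.

program TfIdfDemo;
{$mode objfpc}{$H+}

uses
  SysUtils, Classes;

// Hypothetical three-page corpus; each string stands for one wiki page.
// "component" appears on every page, as in the real wiki.
const
  Docs: array[0..2] of string = (
    'create new component package',                             // the article we want
    'faq component install component component ide component',  // FAQ-like page
    'component debugger ide windows'
  );
  Query: array[0..2] of string = ('create', 'new', 'component');

// How often Term occurs in the space-separated text Doc.
function TermCount(const Doc, Term: string): Integer;
var
  Words: TStringList;
  i: Integer;
begin
  Result := 0;
  Words := TStringList.Create;
  try
    Words.Delimiter := ' ';
    Words.DelimitedText := Doc;
    for i := 0 to Words.Count - 1 do
      if SameText(Words[i], Term) then
        Inc(Result);
  finally
    Words.Free;
  end;
end;

// Inverse document frequency: a term found on every page gets weight
// ln(1) = 0, so ubiquitous words stop dominating the ranking.
function Idf(const Term: string): Double;
var
  i, DocsWithTerm: Integer;
begin
  DocsWithTerm := 0;
  for i := Low(Docs) to High(Docs) do
    if TermCount(Docs[i], Term) > 0 then
      Inc(DocsWithTerm);
  if DocsWithTerm = 0 then
    Exit(0.0);
  Result := Ln(Length(Docs) / DocsWithTerm);
end;

var
  d, q, RawHits: Integer;
  Score: Double;
begin
  for d := Low(Docs) to High(Docs) do
  begin
    RawHits := 0;
    Score := 0.0;
    for q := Low(Query) to High(Query) do
    begin
      Inc(RawHits, TermCount(Docs[d], Query[q]));
      Score := Score + TermCount(Docs[d], Query[q]) * Idf(Query[q]);
    end;
    // Raw hit counting ranks the FAQ-like page 1 first (4 hits);
    // TF-IDF ranks page 0, the actual article, first.
    WriteLn(Format('page %d: raw hits=%d  tf-idf=%.3f', [d, RawHits, Score]));
  end;
end.

Running it, the FAQ-like page 1 collects the most raw hits (4, all on "component") but scores 0.000 under TF-IDF, since a term present on every page carries zero weight; page 0 wins with about 2.197 thanks to the rarer terms "create" and "new".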
