Hi everyone,
I saw that there are a lot of open source site-search scripts available on
the web, and I'm wondering how exactly they do their job. Apart from
indexing every single word on every single page, how do they rank keywords
by relevance to each page so that the results are more accurate? In other
words, when I search for a particular word, the script presumably lists all
pages that contain that word. What I want to know is: what algorithm
determines the order of the results? The question might still seem too
general, so to put it concretely:
I want a script that parses a given collection of pages and creates
keyword-to-page relationships, assigning a weight to each relationship that
represents how relevant that keyword is to that particular page.
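For what it's worth, one classic weighting scheme for exactly this kind of
keyword-to-page relevance is TF-IDF (term frequency times inverse document
frequency): a word scores high on a page if it occurs often there but
rarely across the rest of the collection. A minimal sketch in Python,
assuming plain-text pages (the function and page names here are
hypothetical, not from any particular search script):

```python
import math
from collections import Counter

def tfidf_index(pages):
    """Build a keyword -> {page_id: weight} map using TF-IDF.

    pages: dict mapping a page identifier to its plain text.
    This is only one common weighting scheme, not the algorithm
    any specific site-search script uses.
    """
    n = len(pages)
    # Term frequency: raw word counts per page.
    tf = {pid: Counter(text.lower().split()) for pid, text in pages.items()}
    # Document frequency: how many pages contain each word.
    df = Counter()
    for counts in tf.values():
        df.update(counts.keys())
    index = {}
    for pid, counts in tf.items():
        total = sum(counts.values())
        for word, c in counts.items():
            # Weight = (relative frequency on this page)
            #        * log(total pages / pages containing the word).
            weight = (c / total) * math.log(n / df[word])
            index.setdefault(word, {})[pid] = weight
    return index

pages = {
    "a.html": "linux search linux index",
    "b.html": "search engine ranking",
}
index = tfidf_index(pages)
# "linux" appears only in a.html, so it gets a positive weight there;
# "search" appears in both pages, so its IDF (and thus weight) is zero.
```

At query time you would look each query word up in the index and sort the
matching pages by the sum of their weights. Real engines refine this with
stemming, stop-word removal, and link or position information, but this is
the core idea behind the relevance numbers.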
Does such a script already exist? I can't seem to find one, and if it
doesn't, I can't even think of a suitable way to implement it myself. Can
someone shed some light on this, please?
Thanks
Navjot Kukreja
_______________________________________________
ilugd mailinglist -- [email protected]
http://frodo.hserus.net/mailman/listinfo/ilugd
Archives at: http://news.gmane.org/gmane.user-groups.linux.delhi 
http://www.mail-archive.com/[email protected]/
