Bob La Quey, Sep 14, 2007 at 12:16:58PM -0700, wrote:

> On 9/14/07, Wade Curry <[EMAIL PROTECTED]> wrote:
> > Andrew Lentvorski([EMAIL PROTECTED])@Fri, Sep 14, 2007 at
> > 01:05:27AM -0700:
> > >
> > > What I really want from a search engine is the "Junk" button
> > > from Thunderbird which is used to help train your spam
> > > filter. When I run a search, I want to be able to classify
> > > sites as "Junk" so that they start dropping in Googlerank for
> > > me.
> >
> > Bayesian filtering of search results -- now this idea sounds
> > useful. It puts power in the users' hands, and would make it
> > /much/ more difficult for any content provider to even /guess/
> > how search results are ranked for any given user, regardless of
> > the choice of search engine.
>
> Yes. I agree with this. It could also be done on the server side
> ... they after all have your input.

I like the idea of having a server do this, but I'm thinking more
along the lines of a local service for a LAN. If search results will
be aggregated from several sources, I'd rather not have that be
controlled by the search providers. A sort of search proxy that runs
on my LAN appeals to me because I'd have more control over what gets
thrown in the faces of my kids.
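To make that a bit more concrete, here is roughly the filtering half
of what I picture such a proxy doing -- a toy naive-Bayes "junk"
scorer over result snippets. The URLs and snippets below are made up
and nothing here talks to a real search engine; it's just a sketch of
the idea:

import math
from collections import defaultdict

class JunkFilter:
    def __init__(self):
        # Word counts for results the user has marked junk or good.
        self.counts = {"junk": defaultdict(int), "good": defaultdict(int)}
        self.totals = {"junk": 0, "good": 0}

    def train(self, text, label):
        """Record word counts for a result the user labeled junk/good."""
        for word in text.lower().split():
            self.counts[label][word] += 1
            self.totals[label] += 1

    def junk_score(self, text):
        """Log-odds that this text looks like junk (higher = junkier)."""
        score = 0.0
        for word in text.lower().split():
            # Crude add-one smoothing so unseen words don't zero things out.
            p_junk = (self.counts["junk"][word] + 1.0) / (self.totals["junk"] + 2.0)
            p_good = (self.counts["good"][word] + 1.0) / (self.totals["good"] + 2.0)
            score += math.log(p_junk / p_good)
        return score

    def rerank(self, results):
        """Sort (url, snippet) pairs so the junkiest sink to the bottom."""
        return sorted(results, key=lambda r: self.junk_score(r[1]))

if __name__ == "__main__":
    f = JunkFilter()
    # Pretend the user hit a "Junk" button on a couple of results:
    f.train("cheap pills casino win big now", "junk")
    f.train("kernel scheduler documentation and patches", "good")
    results = [
        ("http://example.com/spammy", "win big cheap casino pills"),
        ("http://example.org/useful", "patches for the kernel scheduler"),
    ]
    for url, snippet in f.rerank(results):
        print(url)

The proxy half would then just fetch results from the various
engines, run the snippets through something like junk_score(), and
sort them before the browser ever sees them.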
> In fact I have had this very idea under active discussion for
> several months now with my buddy Brad Collins, who is deeply into
> this and the more general problem of metadata for the web and
> beyond.
>
> Note that to some extent the social bookmarking sites like
> del.icio.us are starting to provide a mechanism for alternative
> ratings of sites. This data could easily be gathered by search
> engines and used as yet another ranking mechanism.

I've heard del.icio.us mentioned numerous times, but have never tried
it. Seems that if you aggregate results, the rankings of *any* of the
contributing engines could be used that way, especially if a proxy
did that work.

Wade Curry
syntaxman

--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
