Matt,

Does a PC become more vulnerable to viruses, worms, Trojan horses, rootkits,
and other web attacks if it becomes part of a P2P network? And if so, why,
and by how much?

Ed Porter

-----Original Message-----
From: Matt Mahoney [mailto:[EMAIL PROTECTED] 
Sent: Thursday, December 06, 2007 3:01 PM
To: agi@v2.listbox.com
Subject: RE: Distributed search (was RE: Hacker intelligence level [WAS Re:
[agi] Funding AGI research])


--- Ed Porter <[EMAIL PROTECTED]> wrote:

> I have a lot of respect for Google, but I don't like monopolies, whether it
> is Microsoft or Google.  I think it is vitally important that there be
> several viable search competitors.
> 
> I wish this wiki one luck.  As I said, it sounds a lot like your idea.

Partly.  The main difference is that I am also proposing a message posting
service, where messages become instantly searchable and are also directed to
persistent queries.

Wikia has a big hurdle to get over.  People will ask "how is this better than
Google?" before they bother to download the software.  For example, Grub
(distributed spider) uses a lot of bandwidth and disk without providing much
direct benefit to the user.  The major benefit of Wikia seems to be that users
provide feedback on the relevance of query responses, which in theory ought to
provide a better ranking algorithm than something like Google's PageRank.  But
assuming they get enough users to reach this level, spammers could still game
the system by flooding the network with high rankings for their websites.

In a distributed message posting service, each peer would have its own policy
regarding which messages to relay, keep in its cache, or ignore.  If a
document is valuable, then lots of peers would keep a copy.  A client could
then rank query responses by the number of copies received, weighted by each
peer's reputation.  Spammers could try to game the system by adding lots of
peers and flooding the network with advertising, but this would fail because
most other peers would be configured to ignore peers that don't provide
reciprocal services by routing their own outgoing messages.  Any peer not so
configured would quickly be abused and isolated from the network, in the same
way that open relay SMTP servers get abused by spammers and blacklisted by
spam filters.
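A minimal sketch of that ranking rule in Python (the peer names, reputation
values, and function names here are my own illustration, not part of any
specified protocol):

```python
# Rank query responses: each response is a (doc_hash, peer_id) pair, and a
# document's score is the number of peers returning a copy, weighted by each
# responding peer's reputation (all values below are illustrative).
from collections import defaultdict

def rank_responses(responses, reputation):
    """responses: list of (doc_hash, peer_id) pairs received for one query.
    reputation: dict mapping peer_id -> trust weight in [0, 1].
    Returns doc hashes sorted from highest to lowest weighted copy count."""
    scores = defaultdict(float)
    for doc_hash, peer_id in responses:
        scores[doc_hash] += reputation.get(peer_id, 0.0)  # unknown peers count 0
    return sorted(scores, key=scores.get, reverse=True)

responses = [("doc_a", "peer1"), ("doc_a", "peer2"), ("doc_b", "peer3")]
reputation = {"peer1": 0.9, "peer2": 0.8, "peer3": 0.5}
print(rank_responses(responses, reputation))  # doc_a outranks doc_b
```

Note that a spammer flooding copies from low-reputation peers adds little to
the score, which is the point of the weighting.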

Of course a message posting service would have a big hurdle too.  Initially,
the service would have to be well integrated with the existing Internet.
Client queries would have to go to the major search engines, and there would
have to be websites set up as peers without the user having to install
software.  Most computers are not configured to run as servers (dynamic IP,
behind firewalls, slow upload, etc.), so peers will probably need to allow
message passing over client HTTP (website polling), by email, and over
instant messaging protocols.

File sharing networks became popular because they offered a service not
available elsewhere (free music).  But I don't intend for the message posting
service to be used to evade copyright or censorship (although it probably
could be).  The protocol requires that the message's originator and all
intermediate routers be identified by a reply address and time stamp.  It
won't work otherwise.
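A rough sketch of what one such message record might look like, in Python.
The field names and dict structure are my own illustration; the only
requirement stated above is that the originator and each intermediate router
be identified by a reply address and time stamp:

```python
# Illustrative message record for the posting service: the originator and
# every intermediate router append an identifying (reply_address, timestamp)
# entry, so the full forwarding path stays attributable.
import time

def make_message(body, reply_address):
    """Create a new message stamped with its originator's identity."""
    return {"body": body,
            "route": [(reply_address, time.time())]}  # originator is first hop

def relay(message, router_address):
    """A router appends its own reply address and time stamp before forwarding."""
    message["route"].append((router_address, time.time()))
    return message

msg = make_message("persistent query: distributed search", "originator@example.org")
msg = relay(msg, "router1@example.net")
print([addr for addr, _ in msg["route"]])  # originator first, then each router
```

A peer receiving this record can see exactly who originated and relayed it,
which is what makes reciprocity policies and blacklisting enforceable.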


-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;
