While I think that the commentary on NNTP is essentially correct -- we are
rebuilding much of the foundation of NNTP -- I want to raise a couple of
issues that may explain just WHY such a rebuilding is necessary.

I started working with NNTP back in 1992 ... it was in fact my first
experience with the Internet, prior in fact to HTTP/HTML. At the time, NNTP
was small, largely free, was run by a coterie of competent amateurs for the
love of the medium and was very much devoted to handling the issues
associated with maintaining threading across the boundaries of the nascent
Internet. 

In a decade, the threats that have been lingering at the edge of e-mail have
pretty much devoured NNTP. NNTP is difficult to moderate, difficult to
search, difficult to archive, difficult to set up. If you want NNTP access,
you often have to pay extra from your ISP, and there is no guarantee that
the newsgroups that YOU need are going to be available via the server. The
bulk of material circulating on Usenet is porn spam, sent not by legitimate
users of the servers but by companies that seem to feel that extreme (and
typically disgusting) acts of sexual display will drive people to their
sites. The high volume and poor archiving formats also ensure that
newsgroups are short-memory archives at best. Finally, the role of the web
has changed enough that most people are simply not aware that Usenet exists,
even in those cases where it is available.

Contrast that to what's going on with the current Drupal modules and RSS
syndication. I've written chapters in a couple of books on RSS, and
consequently have had a lot of chance to think about what exactly this
medium is. RSS is significant in that it provides a way to aggregate links
and associate that aggregation with some form of editorial filtering and
annotation. Why is that important? In great part because it is a function
which currently is not done very well within the confines of web pages. Many
web pages contain links and editorial commentary on those links, but in most
cases that information is not easily filterable, relies on webmasters
keeping their link pages current (something that seldom happens in
practice), and cannot be merged with other sources to form a larger
aggregated stream. In other words, the meta-content that web pages are able
to offer is far less than what RSS can provide.
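That merging step is what the web-page model lacks. As a rough sketch of
what RSS makes possible (the feed contents and URLs below are purely
illustrative, and a real aggregator would fetch feeds over HTTP rather than
use inline strings), two feeds can be parsed, pooled, and sorted into a
single stream by publication date:

```python
# Sketch: merge two RSS 2.0 feeds into one date-ordered stream.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

FEED_A = """<rss version="2.0"><channel><title>Site A</title>
<item><title>First</title><link>http://a.example/1</link>
<pubDate>Sun, 27 Jul 2003 10:00:00 GMT</pubDate></item>
</channel></rss>"""

FEED_B = """<rss version="2.0"><channel><title>Site B</title>
<item><title>Second</title><link>http://b.example/1</link>
<pubDate>Sun, 27 Jul 2003 12:00:00 GMT</pubDate></item>
</channel></rss>"""

def items(feed_xml):
    """Yield (date, title, link) tuples from an RSS 2.0 document."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        yield (parsedate_to_datetime(item.findtext("pubDate")),
               item.findtext("title"),
               item.findtext("link"))

# Pool both feeds and sort by pubDate, newest first.
merged = sorted(list(items(FEED_A)) + list(items(FEED_B)), reverse=True)
print([title for _, title, _ in merged])  # prints ['Second', 'First']
```

The point is that once the metadata is machine-readable, aggregation across
arbitrarily many sites is a sort, not a screen-scrape.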

A Drupal node can be thought of as a distributor of RSS feeds of varying
types, which may or may not also be a transport mechanism for content
itself. In most cases, RSS is most efficient when the only payload it
carries is abstracts of content and links, perhaps with enough metadata,
such as production dates and authors, to allow verification systems to work
effectively.
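Such a lightweight payload might look like the following RSS 2.0 item
(titles, URLs, and addresses here are purely illustrative): a title and
link, a short abstract in the description, and just enough metadata
(pubDate, author) for verification to work:

```xml
<item>
  <title>Rebuilding NNTP's foundation</title>
  <link>http://example.org/node/42</link>
  <description>A short abstract of the article, not the full content.</description>
  <pubDate>Sun, 27 Jul 2003 21:32:01 GMT</pubDate>
  <author>editor@example.org (Site Editor)</author>
</item>
```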

RSS abstracts and categories can be archived and persisted, and can be
formatted in any number of different ways with relatively little work;
because of its XML base, RSS also works well in web services environments.
You don't need a specialized server to use it, which isn't true of NNTP, and
you don't have to go through a community process to create a new newsgroup,
which minimizes the alt.* phenomenon.

-- Kurt


On Sun, Jul 27, 2003 at 09:32:01PM -0700, Ka-Ping Yee wrote:
> On Sun, 27 Jul 2003, Jay R. Ashworth wrote:
> > NNTP.
> 
> You do realize that what we are doing is rebuilding much of
> what NNTP is supposed to do, don't you?

Of course I do.  That's precisely why I recommended you use the
infrastructure and tools already extant.

> That's slightly tongue-in-cheek -- but only slightly.  Multiple
> sites aggregating articles, sharing articles with each other,
> updating each other on new posts: it's been done, and it's called
> Usenet.  Of course we're adding user authentication, nice graphics,
> and more structured data -- but it's worth noting that Usenet
> didn't work by having every site poll every other site for updates.
> 
> Just something to think about.

And it's *also* worth noting that it's *miserable* -- I mean *REALLY REALLY*
painful -- to follow more than about 4 web forums, run on different sites,
hosted by different software packages, with different command structures,
and different signons.

Stipulated, some percentage of the crowd will *only* ever go here...

but I'm inclined to think that's a smaller percentage than might seem
obvious... and that the proper solution is to build a web-based NNTP client
front end and use the already existent infrastructure which is tuned for
that, instead of rebuilding the wheel.

MIME is not real popular on traditional Usenet, but no reason you can't use
it in a custom implementation on top...

Cheers,
-- jra
-- 
Jay R. Ashworth
[EMAIL PROTECTED]
Member of the Technical Staff     Baylink                        RFC 2100
The Suncoast Freenet         The Things I Think
Tampa Bay, Florida        http://baylink.pitas.com        +1 727 647 1274

   OS X: Because making Unix user-friendly was easier than debugging Windows
        -- Simon Slavin, on a.f.c
