That's an extremely salient perspective. Data by itself is nonsense
without reference frameworks. Data -> information -> knowledge. It's the
transition interfaces that are vital to the user, such that the interface
mechanisms are transparent.

I see the W3C as an aggregate experience born of necessity. I have neither
the mental agility, the luxury of focus, nor - importantly - the financial
comfort cushion with which to pursue pure vision to a goal, which is why I will
trust my more fortunate and adept peers to guide and set global standards --
which I will adopt in good faith.

The need to rationalise a coherent, global information interchange mechanism
has, I believe, been largely addressed by W3C and X(HT)ML (SOAP excluded).
Boy, do I wish such standards had been more than merely emergent in 1996,
when I had to drive and develop a 1/4 billion forecasting system, viable and
proven across mainframe, PC, cellphone and laptop environments. My solution:
CSV - comma-separated values files.
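The appeal of CSV as a lowest-common-denominator interchange format is easy to sketch. The rows and field names below are invented for illustration; the point is that nothing beyond a plain-text parser is needed on any platform:

```python
import csv
import io

# Hypothetical forecast rows; the fields are invented for illustration.
rows = [
    {"region": "EMEA", "period": "1996-Q3", "forecast": "1250000"},
    {"region": "APAC", "period": "1996-Q3", "forecast": "980000"},
]

# Write to a plain-text CSV buffer that any platform can parse.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["region", "period", "forecast"])
writer.writeheader()
writer.writerows(rows)
data = buf.getvalue()

# Read it back with nothing more exotic than the csv module.
parsed = list(csv.DictReader(io.StringIO(data)))
print(parsed == rows)  # True
```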

For all of our discussion of standards and interpretation of the letter of
the law, we still must deliver cross-spectrum applications to disparate
hardware and software.

The minutiae are very interesting; but my clients' eyes glaze over.

Mike Pepper
(knackered) Accessible Web Developer
www.seowebsitepromotion.com

Administrator
www.gawds.org

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Trusz, Andrew
Sent: 08 July 2004 19:25
To: '[EMAIL PROTECTED]'
Subject: RE: [WSG]headers




-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Lee Roberts
Sent: Thursday, July 08, 2004 12:10 PM
To: [EMAIL PROTECTED]
Subject: RE: [WSG]headers

Let's look at the Introduction to the Semantic Web.

[quote] Facilities to put machine-understandable data on the Web are
becoming a high priority for many communities. The Web can reach its full
potential only if it becomes a place where data can be shared and processed
by automated tools as well as by people. For the Web to scale, tomorrow's
programs must be able to share and process data even when these programs
have been designed totally independently. The Semantic Web is a vision: the
idea of having data on the web defined and linked in a way that it can be
used by machines not just for display purposes, but for automation,
integration and reuse of data across various applications.[/quote]

Now, let us examine the last sentence of that quote.  [quote]The Semantic
Web is a vision: the idea of having data on the web defined and linked in a
way that it can be used by machines not just for display purposes, but for
automation, integration and reuse of data across various
applications.[/quote]
===========================================================================

How about we look at the second sentence of the first paragraph:

"The Web can reach its full potential only if it becomes a place where data
can be shared and processed by automated tools as well as by people."

You insist on making this about machines, when even the W3C - which is
primarily concerned with making the machine end of it work - keeps
inserting people. Yes, the machines and applications will process data, writ
large, but they will do so as a result of the value chains and proofs
requested by the humans.


More Lee:
Prior to RDF, XML and the like, it was virtually impossible to share
information across platforms and applications.  Well, it was not exactly
impossible; it was more a security risk.  So now we have the Semantic Web,
which allows a shopping cart owner to send an XML feed to Froogle.  Or we
have RSS, which allows us to share news feeds between news sources.  Even
weblogs offer RSS feeds now.

All that joined together allows computers to use the same information for
various applications.  Even business data can be shared without the concern
that the database would be hacked and confidential information released.
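The RSS sharing described above is ordinary XML under the hood. As a sketch, the hypothetical feed below can be consumed with nothing but the standard library:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document; the feed content is invented for illustration.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://example.com/</link>
    <description>Demo feed</description>
    <item><title>Standards update</title><link>http://example.com/1</link></item>
    <item><title>Accessibility notes</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

# Any consumer that speaks XML can pull out the headlines.
root = ET.fromstring(feed)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['Standards update', 'Accessibility notes']
```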

=====================================================================

We'll have a Semantic Web which allows the shopping cart user to check the
bona fides of the merchant and the reliability of the product using RDF and
XML, perhaps rendered in XML, HTML, or XHTML. And we can check other
"proofs" from self-selected "trusted sources" to evaluate the content of an
RSS news feed. It isn't just about shuffling data; it's about evaluating the
data, giving it human-related meaning. It is about humans using an
effective, efficient tool which employs common taxonomies and inference
rules to make an effective ontology.
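The taxonomy-plus-inference-rules idea can be illustrated with a toy triple store. This is a sketch of the concept, not real RDF tooling; the class names and the single transitivity rule are invented for illustration:

```python
# Toy triple store: (subject, predicate, object) facts, plus one inference
# rule (transitivity of "subClassOf"), sketching how a shared taxonomy
# lets a machine derive facts that were never stated directly.
triples = {
    ("Merchant", "subClassOf", "Organisation"),
    ("Organisation", "subClassOf", "Agent"),
    ("AcmeCarts", "type", "Merchant"),
}

def infer(facts):
    """Apply the transitive subClassOf rule until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(facts):
            for (c, p2, d) in list(facts):
                if p1 == p2 == "subClassOf" and b == c:
                    new = (a, "subClassOf", d)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

closed = infer(triples)
# The machine now "knows" a Merchant is an Agent, though no one said so.
print(("Merchant", "subClassOf", "Agent") in closed)  # True
```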

Data has no meaning without interpretation. Make a pile of data: it does
nothing. It says nothing. It's inert until it's interpreted. The Semantic
Web both gets the data based on shared rules and then possibly applies
additional human-chosen interpretive filters. It is, to use an overworked
and usually misapplied word, a synergistic process. But then, many things
involving people have unanticipated results.

drew


*****************************************************
The discussion list for http://webstandardsgroup.org/
See http://webstandardsgroup.org/mail/guidelines.cfm
for some hints on posting to the list & getting help
*****************************************************
