[us] for UniverSal, not specific to France or United Kingdom :-).

I am very appreciative of the ongoing substantive discussion on this
topic. I really think we are getting somewhere new and hopefully useful.

Specifics about ontologies below.

On Sun, 13 May 2001, philippe Ameline wrote:
...
> Now ontologies :
>
> As I already said, genuine table representation uses lots of implicit
> meaning, while ontologies are a way to express things with explicit meaning
> (a kind of "controlled language").

I agree. However, what I am attempting to understand is whether ontologies
1) need to have a "global" scope and 2) need to be "pre-defined".

Relating to human languages, both 1) and 2) above are clearly untrue. If
you don't agree with this point, please let me know and I will be glad to
elaborate with examples.

> I said that it was possible to export a table representation into "ontological
> language":
>
> field : weight
> value : 70
>
> becomes (Odyssee tree representation)
>
> "patient's weight"
>           + 70 "kg"
>
> Usually, you cannot reverse it (that is to say, import a tree into a table),
> since the tree has no deterministic structure, and with the simple field = value
> approach, it is a nightmare to express:
>
> "patient's weight"
>          + "normal"
>
> or
>
> "patient's weight"
>          + "before"
>           |       + "regimen"
>           |                + 70 "kg"
>           + "after"
>                   + "regimen"
>                             + 65 "kg"
>
> or
>
> "patient's weight"
>           + 2810 "g"
>
> Andrew seems to manage "normal"
>
> > 1) rename - weight:x "kg" + weight:{normal,abnormal} -> weight:x "kg",
> > weight_in_range:{normal,abnormal}
> >
> > 2) recode - weight:x "kg" + weight:{normal,abnormal} -> weight:x "kg"
> > where weight=70 if weight=normal.
>
> but you have to do it by hand, for each value.

Indeed, it is a "brute-force" approach to documenting ontologies. However,
these "rules" can be re-used, leading to some automation. Note that the
prevailing implementations of ontologies (including your own approach)
cannot be fully automatic either. In addition, you spend much time writing
a "dictionary" that will almost certainly be out of date the moment you
create it.
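To make the reuse concrete, here is a minimal sketch of the "rename" and
"recode" rules quoted above as reusable functions. All names here (rename,
recode, the example records) are illustrative assumptions, not part of OIO
itself:

```python
# Hypothetical sketch: the "rename" and "recode" rules as reusable
# transformations over simple field = value records.

def rename(record, old_field, new_field, when=lambda v: True):
    """Move a value to a new field name when the predicate matches."""
    out = dict(record)
    if old_field in out and when(out[old_field]):
        out[new_field] = out.pop(old_field)
    return out

def recode(record, field, mapping):
    """Replace coded values (e.g. 'normal') with concrete ones."""
    out = dict(record)
    if out.get(field) in mapping:
        out[field] = mapping[out[field]]
    return out

# Rule 1: weight:"normal" becomes weight_in_range:"normal"
r1 = rename({"weight": "normal"}, "weight", "weight_in_range",
            when=lambda v: v in ("normal", "abnormal"))
# Rule 2: recode weight="normal" to a representative 70 kg
r2 = recode({"weight": "normal"}, "weight", {"normal": 70})

print(r1)  # {'weight_in_range': 'normal'}
print(r2)  # {'weight': 70}
```

The point is that each rule is written by hand once, but applies to every
record that matches it thereafter.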

Remember, languages, semantics, and ontologies are invented to serve a
communication function.  New terms and new meanings for existing terms
cannot be avoided. Communications are always local in nature (from the
transmitter to the recipient). This is the theoretical basis for the
simplified approach to ontologies that OIO uses.

> You need a local
> administrator.

No, you don't need an "administrator" at all. You need a transmitter (who
speaks) and a recipient (who listens).

> And if someone says "Patient's weight" : "average" ?

The recipient says, "what do you mean by average?" :-) I would argue that
you need more than traditional ontologies to understand what "average
weight" means. For example, you *may* need to know who measured it, how
many raw values were averaged, over what period of time, with or without
shoes, etc. This open-ended list of possible requirements varies from term
to term and from application to application.

I am not saying that traditional tree- or network-based approaches do not
serve any purpose - I am merely pointing out that "manual over-ride" by
human operators is still necessary and perhaps typical.
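For illustration, the "before/after regimen" tree quoted earlier is trivial
to hold as a nested structure, while a flat field = value table can only
approximate it with hand-made composite keys (the field names below are my
own hypothetical inventions):

```python
# Philippe's "before/after regimen" weight example as a nested structure.
patient_weight = {
    "patient's weight": {
        "before": {"regimen": (70, "kg")},
        "after":  {"regimen": (65, "kg")},
    }
}

# Flattening back to field = value forces lossy, ad-hoc composite keys
# (hypothetical names) - the nesting itself is not recoverable from them.
flat = {
    "weight_before_regimen_kg": 70,
    "weight_after_regimen_kg": 65,
}
```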

> My point of view is that if you want to enter the knowledge management
> arena, you will sooner or later use ontologies.

Philippe, as I said before, I am not against ontologies - just how they
are constructed.

> Why ?
>
> Because Odyssee and GEHR fixed the language

I am not sure GEHR fixed the language. Its flexible and extensible
Archetypes, and the translators/mappings between Archetypes, appear to be
identical to OIO's approach.

> (ontology + tree for Odyssee),
> and still have lots of work having users express valid documents (GEHR's
> archetypes, Odyssee's "Fils guides") and then building automatic interpretation.

Thomas stated that hierarchical organizations will be used to automate
GEHR Archetype mapping whenever possible. However, he also admitted that
one-archetype-to-another-archetype translators will be used as well.

Regarding "validation" of archetypes, I believe the requirement is that
they conform to the GEHR object model (GOM) - which is a syntactical
constraint and not a semantic constraint. The GOM is equivalent to OIO's
metamodel (e.g. forms, reports, patients).

> If you build the language "by speaking", I am afraid:
> 1) You need a whole staff of full-time cogniticians

This is where the "flexible scoping" concept makes the solution practical.
When merging data collected using two forms, one need not worry about the
other 5,000 forms in the system. The practicality of the approach comes
from the "just-in-time" and "as-needed" construction of translators.

Cogniticians are not needed at all. (What are they supposed to do anyway
:-)?

> 2) Otherwise you have a very, very long way to go before you have a genuine
> knowledge management system, rather than a "locally coherent system of forms".

Yes, it may be a long way before the system becomes a fully automated
knowledge repository. However, just as a child needs to learn one word and
one ontologic association at a time, I am afraid "force feeding" a system
through fancy ontological short-cuts has so far yielded very limited
systems that consume lots of feeding upfront and are not maintainable over
time.

> Important ?
>
> Yes, because while it is possible to computerize the knowledge of a company
> with a local system, medicine demands global knowledge systems.

This I agree with. We have different approaches to building this global
knowledge system. I take the piece-by-piece, one association by one
association approach while you want to begin with a comprehensive
ontological tree.

> And to be
> perfectly honest, this is the prime reason why Odyssee became open source.

There needs to be a way to weave "new" terms and meanings into your global
tree. The OIO system gives individual users the freedom to define their
own terms and use them as they wish for their local tasks. These local
pieces are woven together as necessary for data merging and
communications. I believe this is also the approach that GEHR is taking.

The limitations of top-down approaches based on constructs such as global
data dictionaries and ontologies are well known. That is why I am very
excited to read Thomas' paper and see that GEHR is also taking a bottom-up
approach to building a knowledge repository.

...

> PS : I am starting 3 open source projects within Odyssee:
> 1) A "real time" epidemiology web server using various classifications
> (mail -> SQL -> stat module -> web pages)
> 2) A piping tool (win32) to capture data from patient record software (using
> Windows spying) in order to feed the web server
> 3) The porting of Odyssee to Linux
> Genuine web pages soon; your comments are welcome.

Looking forward to reviewing them!

Best wishes,

Andrew
---
Andrew P. Ho, M.D.
OIO: Open Infrastructure for Outcomes
TxOutcome.Org (hosting OIO Library #1)
Assistant Clinical Professor
Department of Psychiatry, Harbor-UCLA Medical Center
University of California, Los Angeles
