** Feel free to remove that gosh-awful subject line **
The threads seem to be merging, and that's a GoodThing(tm).
Joseph's post appears to be an invitation to provide a kind of
project summary regarding the issue of 'not operating in a
vacuum'.
Background: (Caution! US centric viewpoint)
This very broad subject depends somewhat on the point of
view of the participant.
....
The more we massaged the ERDs, the bigger the holes got. No
wonder these systems cost so much; this medical stuff is hard to
model! Now, I had poked around OO design stuff for a few years,
but it was VERY strange. It just didn't seem to be very practical.
Besides, you had to store the objects somewhere, usually in an
RDBMS. I read a little more about object persistence and
OODBMSs. When I read a mention of Python and Bobobase, it led
me to a product called Principia, which the company was combining
into what we now know as Zope. I saw some pretty impressive web
sites that Digital Creations had built with this stuff but every
time I tried to use it, it broke in very strange ways.
Persistence (pun intended) paid off.
The Zope/Python/C combination is powerful!
I'm a little puzzled why you found the OO design stuff VERY strange.
To me, it is a paradigm that makes computing much cleaner and clearer
than a hodgepodge of procedures and data. I understand
that Zope/Python/C, etc. is very powerful, although I find the documentation
for Zope to be rather hard to follow. I couldn't find the on-line tutorial.
....
What does this have to do with inter-operability? Gambling &
flexibility. I admire the work done by OpenEMed. Two years ago I
was certain that CORBAmed was the solution. I still think it
will play a KEY role in the solution to the world-wide federated
electronic medical record system. I'm just not so sure it'll
happen in my lifetime. (It's a political issue, not a technology
issue.) When that WW-EMR is unfurled, it will be passing
archetyped data, because that will already be the proven
inter-application communication methodology.
You may be right, but I'm convinced the principles behind what
is being done in the OMG HDTF will have to be done if we
are to have interoperability. The implementation may be in other
(less efficient, IMHO) technologies, but the process that has
been going on there has to happen. Otherwise, 20 years from
now we will still have a bunch of islands that can't talk to one another.
XML isn't the solution, it is just a technology that people may
choose to use to implement the approach. It is actually
slowing things down now, because people are off reinventing
things that have been around for 5-10 years. The other factor
is that many businesses don't think it is a good idea to have
interoperability, because this will make it easier on the end user and
lower their profits. Using XML will increase the profits of the memory,
disk, OS, and networking folks because of all the gratuitous parsing
that has to go on.
Right now, FreePM can directly exchange data with FreePM. How
exciting!? But the infrastructure is designed so that it can
pretty easily exchange data with every RDBMS I'm aware of
(except native DB2, and we are working on that one internally) and
anything with an ODBC interface. FreePM can export XML files and
import them as a DOM tree. There is no XML-EMR DTD defined to
export to, yet. Can we all concede that the WW-EMR is a few
years away? Then, every EMR needs a home. Not only for technical
reasons, but privacy ones as well. Although a case COULD be made
for privacy through distribution I suppose <g>.
OpenEMed can do those things too, but by themselves they have little
to do with interoperability, beyond the convenience of being able to
run the system in many different environments.
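For what it's worth, the XML-export/DOM-import round trip mentioned above could look roughly like this. This is only a sketch: since no XML-EMR DTD exists yet, the element names (Encounter, Patient, Note) and attributes here are invented for illustration.

```python
# Sketch of exporting an encounter as XML from one system and
# importing it as a DOM tree in another. All names are hypothetical.
from xml.dom import minidom

# Pretend this string was exported by one FreePM instance...
exported = """<?xml version="1.0"?>
<Encounter id="e42">
  <Patient mrn="123456"/>
  <Note>Follow-up visit; BP stable.</Note>
</Encounter>"""

# ...and imported by another as a DOM tree.
dom = minidom.parseString(exported)
encounter = dom.documentElement
note = encounter.getElementsByTagName("Note")[0]
print(encounter.tagName)             # Encounter
print(encounter.getAttribute("id"))  # e42
print(note.firstChild.data)          # Follow-up visit; BP stable.
```

Once a shared DTD exists, both ends could validate against it instead of trusting ad hoc element names.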
I believe 'that home' should be on a server in the primary care
physician's office. FreePM provides the ability to grant external
logins via an http (and other protocols) connection. This can be
the patient with access to only their EMR. Or a specialist that
you often refer patients to can be granted access to all of those
records. If you choose, it could be read only access.
I think you have made the case for a virtual patient record, since
the specialist will be generating information on the patient, too,
and unless they can be pulled together, the knowledge of
the patient's condition and treatment procedure will suffer.
Tell me, which is the more privacy-aware and cost-efficient
situation?
(a) The specialist logs in to review the patient record and
adds an encounter note.
(b) You give the record to a clerk to fax pages to the
specialist's office, where a clerk (hopefully) picks them up and
passes them to your nurse for you to review. Hopefully you sent
all the info the specialist needs. Then the specialist dictates a
note. Once it's transcribed, they give it to a clerk to fax to
the clerk in your office, where it may or may not get filed
promptly and correctly after you review it.
(c) Both offices have FreePM, and the specialist exports the
Encounter and emails it to you after reviewing the EMR on your
server. You then import the encounter into the patient record.
BTW: imported encounters do not affect the patient account, but
they do become part of the history.
At least initially, specialists will continue to maintain their
separate medical records system. But, in the ...
What you describe is very labor-intensive and prone to error. It is
also completely opaque to the patient, who might like to review his
medical record. And what about duplicate information?
Future:
All patient data will adhere to GOM_v15, and encrypted archetype
packets will be sent to GOM routers that can resolve the
destination URL and validate the transmitting authority. I would
expect each application to perform a caching function, and
particular archetypes might carry their own persistence level. So
a specialist might maintain patient data for three months, or two
years, or ... Then, if they needed to refer to that data after it
had expired, they could retrieve it from the master patient
record.
Something like that. The ability to resolve destinations and validate
transmitting authority, locate information, etc., is available today.
The standard archetypes are not.
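The per-archetype persistence level sketched above might amount to something like the following. GOM_v15 is speculative, so every name here (ArchetypePacket, ttl_days, etc.) is invented purely to illustrate the caching-with-expiry idea.

```python
# Hypothetical sketch: an archetype packet carries its own persistence
# level; after it expires locally, the data would be refetched from the
# master patient record. All names are invented for illustration.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ArchetypePacket:
    archetype_id: str   # e.g. a blood-pressure observation archetype
    received: date      # when the local cache got this packet
    ttl_days: int       # persistence level carried by the archetype

    def expired(self, today: date) -> bool:
        return today > self.received + timedelta(days=self.ttl_days)

pkt = ArchetypePacket("blood_pressure", date(2001, 1, 1), ttl_days=90)
print(pkt.expired(date(2001, 6, 1)))  # True  -> fetch from master record
print(pkt.expired(date(2001, 2, 1)))  # False -> still cached locally
```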
Summary:
Now I know that our resident security specialists (meant with the
utmost respect, BTW) are having a coronary right about now. We
accept SSL, with all of its vulnerabilities, as a secure
electronic communications standard. Balancing all of the
trade-offs, this has to be GES (Good Enough Security). Granted,
we may create a whole new class of criminal (the EMR-Vandal, who
steals EMR info and markets it to the highest bidder or uses it
for blackmail). But look at the clerk-to-clerk-to-clerk scenario
that happens now. Is that REALLY secure? I also acknowledge
that the AMOUNT of data at risk in one swoop is very different,
but
SSL is not equal to security. How do you authenticate, and how
do you authorize access? SSL by itself doesn't deal with those things,
although it has some infrastructure that helps out: X.509 certificates and all that.
But how do I know I can trust you (or anyone else)? And what about HIPAA?
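To make the authentication-versus-authorization distinction concrete: TLS with X.509 client certificates can tell you *who* is connecting, but deciding what that identity may do is a separate, application-level step. The toy role table and certificate subjects below are invented for illustration, not any FreePM or OpenEMed API.

```python
# Toy sketch: authentication (verified cert subject) vs. authorization
# (what that subject may do). The TLS layer is assumed to have already
# verified the certificate; all names here are hypothetical.
ROLES = {
    "CN=Dr. Smith,O=Referral Clinic": "specialist",  # granted read-only
    "CN=Jane Doe,O=Patient":          "patient",     # own record only
}
PERMISSIONS = {
    "specialist": {"read"},
    "patient":    {"read"},
    "staff":      {"read", "write"},
}

def authorize(cert_subject: str, action: str) -> bool:
    # Authentication already happened; here we decide what the
    # authenticated identity is actually allowed to do.
    role = ROLES.get(cert_subject)
    return role is not None and action in PERMISSIONS.get(role, set())

print(authorize("CN=Dr. Smith,O=Referral Clinic", "read"))   # True
print(authorize("CN=Dr. Smith,O=Referral Clinic", "write"))  # False
print(authorize("CN=Mallory,O=Unknown", "read"))             # False
```

The point is simply that an SSL handshake alone answers none of these three questions; something like this table (plus a trust decision about who issued the certificate) still has to live in the application.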
We need the RAD functionality, even between systems within a hospital.
We can call it something else, if we like, but the functionality provided
by that existing specification is crucial.
Our other option is to, wait... and wait.... and ...
I'm not happy with waiting. I'm trying to do something now.
Dave
Joseph, aren't you glad you asked? <vbg>
--
Tim Cook, President - FreePM,Inc.
http://www.FreePM.com Office: (731) 884-4126
ONLINE DEMO: http://www.freepm.org:8080/FreePM
Computer and Computational Sciences
Los Alamos National Laboratory
505-665-1907
