The argument that we could lose our collective behinds in the event of a
significant security failure seems consistently incapable of generating
interest or resource allocations from senior management. Basically, the
issue comes down to a legal department deciding whether or not they can
convince a jury that actions were reasonable and prudent given the resources
and situation. It is of absolutely no interest or concern to them that a
forensic technical investigation might find the design or implementation
lacking, thereby ruining the careers of the dedicated engineers who have
worked on the project.

To a large extent, this is a result of the paucity of statistical data
regarding security breaches: organizations with something to lose are quick
to hush things up when they go wrong. It is a basic part of the
damage-control strategy of any large organization.

In most cases, it is impossible to prevent small-scale security breaches.
Copy and paste remains an effective means of extracting data and putting it
into a document, mail message, database, etc. that is outside the EHR
system's security perimeter. Anyone with reasonable access can do this, and
one statistic I am familiar with is that 60% of all information-security
breaches involve disgruntled employees.

We deal with these issues as a matter of policy, auditing, and lawyers, not
expensive technical means. Until someone is able to convince senior
management that their careers, reputations, etc. are going to suffer from a
security breach, I suspect this will continue to be our strategy. 

The engineering staff, of course, keeps detailed documentation regarding our
recommendations and the eventual decisions that were made in all of these
matters...;-)

Best Regards,

Ken


-----Original Message-----
From: Tim Churches
To: Tim Cook
Cc: OpenEHR Technical
Sent: 3/6/2004 7:14 PM
Subject: Re: Data Security was: Basic EHR functionality

On Sun, 2004-03-07 at 10:18, Tim Cook wrote:
> On Sat, 2004-03-06 at 14:17, Tim Churches wrote:
> > In general, caches should be
> > held on encrypted filesystems, either on-disc or in-memory, with the
> > keys (or a key to the keys) to the encryption/decryption managed by a
> > daemon which purges the keys from memory when asked (eg locking the
> > device) or automatically after a short period of disuse.
> 
> Well, now that would certainly be a secure way to handle caching.  If I
> were worrying about national secrets.

Personal health information is more important than national secrets to
the individuals concerned. Furthermore, it only takes the compromise of
a handful of individuals' confidential information, and publication of 
this fact, before public confidence in your EHR evaporates. So I don't
think that is overkill. Note, however, the use of the subjunctive
"should". That's the way it ought to be done, and it is technically
achievable. Unfortunately, browser and OS vendors/writers don't choose to
do that by default. But certainly it can be done - on Linux systems, it
is quite easy to set up encrypted filesystems and to store the browser
cache on these. Likewise on Windows - individual directories can be
encrypted (although there are distinct flaws in the way the encryption
keys are handled in Windows - still, better than not encrypted).
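The key-purging scheme described above can be sketched as a toy model. This
is purely illustrative: the class name and the SHA-256-based XOR keystream
are my own inventions for the sketch, not real cryptography, and a
production system would rely on the operating system's encrypted filesystem
rather than anything hand-rolled. The point is the lifecycle: the key lives
only in memory and is forgotten on demand (locking the device) or after a
period of disuse, after which the cached ciphertext is unreadable.

```python
import hashlib
import os
import time


class KeyedCache:
    """Toy cache whose master key is purged on demand or after disuse.

    Illustrative only -- the XOR keystream below is NOT real crypto.
    Real deployments should use OS-level filesystem encryption.
    """

    def __init__(self, timeout=300):
        self._key = os.urandom(32)        # master key, held only in RAM
        self._timeout = timeout           # seconds of disuse before purge
        self._last_used = time.monotonic()
        self._store = {}                  # entry name -> ciphertext

    def _keystream(self, name, length):
        # SHA-256 in counter mode as a stand-in keystream (illustration).
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(
                self._key + name.encode() + counter.to_bytes(8, "big")
            ).digest()
            counter += 1
        return out[:length]

    def _check(self):
        # Auto-purge the key after the configured period of disuse.
        now = time.monotonic()
        if self._key is not None and now - self._last_used > self._timeout:
            self.purge()
        self._last_used = now

    def put(self, name, data):
        self._check()
        if self._key is None:
            raise RuntimeError("key purged; device locked")
        ks = self._keystream(name, len(data))
        self._store[name] = bytes(a ^ b for a, b in zip(data, ks))

    def get(self, name):
        self._check()
        if self._key is None:
            raise RuntimeError("key purged; device locked")
        ct = self._store[name]
        ks = self._keystream(name, len(ct))
        return bytes(a ^ b for a, b in zip(ct, ks))

    def purge(self):
        # "Locking the device": forget the key. The stored ciphertext
        # remains, but without the key it can no longer be decrypted.
        self._key = None
```

Once `purge()` has run, the cached data on disk (or in memory) is just
ciphertext, which is why the purging daemon, rather than user diligence, is
what actually protects a stolen laptop.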

> Do you go to this extreme now (as a manager) when doing your risk
> assessments?  I am wondering what the total (additional) costs of system
> design and hardware resources is when these facilities are implemented.

Risk assessment: client workstations are often shared between users and
located in insecure locations, laptops are stolen or lost all the time.
Thus confidential information which is captured in a cache on these
systems needs to be secured. Note that if the EHR user is, say, a
physician, then there may be details of hundreds of patients in their
workstation/laptop cache.

Does this represent a challenge to applications, especially Web browser
applications? Yup.

Are technical solutions possible? Yup - see above.

Is all of this costly? Well, my view is that additional hardware
security devices are probably unnecessary (and almost all are
unnecessarily proprietary anyway), and the software required to
implement what I describe above is free - at least for Linux. On Windows,
the Encrypting File System is included in Windows 2000 and Windows XP
Professional, though not in XP Home. Does the administration and training
involved cost
money? Definitely, security doesn't come free. Is the expense worth it?
See above - only takes a handful of confidentiality breaches before you
can kiss confidence in your EHR goodbye for several years.

> 
> I think that in most cases we can reliably depend on locked doors and
> holding people responsible for protecting data they are entrusted
with. 

Surely you jest? Client workstations, even in large hospitals (or
especially in large hospitals) have to be considered insecure, likewise
desktop PCs in doctors' offices - common targets for drug-related
burglary, and especially laptops and handheld devices which are pinched
or misplaced with monotonous regularity.

The same applies to EHR/EMR servers, especially servers which are not
housed in dedicated, secured data centres, although even the latter are
far from invulnerable - see for example
http://www.smh.com.au/articles/2003/09/04/1062548967124.html - and then
there is the off-site back-up media, etc., to consider.

> I will agree that security training needs to include this awareness so
> that users know how to properly store each of these devices when not in
> use.

Security engineering is all about building systems which fail
gracefully. Certainly training users is vital, but relying entirely on
users, or system administrators, or anyone, to always do the right thing
is a recipe for inevitable security failure. It is always better to
build additional protection into the fabric of information systems, as
long as the cost is justified - and that comes back to risk assessment
as you note. 
-- 

Tim C

PGP/GnuPG Key 1024D/EAF993D0 available from keyservers everywhere
or at http://members.optushome.com.au/tchur/pubkey.asc
Key fingerprint = 8C22 BF76 33BA B3B5 1D5B  EB37 7891 46A9 EAF9 93D0


-
If you have any questions about using this list,
please send a message to d.lloyd at openehr.org