On Sun, 2004-03-07 at 11:56, Thompson, Ken wrote:
> The argument that we could lose our collective behinds in the event of a
> significant security failure seems consistently incapable of generating
> interest or resource allocations from senior management. Basically, the
> issue comes down to a legal department deciding whether or not they can
> convince a jury that actions were reasonable and prudent given the resources
> and situation.  They have absolutely no interest or concern that a forensic
> technical investigation would find design or implementation lacking, thereby
> ruining the careers of the dedicated engineers that have worked on the
> project.

Agreed. Part of the problem is that it is viewed from the point of view
of legal exposure only, ignoring the fact that regardless of whether a
case is proved in court, the fact that the security of an EHR
implementation is seriously in question will be incredibly damaging to
that EHR (and probably to some degree to all other EHRs, by
association). This is made worse by the proponents of community-wide
EHRs stating things like "it will be bullet-proof" etc. I can understand
the motivation behind such statements, but generally they are made
without the slightest appreciation of what is actually involved in
achieving the promised, or hinted-at, level of security.


> 
> To a large extent, this is a result of poor statistical data regarding
> security breaches because organizations with something to lose are quick to
> hush things up when they go wrong. It is a basic part of the damage control
> strategy of any large organization.

Absolutely. There is a need for mandated reporting of information system
security breaches, in the same way that many countries have mandated
financial reporting requirements for public companies etc.

> In most cases, it is impossible to keep small scale security breaches under
> control. Copy and paste are still effective means of extracting data and
> putting it into a document, mail message, database, etc. that is outside of
> the EHR system security. Anyone with reasonable access can do this, and one
> statistic I am familiar with is that 60% of all information security
> breaches involve disgruntled employees.

I absolutely agree, but it is nevertheless important to close
inadvertent security holes, such as invisible-to-the-user browser
caching. Certainly the biggest threat is from within - from people who
are already "authorised users". Many security models focus entirely on
keeping out "unauthorised users", thus missing the majority of the
threat. It is also necessary to think clearly about what is meant by an
"authorised user" - in particular, do you mean the actual person, or do
you really mean anyone with access to that person's credentials/login
password? The two are not necessarily the same.
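Closing the browser-cache hole can also be attacked from the server side. The following is not from the original thread - it is a minimal, hypothetical WSGI middleware sketch (all names are mine) showing how an EHR web application could instruct browsers and proxies never to store responses that carry patient data:

```python
# Hypothetical sketch, not any real EHR's code: a WSGI middleware that
# strips any existing caching headers and replaces them with directives
# forbidding client-side storage of the response.
def no_store_middleware(app):
    """Wrap a WSGI app so every response forbids browser/proxy caching."""
    def wrapped(environ, start_response):
        def start_no_store(status, headers, exc_info=None):
            # Drop any caching headers the application itself set...
            headers = [(k, v) for k, v in headers
                       if k.lower() not in ("cache-control", "pragma", "expires")]
            # ...and replace them with "never store this" directives.
            headers += [
                ("Cache-Control", "no-store, no-cache, must-revalidate"),
                ("Pragma", "no-cache"),   # for old HTTP/1.0 proxies
                ("Expires", "0"),
            ]
            return start_response(status, headers, exc_info)
        return app(environ, start_no_store)
    return wrapped
```

This does nothing for a cache that already exists on disk, of course - it simply stops conformant browsers from writing one in the first place, which is the "inadvertent hole" half of the problem.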

> We deal with these issues as a matter of policy, auditing, and lawyers, not
> expensive technical means. Until someone is able to convince senior
> management that their careers, reputations, etc. are going to suffer from a
> security breach, I suspect this will continue to be our strategy. 

Yes, that is what I have observed also. But that is not the way it ought
to be. And because centralised EHRs significantly increase the size of
the hazard associated with security breaches, I don't think that the
current methods of addressing security issues, as you describe, are
sufficient - they need to be supplemented by architectural and technical
safeguards as well.


> The engineering staff, of course, keeps detailed documentation regarding our
> recommendations and the eventual decisions that were made in all of these
> matters...;-)

Yes, maintenance of "I told you so.." files is vital.


Tim C

> 
> Best Regards,
> 
> Ken
> 
> 
> -----Original Message-----
> From: Tim Churches
> To: Tim Cook
> Cc: OpenEHR Technical
> Sent: 3/6/2004 7:14 PM
> Subject: Re: Data Security was: Basic EHR functionality
> 
> On Sun, 2004-03-07 at 10:18, Tim Cook wrote:
> > On Sat, 2004-03-06 at 14:17, Tim Churches wrote:
> > > In general, caches should be
> > > held on encrypted filesystems, either on-disc or in-memory, with the
> > > keys (or a key to the keys) to the encryption/decryption managed by
> a
> > > daemon which purges the keys from memory when asked (eg locking the
> > > device) or automatically after a short period of disuse.
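The key-purging behaviour described just above might look something like this in Python - a purely illustrative in-memory key holder (the names are mine, not any real daemon's API) that drops keys on demand or automatically after a period of disuse:

```python
# Hypothetical sketch of the key-holding daemon described above: keys
# live only in memory, are purged on request (e.g. "lock the device"),
# and expire automatically after an idle timeout.
import time

class KeyHolder:
    def __init__(self, idle_timeout=300.0):
        self._keys = {}              # key_id -> (key bytes, last-used time)
        self._timeout = idle_timeout

    def store(self, key_id, key_bytes):
        self._keys[key_id] = (bytearray(key_bytes), time.monotonic())

    def get(self, key_id):
        """Return a key and refresh its last-used time; None if purged."""
        self._purge_idle()
        entry = self._keys.get(key_id)
        if entry is None:
            return None
        key, _ = entry
        self._keys[key_id] = (key, time.monotonic())
        return bytes(key)

    def purge(self, key_id=None):
        """Purge one key, or all keys (e.g. when the device is locked)."""
        targets = [key_id] if key_id is not None else list(self._keys)
        for kid in targets:
            key, _ = self._keys.pop(kid, (bytearray(), 0))
            for i in range(len(key)):     # overwrite before dropping the ref
                key[i] = 0

    def _purge_idle(self):
        now = time.monotonic()
        for kid, (_, last) in list(self._keys.items()):
            if now - last > self._timeout:
                self.purge(kid)
```

A real implementation would additionally need to run as a daemon and keep the key material from being swapped to disk (e.g. via mlock()), but the purge-on-demand/purge-on-idle logic above is the essential part.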
> > 
> > Well, now that would certainly be a secure way to handle caching.  If
> I
> > were worrying about national secrets.  
> 
> Personal health information is more important than national secrets to
> the individuals concerned. Furthermore, it only takes the compromise of
> a handful of individuals' confidential information, and publication of 
> this fact, before public confidence in your EHR evaporates. So I don't
> think that is overkill. Note, however, the use of the subjunctive
> "should". That's the way it ought to be done, and it is technically
> achievable. Unfortunately, browser and OS vendors/writers don't choose to
> do that by default. But certainly it can be done - on Linux systems, it
> is quite easy to set up encrypted filesystems and to store the browser
> cache on these. Likewise on Windows - individual directories can be
> encrypted (although there are distinct flaws in the way the encryption
> keys are handled in Windows - still, better than not encrypted).
> 
> > Do you go to this extreme now (as a manager) when doing your risk
> > assessments?  I am wondering what the total (additional) costs of
> system
> > design and hardware resources is when these facilities are
> implemented. 
> 
> Risk assessment: client workstations are often shared between users and
> located in insecure locations, laptops are stolen or lost all the time.
> Thus confidential information which is captured in a cache on these
> systems needs to be secured. Note that if the EHR user is, say, a
> physician, then there may be details of hundreds of patients in their
> workstation/laptop cache.
> 
> Does this represent a challenge to applications, especially Web browser
> applications? Yup.
> 
> Are technical solutions possible? Yup - see above.
> 
> Is all of this costly? Well, my view is that additional hardware
> security devices are probably unnecessary (and almost all are
> unnecessarily proprietary anyway), and the software required to
> implement what I describe above is free (at least for Linux - on
> Windows, file system encryption is only available on server versions, I
> think - at least that is the case with Windows 2000 - not sure with
> Windows XP/XP Pro). Does the administration and training involved cost
> money? Definitely, security doesn't come free. Is the expense worth it?
> See above - only takes a handful of confidentiality breaches before you
> can kiss confidence in your EHR goodbye for several years.
> 
> > 
> > I think that in most cases we can reliably depend on locked doors and
> > holding people responsible for protecting data they are entrusted
> with. 
> 
> Surely you jest? Client workstations, even in large hospitals (or
> especially in large hospitals), have to be considered insecure, likewise
> desktop PCs in doctors' offices - common targets for drug-related
> burglary, and especially laptops and handheld devices which are pinched
> or misplaced with monotonous regularity.
> 
> The same applies to EHR/EMR servers, especially servers which are not
> housed in dedicated, secured data centres, although even the latter are
> far from invulnerable - see for example
> http://www.smh.com.au/articles/2003/09/04/1062548967124.html - and then
> there is the off-site back-up media etc to consider.
> 
> > I will agree that security training needs to include this awareness so
> > that users know how to properly store each of these devices when not
> in
> > use.
> 
> Security engineering is all about building systems which fail
> gracefully. Certainly training users is vital, but relying entirely on
> users, or system administrators, or anyone, to always do the right thing
> is a recipe for inevitable security failure. It is always better to
> build additional protection into the fabric of information systems, as
> long as the cost is justified - and that comes back to risk assessment
> as you note. 
-- 

Tim C

PGP/GnuPG Key 1024D/EAF993D0 available from keyservers everywhere
or at http://members.optushome.com.au/tchur/pubkey.asc
Key fingerprint = 8C22 BF76 33BA B3B5 1D5B  EB37 7891 46A9 EAF9 93D0

