"It" is a number of things that I will elucidate, Jon; but "it" is
definitely not raw security.
"It" is:
* a recognition that a company doing business with other people's
money has a fiduciary responsibility to manage it with prudence;
* an awareness that computerized trading has the potential to
dramatically reduce the visibility available to those who have the
responsibility to protect shareholders' and customers' assets;
* an understanding that computers and networks are far less safe than
they were 30 years ago when they operated from "glass houses";
* knowledge of the debacles at LTCM, Enron, Global Crossing, Barings,
Adelphia, etc., and of how a lack of controls destroyed so many human
lives (literally, financially, and psychologically);
* an appreciation that a failure of the controls designed to protect
financial markets can lead to losses of confidence, market runs,
depressions, and, potentially, social upheaval;
* an acknowledgment that, while it is impossible to stop a determined
rogue trader, trading systems can easily be programmed to trigger
alerts to ever-higher levels of management as trades exceed preset
limits, so that those managers may exercise overriding control on the
trades if needed.
This is "it"; if business people truly got this, we wouldn't see what
we're seeing in the marketplace today.
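The escalation scheme in that last point can be sketched in a few lines of code. The preset limits, role names, and the single-threshold-table design below are my own illustrative assumptions, not anything from a real trading system:

```python
# Minimal sketch of escalating trade-limit alerts, as described above.
# All limits and roles are invented for illustration.

ESCALATION_LIMITS = [
    # (preset limit in currency units, management level to alert)
    (1_000_000, "desk supervisor"),
    (10_000_000, "head of trading"),
    (100_000_000, "chief risk officer"),
]

def alerts_for_trade(amount: float) -> list[str]:
    """Return every management level whose preset limit this trade
    exceeds, so each can exercise overriding control if needed."""
    return [role for limit, role in ESCALATION_LIMITS if amount > limit]

# A 50M trade alerts the first two levels, but not yet the CRO:
print(alerts_for_trade(50_000_000))
```

The point of the table-driven design is that adding a new escalation tier is a one-line data change, not a code change.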
You have defined some very clever formulae showing the opportunity
cost of using too much security, and you would have us believe that
decision-makers at such companies actually do something like this when
deciding how much risk mitigation to put in place.
If they were endowed with so much intelligence, I would argue that they
might also have calculated the probability of a rogue within the ranks,
the probability of losses resulting from rogue trades, the probability
of a loss of confidence in the company, the resulting opportunity cost
of lost business, the increased cost of implementing new controls
across the industry (and the opportunity cost of those investments),
the resulting opportunity cost of lost economic value as people pull
back from financial markets, the resulting opportunity cost of
legitimate companies being unable to raise capital in markets to invent
that new life-saving drug or the new carbon-free energy source...
well, you get the picture.
I would, but I won't, because you and I both know they do nothing like
this when making these security decisions. The decisions are driven
mostly by "gut feeling": made-up ROI numbers that are largely
meaningless, whatever the rest of the lemmings in the industry are
doing, whatever the press is screaming about this year, and who just
got burned, and for what.
One hopes that as society evolves, with better levels of education,
better tools and technologies, and higher standards of living, we will
recognize the need to invest "ounces of prevention" to avoid the
"pounds of cure".
Sadly, I find that the "Las Vegas" mentality has permeated businesses
to the point that we take bigger and bigger risks without really
doing the analysis - going on just "gut feel" - resulting in situations
like the one at Société Générale.
Arshad Noor
StrongAuth, Inc.
Jon Callas wrote:
> On Feb 4, 2008, at 1:55 PM, Arshad Noor wrote:
>
>> Do business people get it? Do security professionals get it?
>> Apparently not.
>>
>> Arshad Noor
>> StrongAuth, Inc.
>>
>> Huge losses reported by Société Générale were apparently enabled
>> by forgotten low-level IT chores such as password management.
>>
>> http://www.infoworld.com/article/08/02/04/Poor-password-management-may-have-led-to-bank-meltdown_1.html
>
> Yes, but get what? "It" is a vague noun.
>
> The reporter showed some wit by using the word "may."
>
> This was an attack by an evil (or crazy) insider. Evil-insider attacks
> are the hardest to protect against. If the insider decided that he was
> going to start making trades for whatever reason, then he'd find a
> weak point that would allow him to make trades, and use it, no matter
> what it was. (My personal hypothesis is a variant of the mad-scientist
> attacker -- "They laughed at me when I told them my trading theories!
> Laughed! But I'll show them! I'll show them ALL!!!")
>
> If this person had had to work for 1000 hours to get a hardware token,
> he would have just done the work. The result may have been an order of
> magnitude more. High-security procedures tend to be more brittle for
> psychological reasons. If you have the magic dingus, then you are
> authorized, and no one ever questions the dingus.
>
> Also, one must look at the economics and psychology of the situation.
> Traders are prima-donna adrenaline junkies who trade vast sums of
> money all the time and are not shy about expressing their
> frustrations.
>
> Looking at the sheer economics first:
>
> * A trader trades C units of currency every hour, with an average
> profit of P (for example, a 5% profit is P=1.05).
> * There are T traders in the organization.
> * The extra authentication produces a productivity drop of D. For
> example, let us suppose a trader has to authenticate once per hour,
> and it
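The quoted passage breaks off mid-example, but the cost model it sets up can be sketched from the symbols already defined (C, P, T, D). The concrete figures below - including the 30 seconds of authentication time - are my own illustrative assumptions, and the cost formula is only one plausible completion of the truncated argument:

```python
# Hedged sketch of the trading-economics model sketched above.
# All concrete numbers are illustrative assumptions, not figures
# from the original post.

def hourly_profit(C: float, P: float, T: int) -> float:
    """Profit per hour across T traders, each turning over C units of
    currency at average multiplier P (P = 1.05 means 5% profit)."""
    return T * C * (P - 1.0)

def authentication_cost(C: float, P: float, T: int, D: float) -> float:
    """One plausible reading of the argument: a productivity drop D
    (the fraction of trading time lost to authentication) forgoes
    that same fraction of the hourly profit."""
    return hourly_profit(C, P, T) * D

# Assumed numbers: 100 traders, 1M units/hour each, 5% average profit,
# one authentication per hour taking 30 seconds -> D = 30/3600.
C, P, T = 1_000_000, 1.05, 100
D = 30 / 3600

print(hourly_profit(C, P, T))           # desk-wide profit per hour
print(authentication_cost(C, P, T, D))  # profit forgone to authentication
```

Even under these generous assumptions the forgone profit is a tiny fraction of hourly earnings, which is presumably the kind of trade-off the formulae in the original post were meant to expose.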