Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-19 Thread George Capehart
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

Crispin Cowan wrote:



>
>> This is particularly interesting to me because I just had a doctoral
>> student come to me with an idea for dissertation research that
>> included an hypothesis that organizations at SEI 1 were better able to
>> estimate software development time and costs than organizations at SEI
>> 5.  He didn't seem to grasp the implications to quality, security,
>> life cycle maintenance, etc.
>
>
> Or it could be that the student is positing that the methods mandated in
> the SEI are a grand waste of time, which would be an interesting
> hypothesis to test. Certainly the successes of open source development
> models make a mockery of some of the previously thought hard rules of
> Brooks' "Mythical Man Month", and I dare say that traditional software
> engineering methods deserve questioning.

Sorry about the tardy comment.  The topic in the subject line comes up
over and over and each time the threads end up differently.  It reminds
me of the old Indian story of the blind men and the elephant.  I've been
working on trying to put together a general set of comments that I hope will provide
for a whole new round of discussion, but one in which we discuss the
elephant rather than ropes, tree trunks, fans, etc.  That's still a bit
of a way off.  In the meantime, I'd like to speak to this comment
because it's fairly specific and it's not directly addressing "security."

The CMM is a mechanism for describing how well disciplined an
organization's software development (or lifecycle) process is.  It does
*not* specify a particular methodology.  It only describes how well
disciplined and formalized the process is.  It aims to provide the same
kind of descriptive context for the software development process that
TQM, Six Sigma, etc. do for manufacturing.  The whole idea is to reduce
the variability in the process and make each step more repeatable.  By
definition, the more control one has over a process and the more
repeatable it becomes, the smaller the variance and standard deviation.  One
is much better able to estimate whatever parameter is of interest
(assuming it's meaningful and measurable).  The CMM provides a model and
some metrics that are useful in describing and measuring the software
development process.
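The statistical point above can be illustrated with a toy sketch (not from the original post; the numbers are invented). Two teams record how long "similar" tasks actually took; the disciplined team's history has a much smaller standard deviation, so an estimate drawn from its mean is far more trustworthy:

```python
# Toy illustration: a lower-variance process yields better estimates.
# Task durations (in days) are hypothetical.
from statistics import mean, stdev

ad_hoc      = [5, 19, 8, 31, 12, 24]   # undisciplined process: wide spread
disciplined = [14, 16, 15, 17, 15, 16] # disciplined process: tight spread

for name, history in [("ad hoc", ad_hoc), ("disciplined", disciplined)]:
    # The mean is the natural estimate; the stdev bounds how wrong it can be.
    print(f"{name:12s} mean={mean(history):5.1f}  stdev={stdev(history):5.1f}")
```

Both teams average roughly 16 days per task, but only the disciplined team's estimate of "about 16 days" is worth much, which is exactly the sense in which repeatability improves estimation.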

Like every other tool, the CMM can be misused, and very frequently is .
. . in many different ways.  But used correctly, it is the perfect
mechanism for challenging and honing an organization's engineering
methods . . . whatever they are.  It doesn't specify a methodology, but
it *is* very good at showing an organization how well they are executing
the methodology they have chosen to use.  If it's used to bring more and
more discipline into the process, the process becomes more predictable
and therefore, estimates will become more accurate and precise.  Problem
is, implementing the kind of discipline that the CMM measures is neither
easy nor cheap, even though, in the long run, it pays off handsomely.
It typically implies major changes in the way processes are managed, and
that's typically painful.  It's also frequently unsuccessful in the
absence of a strong change management process . . .

My $0.02.

Cheers,

George Capehart

>
> Crispin
>

-BEGIN PGP SIGNATURE-
Version: GnuPG v1.2.1 (MingW32)

iD8DBQFCZT0zmuGMnN1wNOoRAhrjAJ9GkZ2AYQ7K5Zn2xisKi3w29PxYwACgig/V
/QdXrnErrAtleBH6g5viWlE=
=rM1y
-END PGP SIGNATURE-




RE: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-10 Thread Yousef Syed
Hi, 
There are a number of different players here with a number of points of
view. 
Final End user (e.g. User of Windows XP).
Corporation requesting a particular piece of software to be developed. 
The Company where the actual software is being developed: Managers and
Programmers. 

As for the final End User, they have little if any say in the arena of
software security and will generally just take whatever they are given and
expect it to work properly. Nor are they usually educated enough to make
an enlightened choice in the matter between two competing brands, e.g. their
Dell computer shipped with MS XP etc... The governments are the ones that
should be protecting these guys. They don't understand click-through
user licenses; they've never read them and never would - the users
aren't lawyers.
The more intelligent users can choose with their wallets, and the rest of us
can educate the poor ignorant masses...

Corporate: If you are in a Bank, then you have to make sure that the
agreement that you sign with whoever is tasked with developing the software,
incorporates User Acceptance Testing that includes tests for security -
preferably carried out by a third-party company. Security should be assumed
and expected as part of any application's requirements, and as such, should
always be included in systems testing phase; Unit testing or otherwise. This
is one of the few environments where security is an issue and would be given
some weight, but mostly under certain circumstances. (Sarbanes-Oxley has
suddenly lit a fire under the shorts of a number of our clients - had there
not been a significant dollar value attached to achieving compliance, I
doubt many of the banks would have bothered spending so much of their money
in this area). Funnily enough, SOX isn't from the government, rather from
the SEC!

Software Managers. These guys answer to the guy holding the purse strings.
That said, there are many ways to ensure that the system is relatively
secure, even if the purse strings are tight. The first is realizing that
security isn't an add-on, or a feature; it should be inherent
in the product. Thus, when architecting, architect securely. When designing,
design securely. Ensure that code is written securely (standard code reviews
should spot these problems just as they'll spot other problems). And testing
phases should incorporate security tests. If a manager delivers a product on
time, on budget, but with a few security flaws (which aren't noticed until
the app has been out in the open for 6 months), then he has retained his job
and probably got a promotion.


Coders. Lowest on the food chain these days, and thus, with the least
incentive to work that little bit harder to produce "proper" code. Highly
unlikely to put any extra effort into producing better code unless it is
required by a manager. 9/10 they simply want to shift their workload. 
Conscientious and professional developers should still be producing proper
code. Bad code is a product of laziness, apathy and incompetence. There is
plenty of incompetence out there. That is where training comes into the
picture. However, the coder knows that if they deliver late, they will get
sacked; if they deliver on time, but a little buggy, they'll retain their
jobs.

Personally, I think that most of the problems will go away once Security is
no longer viewed as an add-on feature to a product. Secondly, when financial
liability for security breaches is passed on, people will start taking
note. Money talks!

Another thing would be to name and shame those corporations that have their
systems hacked and lose countless Credit card details. It would be nice if
the SEC would demand full and prompt disclosure in the event of a security
breach and instant multi-million dollar fines for the slack corporations.
When corporations are made to feel the burden of their slack security, then
they'll take it seriously... maybe...

Ys
--
Yousef Syed


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Michael Silk
Sent: 06 April 2005 23:45
To: Dave Paris
Cc: Secure Coding Mailing List
Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?

Inline

On Apr 7, 2005 1:06 AM, Dave Paris <[EMAIL PROTECTED]> wrote:
> And I couldn't disagree more with your perspective, except for your
> inclusion of managers in parenthesis.
> 
> Developers take direction and instruction from management, they are not
> autonomous entities.  If management doesn't make security a priority,

See, you are considering 'security' as something extra again. This is
not right.

My point is that management shouldn't be saying 'Oh, and don't forget
to add _Security_ to that!' The developers should be doing it by
default.


> then only so much secure/defensive code can be written before the
> developer is admonished for being slow/late/etc.

Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-08 Thread Crispin Cowan
Julie JCH Ryan, D.Sc. wrote:
Other students chimed in on the argument positing that the programming 
challenge was an inaccurate measure of student programming capability 
because the contestant was not allowed to do research on the internet 
during the challenge.  Another said the problem was that the challenge 
was too long and required contestants to have memorized too much.
Formal contests are always inaccurate abstractions of the real world. As 
you raise the value of the contest, this inevitably pressures 
contestants to "game the system" and target the artificial artifacts of 
the game rules instead of the real world. Whether this has happened to 
the ACM Programming contest is a subjective opinion. IMHO, a closed-book 
contest is no longer very relevant to the real world, where Google is 
always just seconds away.

This is particularly interesting to me because I just had a doctoral 
student come to me with an idea for dissertation research that 
included an hypothesis that organizations at SEI 1 were better able to 
estimate software development time and costs than organizations at SEI 
5.  He didn't seem to grasp the implications to quality, security, 
life cycle maintenance, etc.
Or it could be that the student is positing that the methods mandated in 
the SEI are a grand waste of time, which would be an interesting 
hypothesis to test. Certainly the successes of open source development 
models make a mockery of some of the previously thought hard rules of 
Brooks' "Mythical Man Month", and I dare say that traditional software 
engineering methods deserve questioning.

Crispin
--
Crispin Cowan, Ph.D.  http://immunix.com/~crispin/
CTO, Immunix  http://immunix.com



Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-08 Thread Julie JCH Ryan, D.Sc.
This is a little off topic, but I'm wondering if anyone would like to 
comment.

One of our students posited that US computer science students have lost 
their edge because they haven't done well in the ACM programming 
challenge recently.  He wrote, among other things, that:

"Interesting factoids: The last US Champion was Harvey Mudd College in 
1997.  No North American school has won since 1999 when the Univ. of
Waterloo took the prize.  The first foreign school to win the 
competition since it started in 1977 was Univ. of Otago (New Zealand) 
in 1990.  Since 1990, only 4 times has a US school won."

[Ed. FYI, a summary of the ACM challenge and the overall results can be 
found at: http://www.tmcnet.com/usubmit/2005/Apr/1131800.htm  KRvW]

Other students chimed in on the argument positing that the programming 
challenge was an inaccurate measure of student programming capability 
because the contestant was not allowed to do research on the internet 
during the challenge.  Another said the problem was that the challenge 
was too long and required contestants to have memorized too much.

A professor (not me) weighed into the discussion and agreed, saying:
"it could be that the contest is not a true representation of good 
programming!  from what I understand it is heavily skewed towards math 
type problems."

One other student posted this comment (reproduced accurately):
"I do not have to be good in Programming guys! We outsource all of the 
programming jobs to oversee!!! So, why do we 
have to train well in programming any way?  Good luck with our future 
scientists, and I think that included me!"

So I'm wondering what all you folks out there in real world land think 
about this.

This is particularly interesting to me because I just had a doctoral 
student come to me with an idea for dissertation research that included 
an hypothesis that organizations at SEI 1 were better able to estimate 
software development time and costs than organizations at SEI 5.  He 
didn't seem to grasp the implications to quality, security, life cycle 
maintenance, etc.




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread ljknews
At 7:44 AM -0400 4/7/05, Dave Paris wrote:

> What you're proposing is that the ironworker should reengineer the
> bridge in-situ (as if he even has the authority!)

or the expertise.
-- 
Larry Kilgallen




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread secureCoding2dave
Blue Boar <[EMAIL PROTECTED]> wrote:

 > [Security] is extra.  It's extra time and effort.  And extra
 > testing.  And extra backtracking and schedule slipping when
 > you realize you blew something. All before it hits beta.

...if you're lucky.  (Or if you're doing development right, but IME 
that's damn rare too.)

-Dave


Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread Michael Silk
Dave,

> What you're proposing is that the ironworker should reengineer the
> bridge in-situ (as if he even has the authority!), causing weeks of
> delay, cost overruns, and possibly lead to his employer never getting a
> bridge contract again.

That's not at all what I'm suggesting... guess my point wasn't so obvious :)

I'm not saying the programmer should totally redesign the application
if it's not secure.

I am saying that they _SHOULD_ do the simple, 'trivial' things within
the context of their current job. Validating input, handling errors
properly, ensuring ownership of various id's, etc. All of these things
fall in this category.

Yet these days, when programmers actually _DO_ do these things, they
call these things 'security features' and themselves 'secure
programmers' (or whatever). And that's what I think is ridiculous.

Other things, like designing a secure protocol/management of your
encryption system, can be thought of as 'something extra', but these
types of problems aren't the _MAJOR_ problems (they are big problems,
however) that we deal with.

There are, of course, design concepts that shouldn't really be billed
as 'extra' either [like appropriate user management/access systems,
etc].

And back to the main point of this discussion: the lack of these
things in applications shouldn't be blamed on consumers (or even
management) for not asking. These things should be assumed...

-- Michael

> 
> Somehow, that just doesn't hold water.
> Kind Regards,
> -dsp




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread Dave Paris
Michael Silk wrote:
[...]
Consider the bridge example brought up earlier. If your bridge builder
finished the job but said: "ohh, the bridge isn't secure though. If
someone tries to push it at a certain angle, it will fall". You would
be panic stricken. Fear would overcome you, and you might either run
for the hills, or attempt to strangle the builder... either way, you
have a right to be incredibly upset by the lack of 'security' in your
bridge. You WON'T (as is the case now) sit back and go: "Oh well, fair
enough. I didn't ask you to implement that, I just said 'build a
bridge'. Next time I'll ask. Or make sure the public specifically
requests resistance to that angle of pushing".
Hopefully my point is obvious...
[...]
Actually, it's obvious - but I still can't agree with it.
Using the bridge example, it's fairly trivial for the ironworker to 
realize there should be a gusset plate where one wasn't called for.  It's 
a small piece, fairly quickly and easily fabricated and attached - so 
the ironworker puts it in.  No, it wasn't part of the specification - 
but the ironworker has built enough bridges and can explain away the 
extra half-day and slight cost it took to add the appropriate gussets.

On the other hand, the ironworker has worked on enough bridges to 
realize that  will lead to 
a functional, but flawed bridge.  It won't fall down, but it won't be as 
robust as a bridge should be.  The bridge was specified to support 10 
tons.  Five years from now, the road traffic will have 15 ton trucks and 
the bridge may fail.

What you're proposing is that the ironworker should reengineer the 
bridge in-situ (as if he even has the authority!), causing weeks of 
delay, cost overruns, and possibly lead to his employer never getting a 
bridge contract again.

Somehow, that just doesn't hold water.
Kind Regards,
-dsp



Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread Margus Freudenthal
Michael Silk wrote:
Consider the bridge example brought up earlier. If your bridge builder
finished the job but said: "ohh, the bridge isn't secure though. If
someone tries to push it at a certain angle, it will fall".
All bridges have certain limits. There is a difference between a 
footbridge and a bridge that can be driven over with a tank. The 
difference is also reflected in cost. You are advocating always building 
the "tank" bridge. Which is an understandable attitude - this way you are 
mostly safe. However, in some cases it is *economically feasible* to 
just build a simpler bridge and accept the fact that it will break under 
some conditions.

Ultimately it is a matter of economics. Sometimes releasing something 
earlier is worth more than the cost of later patches. And 
managers/customers are aware of it.

--
Margus



Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread Blue Boar
Michael Silk wrote:
> See, you are considering 'security' as something extra again. This is
> not right.

It is extra.  It's extra time and effort.  And extra testing.  And extra
backtracking and schedule slipping when you realize you blew something.
 All before it hits beta.

Any solution that ends up with us having "secure" software will
necessarily need to address this step as well as all others.  The
"right" answer just might end up being "suck it up, and take the
resource hit."  It might be "switch to the language that lends itself to
you coding securely at 75% the productivity rate of sloppy coding."  I
don't know enough about the languages involved to participate in that
debate.

Strangely enough, for the last year and a half or so, I've been sitting
here being QA at a security product company.  Doing software right takes
extra resources.  I are one.

 Ryan




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-07 Thread Michael Silk
On Apr 7, 2005 12:43 PM, Blue Boar <[EMAIL PROTECTED]> wrote:
> Michael Silk wrote:
> > See, you are considering 'security' as something extra again. This is
> > not right.
> 
> It is extra.  It's extra time and effort.  And extra testing.  And extra
> backtracking and schedule slipping when you realize you blew something.
> All before it hits beta.

All of this is part of _programming_ though.  To me it should be on
the same level as, say, using an 'Array' at an appropriate point in a
program. You won't say to management: "Oh, I didn't use an array there
because you didn't ask me to." It's ridiculous to even consider. And
so it should be with so-called 'Security' that is added to
applications.

Consider the bridge example brought up earlier. If your bridge builder
finished the job but said: "ohh, the bridge isn't secure though. If
someone tries to push it at a certain angle, it will fall". You would
be panic stricken. Fear would overcome you, and you might either run
for the hills, or attempt to strangle the builder... either way, you
have a right to be incredibly upset by the lack of 'security' in your
bridge. You WON'T (as is the case now) sit back and go: "Oh well, fair
enough. I didn't ask you to implement that, I just said 'build a
bridge'. Next time I'll ask. Or make sure the public specifically
requests resistance to that angle of pushing".

Hopefully my point is obvious...

-- Michael

> Any solution that ends up with us having "secure" software will
> necessarily need to address this step as well as all others.  The
> "right" answer just might end up being "suck it up, and take the
> resource hit."  It might be "switch to the language that lends itself to
> you coding securely at 75% the productivity rate of sloppy coding."  I
> don't know enough about the languages involved to participate in that
> debate.
> 
> Strangely enough, for the last year and a half or so, I've been sitting
> here being QA at a security product company.  Doing software right takes
> extra resources.  I are one.
> 
>Ryan
>




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael Silk
Inline

On Apr 7, 2005 1:06 AM, Dave Paris <[EMAIL PROTECTED]> wrote:
> And I couldn't disagree more with your perspective, except for your
> inclusion of managers in parenthesis.
> 
> Developers take direction and instruction from management, they are not
> autonomous entities.  If management doesn't make security a priority,

See, you are considering 'security' as something extra again. This is
not right.

My point is that management shouldn't be saying 'Oh, and don't forget
to add _Security_ to that!' The developers should be doing it by
default.


> then only so much secure/defensive code can be written before the
> developer is admonished for being slow/late/etc.

Then defend yourself ... ! Just as you would if the project was too
large due to other reasons. Don't allow security to be 'cut off'.
Don't walk in and say 'Oh, I was just adding "security" to it.' A
manager will immediately reply: "Oh, we don't care about that...".
Instead say: "Still finishing it off...". (This _has_ worked for me in
the past, by the way...)

 
> While sloppy habits are one thing, it's entirely another to have
> management breathing down your neck, threatening to ship your job
> overseas, unless you get code out the door yesterday.

Agreed. (Can't blame consumers for this issue, however..)

 
> I'm talking
> about validation of user input, 

This is something that all programmer should be doing in _ANY_ type of
program. You need to handle input correctly for your app to function
correctly, otherwise it will crash with a dopey user.
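The point can be sketched in a few lines (a hypothetical example, not from the original post; the function name and the business limit are invented). Rejecting malformed input is ordinary correctness work that protects the confused user and the attacker's input alike:

```python
# Hypothetical sketch: input validation as plain correctness, not a
# bolted-on "security feature". An app accepting a quantity field must
# reject garbage anyway, or it will crash (or corrupt data) for an
# ordinary confused user long before any attacker shows up.
def parse_quantity(raw: str) -> int:
    """Return a sane order quantity, or raise ValueError."""
    value = int(raw.strip())          # raises ValueError on non-numeric input
    if not 1 <= value <= 10_000:      # assumed business limit, for illustration
        raise ValueError(f"quantity out of range: {value}")
    return value

print(parse_quantity("  42 "))        # prints 42
```

The same check that stops a typo from crashing the app is the one that stops hostile input, which is exactly why it belongs to "programming correctly" rather than to a separate security budget.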


> ensuring a secure architecture to begin
> with, and the like.  

'Sensible' architecture too, though. I mean, that's the whole point of
a design - it makes sense. For example, an app may let a user update
accounts based on ID's, but it doesn't check if the user actually owns
the ID of the account they are updating. They assume it's true because
they only _showed_ them ID's they own.

You'd hope that your 'sensible' programmer would note that and confirm
that they did, indeed, update the right account. Not only for security
purposes, but for consistency of the _system_. The app just isn't
doing what it was 'specified' to do if the user can update any
account. It's _wrong_ - from a specification point of view - not just
'insecure'.
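The account-update scenario above can be sketched as follows (a hypothetical, minimal example; the data model and names are invented). The fix is simply to re-check ownership on the server for every update instead of trusting that the client only saw its own account IDs:

```python
# Hypothetical sketch of the account-update example: the server
# re-checks ownership on every request rather than assuming the client
# could only submit IDs it was shown.
accounts = {101: {"owner": "alice", "balance": 500},
            102: {"owner": "bob",   "balance": 900}}

def update_balance(user: str, account_id: int, new_balance: int) -> None:
    account = accounts.get(account_id)
    if account is None or account["owner"] != user:
        # Same failure either way, so we don't reveal which accounts exist.
        raise PermissionError("no such account for this user")
    account["balance"] = new_balance

update_balance("alice", 101, 450)    # fine: alice owns account 101
# update_balance("alice", 102, 0)    # would raise PermissionError
```

As the email argues, this check is part of the specification being met at all: without it, the app updates accounts it was never supposed to, which is a correctness bug before it is a security one.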

You would, I guess, classify this as something the managers/consumers
need to explicitly ask for. To me, it seems none of their business. As
a manager, you don't want to be micromanaging all these concepts (but
we are - CIOs...); they should be the sole responsibility of the
programmer to get right.


> The latter takes far more time to implement than is
> given in many environments.  The former requires sufficient
> specifications be given upfront

Agreed.

-- Michael


> Michael Silk wrote:
> > Quoting from the article:
> > ''You can't really blame the developers,''
> >
> > I couldn't disagree more with that ...
> >
> > It's completely the developers fault (and managers). 'Security' isn't
> > something that should be thought of as an 'extra' or an 'added bonus'
> > in an application. Typically it's just about programming _correctly_!
> >
> > The article says it's a 'communal' problem (i.e: consumers should
> > _ask_ for secure software!). This isn't exactly true, and not really
> > fair. Insecure software or secure software can exist without
> > consumers. They don't matter. It's all about the programmers. The
> > problem is they are allowed to get away with their crappy programming
> > habits - and that is the fault of management, not consumers, for
> > allowing 'security' to be thought of as something separate from
> > 'programming'.
> >
> > Consumers can't be punished and blamed, they are just trying to get
> > something done - word processing, emailing, whatever. They don't need
> > to - nor should, really - care about lower-level security in the
> > applications they buy. The programmers should just get it right, and
> > managers need to get a clue about what is acceptable 'programming' and
> > what isn't.
> >
> > Just my opinion, anyway.
> >
> > -- Michael
> [...]
>




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael Silk
On Apr 7, 2005 1:16 AM, Goertzel Karen <[EMAIL PROTECTED]> wrote:
> I think it's a matter of SHARED reponsibility. Yes, the programmers and
> their managers are directly responsible. But it's consumers who create
> demand, and consumers who, out of ignorance, continue to fail to make
> the connection between bad software security and the viruses, privacy,
> and other issues about which they are becoming increasingly concerned.

Quite frankly I don't think consumers need to care at all about this.

Do you, when buying chips, ask how they were cooked? Do you go back
and inspect the kitchen? Do you ask for a report on their compliance
to local health laws? No. The most you might do is glance at a box
with some ticks on it.

Why should software be any different? Why place the burden on
consumers to now evaluate the security of your products? They don't
care, they don't have the time, and they wouldn't know where
to start!


> The consumer can't be held responsible for his ignorance...

Exactly!


> Because practitioners of "safe software" have not done a very good
> job of getting the message out in terms that consumers, vs. other
> software practitioners and IT managers, can understand.
> 
> I propose that the following is the kind of message that might make a
> consumer sit up and listen:
> 
> "We understand that you buy software to get your work or online
> recreation done as easily as possible. But being able to get that work
> done WITHOUT leaving yourself wide open to exploitation and compromise
> of YOUR computer and YOUR personal information is also important, isn't
> it?

Answer: Duh.

 
> "A number of software products, including some of the most popular ones,
> are full of bugs and other vulnerabilities that DO leave those programs
> wide open to being exploited by hackers so they can get at YOUR personal
> information, and take over YOUR computing resources.

Answer: So? I need to use them.

 
> "Why is such software allowed to be sold at all? Because no-one
> regulates the SECURITY of the software products that these the companies
> put out, least of all the programmers who write that software. And, more
> importantly, because you the consumer hasn't been told before that you
> can make a difference. You can vote with your feet.

Answer: But how will I pay my GST next month if I can't use my
accounting program? I don't want to waste time transferring all my
data to another product...


> "Demand that the software you use not be full of holes and 'undocumented
> features' that can be exploited by hackers.

Answer: How? I buy my software at a department store.

 
> If we can start to raise consumer awareness 

It's easy to blame the consumer - it means we
programmers/management/whatever don't need to do anything until they
ask us. But they will _never_ be able to ask all the right questions.
_Never_. So to put that requirement on them is just our 'easy way out'
of the problem.

-- Michael

 
> --
> Karen Goertzel, CISSP
> Booz Allen Hamilton
> 703-902-6981
> [EMAIL PROTECTED]
> 
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] On Behalf Of Michael Silk
> > Sent: Wednesday, April 06, 2005 9:40 AM
> > To: Kenneth R. van Wyk
> > Cc: Secure Coding Mailing List
> > Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?
> >
> > Quoting from the article:
> > ''You can't really blame the developers,''
> >
> > I couldn't disagree more with that ...
> >
> > It's completely the developers fault (and managers). 'Security' isn't
> > something that should be thought of as an 'extra' or an 'added bonus'
> > in an application. Typically it's just about programming _correctly_!
> >
> > The article says it's a 'communal' problem (i.e: consumers should
> > _ask_ for secure software!). This isn't exactly true, and not really
> > fair. Insecure software or secure software can exist without
> > consumers. They don't matter. It's all about the programmers. The
> > problem is they are allowed to get away with their crappy programming
> > habits - and that is the fault of management, not consumers, for
> > allowing 'security' to be thought of as something separate from
> > 'programming'.
> >
> > Consumers can't be punished and blamed, they are just trying to get
> > something done - word processing, emailing, whatever. They don't need
> > to - nor should, really - care about lower-level security in the
> > applications they buy. The programm

Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael Silk
Jeff,

On Apr 7, 2005 11:00 AM, Jeff Williams <[EMAIL PROTECTED]> wrote:
> > I would think this might work, but I - if I ran a software development
> > company - would be very scared about signing that contract... Even if
> > I did everything right, who's to say I might not get blamed? Anyway,
> > insurance would end up being the solution.
> 
> What you *should* be scared of is a contract that's silent about security.

If you're silent you can claim ignorance :D

But of course, I agree. "Security" should be mentioned under the part
of applications "Working Right".

What I meant I would be scared of, however, is that if the contract
didn't fully specify what I would be taking responsibility for. I.e. I
could be blamed if some misconfiguration on the server allowed a user
to run my tool/component as admin and enter some information or do
whatever.

The contract would have to be specific (technical?) so as to avoid
problems like this. But I presume you have had far more experience
with these issues than I have... can you share any w.r.t. problems
like that?

Because I can imagine [if I wasn't ethical] trying to blame a security
problem in My Big Financial Website on a 3rd party tool if I could.


> Courts will have to interpret (make stuff up) to figure out what the two
> parties intended.  I strongly suspect courts will read in terms like "the
> software shall not have obvious security holes".  They will probably rely on
> documents like the OWASP Top Ten to establish a baseline for trade practice.
> 
> Contracts protect both sides.  Have the discussion.  Check out the OWASP
> Software Security Contract Annex for a
> template.(http://www.owasp.org/documentation/legal.html).

Yes, I've read that before, and even discussed it with you! :)

-- Michael

> 
> --Jeff
> 
> > [...]




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael Silk
On Apr 7, 2005 1:35 AM, Jeff Williams <[EMAIL PROTECTED]> wrote:
> Michael,
> 
> Don't hate the player, hate the game (quoting Ice-T). 

True... the game has let them get away with it, but IMHO the players
are the ones in the best position to change how they play ;)


> Developers aren't
> going to just write code differently because we say so. Speaking frankly,
> today there's really no incentive for them to write code securely. And no
> amount of guidelines, super-complex code scanners, or jumping up and down is
> going to change that.

Yes, I agree I guess ... The only incentive I've found is self-respect
for what you do :)


> Nothing will change until we intervene in the software market in ways that
> fix these problems. There are many ways that government and industry can
> change the market, some more intrusive than others. Calls for a product
> liability regime from Schneier and others are interesting, but not likely to
> succeed politically.  

From a government p.o.v.? They don't have much of a say, do they? If
companies called on vendors for contracts that specified full
responsibility for security problems, that would be a private issue, no?

I would think this might work, but I - if I ran a software development
company - would be very scared about signing that contract... Even if
I did everything right, who's to say I might not get blamed? Anyway,
insurance would end up being the solution.

This would then make it compulsory for small businesses to carry
insurance to cover the cost of being sued by large corporations - and
that amount of coverage might not be obtainable for small companies.


> See you at OWASP England.

Unfortunately not, a little bit too far for me at this time :)

-- Michael


> [...]




Re: [SC-L] Application Insecurity - Who is at Fault?

2005-04-06 Thread Greenarrow 1
Government is not the answer.  Just how would one get the numerous
governments to agree on a law that would most likely be impossible to
enforce?  Software made in the European Union is not enforceable in the
United States, and vice versa.

Mapping out a plan to the various companies' management would be a better
goal, but how to enforce this plan is the question.  Showing companies the
actual cost of patching flaws, compared with securing the software at the
outset, might shock them into reality.  Who is to take charge to implement
or start a project like this?  Does a company have to implement
recommendations made by anyone?

If one could actually prove to the makers that the cost of patching could
actually exceed the cost of the program, then maybe this would make some
headway toward secure coding.  Money is the answer, and if someone could
prove this I feel it would be a start toward securing software.

I am open to any and all suggestions on how to proceed.

Regards,
George
Greenarrow1
InNetInvestigations-Forensics 




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Jeff Williams
> I would think this might work, but I - if I ran a software development
> company - would be very scared about signing that contract... Even if
> I did everything right, who's to say I might not get blamed? Anyway,
> insurance would end up being the solution.
What you *should* be scared of is a contract that's silent about security. 
Courts will have to interpret (make stuff up) to figure out what the two 
parties intended.  I strongly suspect courts will read in terms like "the 
software shall not have obvious security holes".  They will probably rely on 
documents like the OWASP Top Ten to establish a baseline for trade practice.

Contracts protect both sides.  Have the discussion.  Check out the OWASP 
Software Security Contract Annex for a 
template.(http://www.owasp.org/documentation/legal.html).

--Jeff

- Original Message -
From: "Michael Silk" <[EMAIL PROTECTED]>
To: "Kenneth R. van Wyk" <[EMAIL PROTECTED]>
Cc: "Secure Coding Mailing List" 
Sent: Wednesday, April 06, 2005 9:40 AM
Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?
> [...]





Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Jeff Williams
Michael,
Don't hate the player, hate the game (quoting Ice-T). Developers aren't 
going to just write code differently because we say so. Speaking frankly, 
today there's really no incentive for them to write code securely. And no 
amount of guidelines, super-complex code scanners, or jumping up and down is 
going to change that.

The software market is seriously broken.  There are dramatic asymmetric 
information problems (see 
http://nobelprize.org/economics/laureates/2001/public.html) that make it 
impossible to tell secure software from junk.  There are also many 
externalities (see http://fpc.state.gov/documents/organization/43393.pdf) 
that prevent those who take risks from bearing the costs.

Nothing will change until we intervene in the software market in ways that 
fix these problems. There are many ways that government and industry can 
change the market, some more intrusive than others. Calls for a product 
liability regime from Schneier and others are interesting, but not likely to 
succeed politically.  Perhaps this is changing with the recent disclosure 
scandals.

See you at OWASP England.
--Jeff
[Ed. Ice-T quotes in SC-L...  What hath we wrought?!  :-\  KRvW]
- Original Message - 
From: "Michael Silk" <[EMAIL PROTECTED]>
To: "Kenneth R. van Wyk" <[EMAIL PROTECTED]>
Cc: "Secure Coding Mailing List" 
Sent: Wednesday, April 06, 2005 9:40 AM
Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?


[...]




RE: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael S Hines
Wonder what happens if we apply that same logic to building design or
bridge design and construction?

Those who don't place blame at the source are just trying to shift the
blame.  Bad idea.

Mike Hines
---
Michael S Hines
[EMAIL PROTECTED] 

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of
Michael Silk
Sent: Wednesday, April 06, 2005 8:40 AM
To: Kenneth R. van Wyk
Cc: Secure Coding Mailing List
Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?

[...]





Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Dave Paris
And I couldn't disagree more with your perspective, except for your
inclusion of managers in parentheses.

Developers take direction and instruction from management; they are not
autonomous entities.  If management doesn't make security a priority,
then only so much secure/defensive code can be written before the
developer is admonished for being slow/late/etc.

While sloppy habits are one thing, it's entirely another to have
management breathing down your neck, threatening to ship your job
overseas, unless you get code out the door yesterday.

It's an environment that fosters insecure habits and the resulting
products.  I'm not talking about habits like using strncpy vs. strcpy;
I'm talking about validation of user input, ensuring a secure
architecture to begin with, and the like.  The latter takes far more
time to implement than is given in many environments.  The former
requires that sufficient specifications be given up front - otherwise
you have insufficient information to correctly use a function like
strncpy.

Kind Regards,
-dsp
Michael Silk wrote:
[...]



RE: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Goertzel Karen
I think it's a matter of SHARED responsibility. Yes, the programmers and
their managers are directly responsible. But it's consumers who create
demand, and consumers who, out of ignorance, continue to fail to make
the connection between bad software security and the viruses, privacy
problems, and other issues about which they are becoming increasingly
concerned.

The consumer can't be held responsible for his ignorance... at least not
yet, because practitioners of "safe software" have not done a very good
job of getting the message out in terms that consumers, vs. other
software practitioners and IT managers, can understand.

I propose that the following is the kind of message that might make a
consumer sit up and listen:

"We understand that you buy software to get your work or online
recreation done as easily as possible. But being able to get that work
done WITHOUT leaving yourself wide open to exploitation and compromise
of YOUR computer and YOUR personal information is also important, isn't
it? 

"A number of software products, including some of the most popular ones,
are full of bugs and other vulnerabilities that DO leave those programs
wide open to being exploited by hackers so they can get at YOUR personal
information, and take over YOUR computing resources. 

"Why is such software allowed to be sold at all? Because no one
regulates the SECURITY of the software products that these companies
put out, least of all the programmers who write that software. And, more
importantly, because you, the consumer, haven't been told before that
you can make a difference. You can vote with your feet. 

"Demand that the software you use not be full of holes and 'undocumented
features' that can be exploited by hackers. When you go out to buy a
lawn mower, you wouldn't buy a model that has a well-published track
record of its blades flying off. By the same token, you shouldn't buy a
software package that has a well-documented track record of being
successfully compromised by viruses, Trojan horses, and other hacker
tricks."

If we can start to raise consumer awareness in terms that consumers can
understand (avoiding the arcane terminology of software practitioners),
maybe we can start reducing demand for notoriously insecure software
products, and increasing demand for software that is developed with
security in mind.

--
Karen Goertzel, CISSP
Booz Allen Hamilton
703-902-6981
[EMAIL PROTECTED]  

> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Michael Silk
> Sent: Wednesday, April 06, 2005 9:40 AM
> To: Kenneth R. van Wyk
> Cc: Secure Coding Mailing List
> Subject: Re: [SC-L] Application Insecurity --- Who is at Fault?
> 
> [...]




Re: [SC-L] Application Insecurity --- Who is at Fault?

2005-04-06 Thread Michael Silk
Quoting from the article:
''You can't really blame the developers,''

I couldn't disagree more with that ...

It's completely the developers' fault (and the managers'). 'Security'
isn't something that should be thought of as an 'extra' or an 'added
bonus' in an application. Typically it's just about programming
_correctly_!

The article says it's a 'communal' problem (i.e., consumers should
_ask_ for secure software!). This isn't exactly true, and not really
fair. Insecure or secure software can exist without consumers. They
don't matter. It's all about the programmers. The problem is that they
are allowed to get away with their crappy programming habits - and that
is the fault of management, not consumers, for allowing 'security' to
be thought of as something separate from 'programming'.

Consumers can't be punished and blamed; they are just trying to get
something done - word processing, emailing, whatever. They don't need
to - nor should they, really - care about lower-level security in the
applications they buy. The programmers should just get it right, and
managers need to get a clue about what is acceptable 'programming' and
what isn't.

Just my opinion, anyway.

-- Michael


On Apr 6, 2005 5:15 AM, Kenneth R. van Wyk <[EMAIL PROTECTED]> wrote:
> Greetings++,
> 
> Another interesting article this morning, this time from eSecurityPlanet.
> (Full disclosure: I'm one of their columnists.)  The article, by Melissa
> Bleasdale and available at
> http://www.esecurityplanet.com/trends/article.php/3495431, is on the general
> state of application security in today's market.  Not a whole lot of new
> material there for SC-L readers, but it's still nice to see the software
> security message getting out to more and more people.
> 
> Cheers,
> 
> Ken van Wyk
> --
> KRvW Associates, LLC
> http://www.KRvW.com
>