Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-14 Thread Michael Silk
I don't think that analogy quite fits :) If the 'grunts' aren't doing
their job, then yes - let's blame them. Or at least help them find
ways to do it better.

-- Michael

[Ed. Let's consider this the end of the thread, please.  Unless someone
wants to say something that is directly relevant to software security,
I'm going to let it drop.  KRvW]

On 4/13/05, Dave Paris [EMAIL PROTECTED] wrote:
 So you blame the grunts in the trenches if you lose the war?  I mean,
 that thinking worked out so well with Vietnam and all...  ;-)

 regards,
 -dsp

  I couldn't agree more! This is my whole point. Security isn't 'one
  thing', but it seems the original article [that started this
  discussion] implied that so that the blame could be spread out.
 
  If you actually look at the actual problems you can easily blame the
  programmers :)




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-14 Thread Dave Paris
Michael Silk wrote:
I don't think that analogy quite fits :) If the 'grunts' aren't doing
their job, then yes - let's blame them. Or at least help them find
ways to do it better.
If they're not doing their job, no need to blame them - they're
critically injured, captured, or dead. ...or in the case of programmers
- fired.  If you insist on blaming them, you're redirecting blame and
that's BS.
As for finding ways to do it better .. they're well trained - if
they're not well trained, they're (again) critically injured, captured,
or dead.  But as happened in the most recent event in the big sandbox,
they're not well supplied in all cases.  Wow.  Sound familiar?  What?  A
programmer not given full specifications or the tools they need?  Yeah.
 That never happens in the Corporate World.
The analogy works.
Some comparisons:
You call in for close air support .. and friendlies drop munitions on
your position (your manager just told the VP "yeah, we can ship two
weeks early, no problems").
You call in for intel on your position and you're told the path to your
next objective is clear - only to get ambushed as you're halfway there
(the marketing guys sold the customer a bill of goods that can't
possibly be delivered in the time allotted - and your manager agreed to
it without asking the programmers).
You're recon and you light up a target with a laser designator and then
call in the bombers - only to find they can't drop the laser-guided
munitions because friendlies just blew up the nearby fuel depot and
now they can't get a lock on the designator because of the smoke (sorry,
you can't get the tools you need to do your job so make do with what
you've got - never mind that the right tool is readily available - i.e.
GPS-guided munitions in this example - it's just not supplied for this
project).
.. ok, enough with the examples, I hope I've made my point.
Mr. Silk, it's become quite clear to me from your opinions that you
appear to live/work in a very different environment (frankly, it sounds
somewhat like Nirvana) than the bulk of the programmers I know.
Grunts and programmers take orders from their respective chain of
command.  Not doing so will get a grunt injured, captured, or killed and
a programmer fired.  Grunts and programmers each come with a skillset
and a brain trained and/or geared to accomplishing the task at hand.
Experience lets them accomplish their respective jobs more effectively
and efficiently by building on that training - but neither can disregard
the chain of command without repercussions (sanctions, court martial,
injury, or death in the case of a grunt - and demotion or firing in the
case of a programmer).  If the grunt or programmer simply isn't good at
their job, and the chain of command doesn't move them to a more
appropriate position, they're either dead or fired.
Respectfully,
-dsp


Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-13 Thread Michael Silk
On 4/13/05, der Mouse [EMAIL PROTECTED] wrote:
  I would question you if you suggested to me that you always assume
  to _NOT_ include 'security' and only _DO_ include security if
  someone asks.
  Security is not a single thing that is included or omitted.
  Again, in my experience that is not true.  Programs that are labelled
  'Secure' vs something that isn't.

 *Labelling as* secure _is_ (or at least can be) something that is
 boolean, included or not.  The actual security behind it, if any, is
 what I was talking about.

  In this case, there is a single thing - Security - that has been
  included in one and not the other [in theory].

 Rather, I would say, there is a cluster of things that have been boxed
 up and labeled "security", and included or not.  What that box includes
 may not be the same between the two cases, even, never mind whether
 there are any security aspects that aren't in the box, or non-security
 aspects that are.

  Also, anyone requesting software from a development company may say:
  Oh, is it 'Secure'?  Again, the implication is that it is a single
  thing included or omitted.

 Yes, that is the implication.  It is wrong.

I couldn't agree more! This is my whole point. Security isn't 'one
thing', but it seems the original article [that started this
discussion] implied that so that the blame could be spread out.

If you actually look at the actual problems you can easily blame the
programmers :)

-- Michael




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-13 Thread Dave Paris
So you blame the grunts in the trenches if you lose the war?  I mean,
that thinking worked out so well with Vietnam and all...  ;-)
regards,
-dsp
I couldn't agree more! This is my whole point. Security isn't 'one
thing', but it seems the original article [that started this
discussion] implied that so that the blame could be spread out.
If you actually look at the actual problems you can easily blame the
programmers :)



Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-12 Thread der Mouse
 The programmer is neither the application architect nor the system
 engineer.
 In some cases he is.  Either way, it doesn't matter.  I'm not asking
 the programmer to re-design the application, I'm asking them to just
 program the design 'correctly' rather than 'with bugs'

Except that sometimes the bugs are in the design rather than the code.
Module A has a spec saying that checking a certain aspect of the input
arguments is the caller's responsibility; module B, calling module A,
is written to a spec that makes it A's responsibility to check those
values.

Neither programmer is at fault; each module was written correctly to
spec.  The real problem is that the specs are incompatible - whatever
part of the design and specification process allowed the two specs for
module A to get out of sync with one another is at fault.  (This
shouldn't happen, no, but anyone who thinks that it doesn't is
dreaming.)  Sometimes even the specs are identical, but are written
badly, leaving enough unsaid for such mismatches to occur - the art and
science of writing complete interface specs, that's another subject I
could rant at some length about ...
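
For what it's worth, a minimal, hypothetical C sketch of that kind of
spec mismatch (module names and functions invented here, not code from
any post in this thread):

  #include <stddef.h>
  #include <stdio.h>

  /* Module A's spec: "the caller guarantees 0 <= idx < len". */
  static int table_get(const int *table, size_t len, size_t idx)
  {
      (void)len;                    /* per A's spec, no check here */
      return table[idx];
  }

  /* Module B was written to a spec saying table_get() does the check. */
  static int lookup_code(size_t untrusted_idx)
  {
      static const int codes[4] = { 10, 20, 30, 40 };
      return table_get(codes, 4, untrusted_idx);   /* no check here either */
  }

  int main(void)
  {
      /* Each module is "correct to spec"; together they read past the
         end of the array for any index >= 4. */
      printf("%d\n", lookup_code(7));
      return 0;
  }

Neither function is wrong against its own spec; the defect lives in the
mismatch between the two specs.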

 I would question you if you suggested to me that you always assume to
 _NOT_ include 'security' and only _DO_ include security if someone
 asks.

Security is not a single thing that is included or omitted.

Another common source of security problems is that a module (call it A)
is implemented in a way that is secure against the threat model then in
effect (often this threat model is unspecified, and maybe even A's
coder was careful and went and asked and was told "no, we don't care
about that").  This module is then picked up and re-used (hey, re-use
is good, right?) in an environment where the threat model is
drastically different - instant problem.  Security was included, yet
security failed, and the fault does not lie with the original coder.
(It lies with whoever reused the module in an environment it was not
designed for.)
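
A similarly hypothetical sketch of that re-use failure (function name,
path, and threat models invented for the example): a helper written for
a tool that only ever read names from a trusted local config file,
later dropped behind a network front end.

  #include <stdio.h>

  /* Original threat model: "input comes from our own config file; worst
     case is a typo."  Under that model the lack of filtering was a
     deliberate, reviewed decision. */
  static void build_profile_path(char *out, size_t outlen, const char *name)
  {
      snprintf(out, outlen, "/var/app/profiles/%s.dat", name);
  }

  int main(void)
  {
      char path[256];

      build_profile_path(path, sizeof path, "alice");        /* intended use */
      printf("%s\n", path);

      /* Re-used with names taken straight from a network request, the
         same code hands an attacker a path-traversal hole. */
      build_profile_path(path, sizeof path, "../../../etc/passwd");
      printf("%s\n", path);
      return 0;
  }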

 It's also much more likely that the foreman (aka programming
 manager) told the builder (programmer) to take shortcuts to meet
 time and budget -
 Maybe, but the programmer should not allow 'security' to be one of
 these short-cuts.

The programmer quite likely doesn't have that choice.  Refusing to do
what your manager tells you is often grounds for summary firing, with
the work being reassigned to someone who will follow orders (and
probably will be even more overloaded).

It's also not always clear whether a given thing constitutes a security
risk or not.  A certain validation check that's omitted could lead to
nothing worse than, say, a one-cycle delay in recognizing a given
signal in the initial design, but reused in another way that nobody
knew even existed at first writing, it could cause a crash (and
associated DoS) or worse.

/~\ The ASCII        der Mouse
\ / Ribbon Campaign
 X  Against HTML   [EMAIL PROTECTED]
/ \ Email! 7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-12 Thread ljknews
At 4:21 PM -0400 4/11/05, Dave Paris wrote:
Joel Kamentz wrote:
 Re: bridges and stuff.

 I'm tempted to argue (though not with certainty) that it seems that
 the bridge analogy is flawed in another way -- that of the
 environment.  While many programming languages have similarities and
 many things apply to all programming, there are many things which do
 not translate (or at least not readily).  Isn't this like trying to
 engineer a bridge with a brand new substance, or when the
 gravitational constant changes?  And even the physical disciplines
 collide with the unexpected -- corrosion, resonance, metal fatigue,
 etc.  To their credit, they appear far better at dispersing and
 applying the knowledge from past failures than the software world.

Corrosion, resonance, metal fatigue all have counterparts in the
software world.  glibc flaws, kernel flaws, compiler flaws.  Each of
these is an outside influence on the application - just as environmental
stressors are on a physical structure.

Corrosion and metal fatigue actually get worse as time goes on.
Software flaws correspond more to resonance, where there is a
defect in design or implementation.
-- 
Larry Kilgallen




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-12 Thread der Mouse
 I would question you if you suggested to me that you always assume
 to _NOT_ include 'security' and only _DO_ include security if
 someone asks.
 Security is not a single thing that is included or omitted.
 Again, in my experience that is not true.  Programs that are labelled
 'Secure' vs something that isn't.

*Labelling as* secure _is_ (or at least can be) something that is
boolean, included or not.  The actual security behind it, if any, is
what I was talking about.

 In this case, there is a single thing - Security - that has been
 included in one and not the other [in theory].

Rather, I would say, there is a cluster of things that have been boxed
up and labeled "security", and included or not.  What that box includes
may not be the same between the two cases, even, never mind whether
there are any security aspects that aren't in the box, or non-security
aspects that are.

 Also, anyone requesting software from a development company may say:
 Oh, is it 'Secure'?  Again, the implication is that it is a single
 thing included or omitted.

Yes, that is the implication.  It is wrong.

The correct response to "is it secure?" is "against what threat?", not
"yes" or "no".  I would argue that anyone who thinks otherwise should
not be coding or specifying for anything that has a significant cost
for a security failure.  (Which is not to say that they aren't!)

/~\ The ASCII        der Mouse
\ / Ribbon Campaign
 X  Against HTML   [EMAIL PROTECTED]
/ \ Email! 7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Dave Paris
Michael Silk wrote:
Ed,
[...]
 Back to the bridge or house example, would you allow the builder to
leave off 'security' of the structure? Allow them to introduce some
design flaws to get it done earlier? Hopefully not ... so why is it
allowed for programming? Why can people cut out 'security' ? It's not
extra! It's fundamental to 'programming' (imho anyway).
-- Michael
This paragraph contains the core dichotomy of this discussion.
The builder and the programmer are synonymous.
The builder is neither the architect, nor the engineer for the 
structure.  If the architect and engineer included security for the 
structure and the builder failed to build to specification, then the 
builder is at fault.

The programmer is neither the application architect nor the system 
engineer.  If the architect and engineer fail to include (or include
faulty) security features (as though it were an add-on, right) then
the programmer is simply coding to the supplied specifications.  If 
security is designed into the system and the programmer fails to code to 
the specification, then the programmer is at fault.

While there are cases that the programmer is indeed at fault (as can 
builders be), it is _far_ more often the case that the security flaw (or 
lack of security) was designed into the system by the architect and/or 
engineer.  It's also much more likely that the foreman (aka 
programming manager) told the builder (programmer) to take shortcuts to 
meet time and budget - rather than the programmer taking it upon 
themselves to be sloppy and not follow the specifications.

In an earlier message, it was postulated that programmers are, by and 
large, a lazy, sloppy lot who will take shortcuts at every possible turn 
and therefore are the core problem vis-a-vis lousy software.  It's been 
my experience that while these people exist, they wash out fairly
quickly and most programmers take pride in their work and are highly 
frustrated with management cutting their legs out from under them, 
nearly _forcing_ them to appear to fit into the described mold.  Ever 
read Dilbert?  Why do you think so many programmers can relate?

I think the easiest summary to my position would be don't shoot the 
messenger - and that's all the programmer is in the bulk of the cases.

Respectfully,
-dsp



RE: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Chris Matthews
Dave Paris wrote:

It's also much more likely that the foreman (aka
programming manager) told the builder (programmer) to take shortcuts to

meet time and budget - rather than the programmer taking it upon
themselves to be sloppy and not follow the specifications.

I'd note that there is the question: if the programmer were given an
undefined time period in which to deliver said software, would they be
able to deliver code that is free of 'mechanical' bugs (buffer
overflows, pointer math bugs, etc.)?

Additionally, as an industry, we will only really have the answer to the
above question when the programming managers allocate a programmer the
time to truly implement specifications in a mechanically secure way.

But I agree with the premise that a programmer cannot be held
accountable for (design) decisions that were out of his control.  He can
only be accountable for producing mechanically correct behaviour.

-Chris

(Note that references to mechanical bugs are ones that really are
within the programmer's realm to avoid, and include language-specific
and language-agnostic programming techniques.)
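
For concreteness, a small hypothetical illustration of such a
'mechanical' bug and its mechanically correct counterpart (C, invented
for this note, not code under discussion in the thread):

  #include <stdio.h>
  #include <string.h>

  /* Mechanical bug: unbounded copy into a fixed buffer -- wrong no
     matter what the specification said. */
  void greet_unsafe(const char *user)
  {
      char buf[16];
      strcpy(buf, user);               /* overflows for long names */
      printf("hello, %s\n", buf);
  }

  /* Mechanically correct: the copy is bounded and always terminated. */
  void greet_safe(const char *user)
  {
      char buf[16];
      snprintf(buf, sizeof buf, "%s", user);   /* truncates instead */
      printf("hello, %s\n", buf);
  }

  int main(void)
  {
      greet_safe("a name far too long to fit in sixteen bytes");
      return 0;
  }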




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Michael Silk
Dave,

On Apr 11, 2005 9:58 PM, Dave Paris [EMAIL PROTECTED] wrote:
 The programmer is neither the application architect nor the system
 engineer.

In some cases he is. Either way, it doesn't matter. I'm not asking the
programmer to re-design the application, I'm asking them to just
program the design 'correctly' rather than 'with bugs' (or - security
problems). Sometimes they leave 'bugs' because they don't know any
better, so sure, train them. [oops, I'm moving off the point again].
All I mean is that they don't need to be the architect or engineer to
have their decisions impact the security of the work.


 If
 security is designed into the system and the programmer fails to code to
 the specification, then the programmer is at fault.

Security can be designed into the system in many ways: maybe the manager
was vague in describing it, etc, etc. I would question you if you
suggested to me that you always assume to _NOT_ include 'security' and
only _DO_ include security if someone asks. For me, it's the other way
round - when receiving a design or whatever.


 While there are cases that the programmer is indeed at fault (as can
 builders be), it is _far_ more often the case that the security flaw (or
 lack of security) was designed into the system by the architect and/or
 engineer.

So your opinion is that most security flaws are from bad design?
That's not my experience at all...

What are you classifying under that?


 It's also much more likely that the foreman (aka
 programming manager) told the builder (programmer) to take shortcuts to
 meet time and budget -

Maybe, but the programmer should not allow 'security' to be one of
these short-cuts. It's just as crucial to the finished application as
implementing that method to calculate the Net Proceeds or something.
The manager wouldn't allow you to not do that; why allow them to
remove so-called 'Security' (in reality - just common sense: validating
inputs, etc.)?
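
To put that point in code: a hypothetical net-proceeds routine (the
name and parameters are invented here) in which validating the inputs
is simply part of implementing the calculation, not an optional
'security' add-on.

  #include <stdio.h>

  /* Returns 0 on success, -1 if the inputs make no sense. */
  int net_proceeds(double gross, double fee_rate, double *out)
  {
      /* Validation is part of the feature: without it the routine
         happily computes garbage (or worse) from bad input. */
      if (!(gross >= 0.0) || !(fee_rate >= 0.0 && fee_rate <= 1.0))
          return -1;
      *out = gross * (1.0 - fee_rate);
      return 0;
  }

  int main(void)
  {
      double np;
      if (net_proceeds(1000.0, 0.05, &np) == 0)
          printf("net: %.2f\n", np);
      if (net_proceeds(1000.0, 17.0, &np) != 0)
          printf("rejected nonsense fee rate\n");
      return 0;
  }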

-- Michael




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Carl G. Alphonce
on Monday April 11, 2005, Damir Rajnovic wrote:
  On Mon, Apr 11, 2005 at 12:21:30PM +1000, Michael Silk wrote:
Back to the bridge or house example, would you allow the builder to
   leave off 'security' of the structure? Allow them to introduce some
   design flaws to get it done earlier? Hopefully not ... so why is it
   allowed for programming? Why can people cut out 'security' ? It's not
   extra! It's fundamental to 'programming' (imho anyway).
 
  Even builders and architects do experiment and introduce new things.
  Not all of these are outright successes. We have a wobbly bridge in the UK
  and there is (was) a new terminal at Charles de Gaulle airport in Paris.
 
  Every profession makes mistakes. Some are more obvious and some not. I am
  almost certain that architects can tell you many more stories where
  things were not done as securely as they should have been.
 
  Comparisons can be misleading.

Indeed.  I am fairly certain that there are numerous examples of
buildings which were properly designed yet were built differently.  I
can't believe that builders never use different materials than are
called for in the plans, and that they never make on-site adjustments
to the plans to accommodate last-minute customer requests ("we really
want a double sink in the master bath"), etc.


   ()  ascii ribbon campaign - against html e-mail
   /\

Carl Alphonce[EMAIL PROTECTED]
Dept of Computer Science and Engineering (716) 645-3180 x115 (tel)
University at Buffalo(716) 645-3464  (fax)
Buffalo, NY 14260-2000   www.cse.buffalo.edu/~alphonce




Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Dave Aronson
Dave Paris [EMAIL PROTECTED] wrote:

  The builder and the programmer are synonymous.
 
  The builder is neither the architect, nor the engineer for the
  structure.  If the architect and engineer included security for the
  structure and the builder failed to build to specification, then the
  builder is at fault.
 
  The programmer is neither the application architect nor the system
  engineer.

This is often not true, even on some things that stretch a single
programmer's productivity to the limits (which makes it even worse).

Programmers work within the specs they are given.  That can (NOT SHOULD!)
be anything from "use this language on this platform to implement this
algorithm in this style" to "we need something that will help us
accomplish this goal."  The latter cries out for a requirements analyst
to delve into it MUCH further, before an architect, let alone a
programmer, is allowed anywhere NEAR it!  However, sometimes that's all
you get, from a customer who is then NOT reasonably easily available to
refine his needs any further, relayed via a manager who is clueless
enough not to realize that refinement is needed, to a programmer who is
afraid to say so lest he get sacked for insubordination, and will also
have to architect it.

If this has not happened at your company, you work for a company with far
more clue about software development than, I would guess, easily 90% of
the companies that do it.

-Dave



Re: [SC-L] Re: Application Insecurity --- Who is at Fault?

2005-04-11 Thread Dave Paris
Joel Kamentz wrote:
Re: bridges and stuff.
I'm tempted to argue (though not with certainty) that it seems that the
bridge analogy is flawed in another way -- that of the environment.
While many programming languages have similarities and many things
apply to all programming, there are many things which do not translate
(or at least not readily).  Isn't this like trying to engineer a bridge
with a brand new substance, or when the gravitational constant changes?
And even the physical disciplines collide with the unexpected --
corrosion, resonance, metal fatigue, etc.  To their credit, they appear
far better at dispersing and applying the knowledge from past failures
than the software world.
Corrosion, resonance, metal fatigue all have counterparts in the
software world.  glibc flaws, kernel flaws, compiler flaws.  Each of
these is an outside influence on the application - just as environmental
stressors are on a physical structure.
Knowledge of engineering problems disperses faster because of the
lawsuits that happen when a bridge fails.  I'm still waiting for a
certain firm located in
Redmond to be hauled into court - and until that happens, nobody is
going to make security an absolute top priority.
Let's use an example someone else already brought up -- cross site
scripting.  How many people feel that, before it was ever known or had
ever occurred the first time, good programming practices should have
prevented any such vulnerability from ever happening?  I actually think
that would have been possible for the extremely skilled and extremely
paranoid.  However, we're asking people to protect against the unknown.
Hardly unknowns.  Not every possibility has been enumerated, but then
again, not every physical phenomenon has been experienced w/r/t
construction either.
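
As a hypothetical illustration of the kind of practice at issue (the
function name is invented here, and the encoder is deliberately
incomplete): treating user input as data and escaping it before echoing
it into HTML blocks the classic reflected cross-site scripting case,
whether or not the author had ever heard the term.

  #include <stdio.h>

  /* Escape the handful of characters that let data turn into markup. */
  static void html_escape(const char *in, FILE *out)
  {
      for (; *in; in++) {
          switch (*in) {
          case '<':  fputs("&lt;", out);   break;
          case '>':  fputs("&gt;", out);   break;
          case '&':  fputs("&amp;", out);  break;
          case '"':  fputs("&quot;", out); break;
          default:   fputc(*in, out);      break;
          }
      }
  }

  int main(void)
  {
      const char *user_input = "<script>alert('xss')</script>";
      printf("<p>You searched for: ");
      html_escape(user_input, stdout);   /* rendered as text, not run */
      printf("</p>\n");
      return 0;
  }
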
I don't have experience with the formal methods, but I can see that,
supposing this were NASA, etc., formal approaches might lead to perfect
protection.  However, all of that paranoia, formality or whatever takes
a lot of time, effort and therefore huge economic impact.  I guess my
personal opinion is that unit testing, etc. are great shortcuts
(compared to perfect) which help reduce flaws, but with lesser expense.
Unit testing is fine, but it tests inside the box and doesn't view your
system through the eyes of an attacker.
All of this places me in the camp that thinks there isn't enough yet to
standardize.  Perhaps a new programming environment (language, VM,
automation of various sorts, direct neural interfaces) is required
before the art of software is able to match the reliability and
predictability of other fields?
You're tossing tools at the problem.  The problem is inherently human
and economically driven.  A hammer doesn't cause a building to be
constructed poorly.
Is software more subject to unintended consequences than physical engineering?
Not more subject, just subject differently.
Respectfully,
-dsp