[SC-L] MetriSec 2012 submission date is May 30th

2012-05-14 Thread James Walden
MetriSec 2012
8th International Workshop on
SECURITY MEASUREMENTS AND METRICS

Affiliated with the International Symposium on
Empirical Software Engineering and Measurement (ESEM)

September 21, 2012
Lund, Sweden

WORKSHOP OVERVIEW

Quantitative assessment is a major stumbling block for software and
system security. Although some security metrics exist, they are rarely
adequate. The engineering importance of metrics is intuitive: you
cannot consistently improve what you cannot measure. Economics is an
additional driver for security metrics: customers are unlikely to pay
a premium for security if they are unable to quantify what they
receive.

The goal of the workshop is to foster research into security
measurements and metrics and to continue building the community of
individuals interested in this field. This year, MetriSec continues
its co-location with ESEM, which offers an opportunity for the
security metrics folks to meet the metrics community at large.

The organizers solicit original submissions from industry and academic
experts on the development and application of repeatable, meaningful
measurements in the fields of software and system security. The topics
of interest include, but are not limited to:

Security metrics
Security measurement and monitoring
Development of predictive models
Experimental validation of models
Formal theories of security metrics
Security quality assurance
Empirical assessment of security architectures and solutions
Mining data from attack and vulnerability repositories: e.g. CVE, CVSS
Software security metrics
Static analysis metrics
Simulation and statistical analysis
Security risk analysis
Industrial experience

IMPORTANT DATES

Submission of papers: May 30
Notification to authors: June 24
Submission of camera-ready: July 1

PUBLICATION

Authors of accepted papers must present their work at the workshop.
The proceedings of the workshop will be electronically published by
the IEEE.

PAPER SUBMISSION

Submissions are sought in any of the following three categories:

(a) Research papers describing original results, both theoretical and
experimental, are solicited in any of the above mentioned topics.
Theoretical papers should clearly state the contribution and include
some initial validation. Experimental papers are particularly welcome.
In this case, authors are required to explicitly state their
hypothesis, to detail the methodology used, and to describe the
experiment set-up.

(b) Preliminary research results or new ideas can be submitted in the
form of short papers.

(c) Industry experience reports are also welcome. Industry papers
should have at least one author from industry or government, and will
be considered for their industrial relevance.

The page limit for the final proceedings version is 10 pages in
double-column format; short papers are limited to 4 pages. Authors
should use the IEEE Conference Proceedings Template when preparing
their submission. Only PDF files are accepted.

WORKSHOP CO-CHAIRS

James Walden - Northern Kentucky University (US)
Stephan Neuhaus - ETH Zurich (CH)

STEERING COMMITTEE

Dieter Gollmann, TU Harburg (DE)
Sushil Jajodia, GMU (US)
Guenter Karjoth, IBM (CH)
Fabio Massacci, Uni. Trento (IT)
John McHugh, Dalhousie Uni. (CA)
Riccardo Scandariato, KU Leuven (BE)
Ketil Stolen, SINTEF (NO)
Laurie Williams, NCSU (US)

PROGRAM COMMITTEE

Andrea Capiluppi, University of East London (UK)
Robert Cunningham, MIT (US)
Michael Gegick (US)
Dieter Gollmann, TU Harburg (DE)
Maureen Doyle, NKU (US)
Christophe Huygens, KU Leuven (BE)
Sushil Jajodia, GMU (US)
Erland Jonsson, Chalmers (SE)
Howard Lipson, CERT (US)
Fabio Massacci, Uni. Trento (IT)
Miles McQueen, Idaho National Laboratory (US)
Andy Meneely, NCSU (US)
Riccardo Scandariato, KU Leuven (BE)
Karen Scarfone, NIST (US)
Yonghee Shin, DePaul University (US)
Ketil Stolen, SINTEF (NO)
Jeff Stuckman, UMD (US)
Laurie Williams, NCSU (US)
Roland Yap, National University of Singapore (SG)
Nicola Zannone, Eindhoven University of Technology (NL)
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___


[SC-L] MetriSec 2012 CFP International Workshop on Security Measurements and Metrics

2012-02-08 Thread James Walden


Re: [SC-L] informIT: Building versus Breaking

2011-09-05 Thread James Walden
There are also a couple of other relevant academic security conferences:

MetriSec - http://metrisec2011.cs.nku.edu/ (September 21st in Banff, Canada)
SESS - http://homes.dico.unimi.it/~monga/sess11.html (May)

On Thu, Sep 1, 2011 at 12:41 PM, Goertzel, Karen [USA] 
goertzel_ka...@bah.com wrote:

 There are these:

 ISC(2) Secure Software Conference Series -
 https://www.isc2.org/PressReleaseDetails.aspx?id=650

 ESSoS - http://distrinet.cs.kuleuven.be/events/essos/2012/

 SecSE - http://www.sintef.org/secse

 SSIRI - http://paris.utdallas.edu/ssiri11/


 But your point is taken. Most of the conferences in this domain appear to
 be outside the U.S. I'm not sure what THAT says about U.S. attitudes about
 software assurance (though I have my suspicions).

 More important is the question of who actually attends these conferences.
 I'm in the process of updating some research on how and where software
 security assurance is being taught by colleges and universities, and what
 I'm finding is that the topic has been pretty much marginalised into an
 aspect of information assurance - i.e., it's being taught mostly to
 postgraduates who are majoring in IA and related disciplines - rather than
 an aspect of software development. There are exceptions, of course - but by
 and large that seems to be the trend. And I think the same is true of the
 conferences. It's the security wonks who care about software assurance much
 more than the actual software developers. Take a look at:
 http://zastita.com/index.php?det=64494

 ===
 Karen Mercedes Goertzel, CISSP
 Booz Allen Hamilton
 703.698.7454
 goertzel_ka...@bah.com

 Sorry, you have reached an imaginary number.
 If you require a real number, please rotate
 your phone by ninety degrees and try again.
 
 From: sc-l-boun...@securecoding.org [sc-l-boun...@securecoding.org] on
 behalf of Steven M. Christey [co...@linus.mitre.org]
 Sent: 31 August 2011 16:45
 To: Sergio 'shadown' Alvarez
 Cc: Adam Shostack; Secure Code Mailing List
 Subject: Re: [SC-L] informIT: Building versus Breaking

 While I'd like to see Black Hat add some more defensive-minded tracks, I
 just realized that this desire might be a symptom of a larger problem: there
 aren't really any large-scale conferences dedicated to defense / software
 assurance.  (The OWASP conferences are heavily web-focused; Dept. of
 Homeland Security has its software assurance forum and working groups, but
 those are relatively small.)

 If somebody built it, would anybody come?

 - Steve



Re: [SC-L] has any one completed a python security code review`

2010-04-06 Thread James Walden
On Mon, Apr 5, 2010 at 12:08 PM, Matt Parsons mparsons1...@gmail.com
wrote:
 Has anyone completed a python security code review?  What would
 you look for besides inputs, outputs and dangerous functions?
 Do any of the commercial static code analysis vendors scan that
 code?  I would think not because python is not compiled at run
 time like the other languages that static analysis tools can
 scan.  Any help would be greatly appreciated.

Static analysis tools can and do scan dynamic languages like
Python, PHP, and JavaScript.  Fortify 360 v2.5 can scan Python.
There are also free tools for Python, like pylint, pychecker, and
pyflakes, but none of them is primarily focused on security.
OWASP's Python ESAPI is a good starting point to learn about
potential security flaws in Python.
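As a concrete illustration of the kind of flaw such a review looks for (my own sketch, not from any of the tools above; the name `parse_config_value` is hypothetical), consider untrusted input reaching `eval`.  Python's `ast.literal_eval` accepts only plain literals, so an injection attempt is rejected instead of executed:

```python
import ast

def parse_config_value(text):
    """Parse a user-supplied literal safely.

    eval(text) would execute arbitrary code; ast.literal_eval only
    accepts Python literals (numbers, strings, lists, dicts, ...).
    """
    try:
        return ast.literal_eval(text)
    except (ValueError, SyntaxError):
        raise ValueError("not a plain literal: %r" % text)

# A benign literal parses normally:
print(parse_config_value("[1, 2, 3]"))      # [1, 2, 3]

# An injection attempt raises instead of running os.system:
try:
    parse_config_value("__import__('os').system('id')")
except ValueError as e:
    print("rejected:", e)
```

A reviewer (or a grep-based tool) would flag every `eval`, `exec`, and `os.system` call and ask whether its argument can contain untrusted data.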

James Walden


Re: [SC-L] Compilers

2006-12-22 Thread James Walden

On 12/21/06, Gary McGraw [EMAIL PROTECTED] wrote:


I have a better idea.  Stop using C++.  Jeeze.



I'll second that recommendation.  Given the abundance of better languages,
there are few good reasons to use dangerous languages like C++ on new
projects.  It's easier and less time-consuming to learn a new safe language
than to use C++ securely.

James Walden


Re: [SC-L] Compilers

2006-12-22 Thread James Walden

On 12/21/06, Stephen de Vries [EMAIL PROTECTED] wrote:


You can achieve very similar goals by using unit tests.  Although the
tests are not integrated into the code as tightly as something like
Spark (or enforcing rules in the compiler), they are considered part
of the source.   IMO unit and integration testing are vastly
underutilised for performing security tests which is a shame because
all the infrastructure, tools and skills are there - developers (and
security testers) just need to start implementing security tests in
addition to the functional tests.



I agree that it's important to test the security of your software and I like
test-driven development, but unit tests are not a replacement for static
analysis assisted code reviews.  Likewise, static analysis and code reviews
aren't a substitute for security testing.

Security tests attempt to find bad input and verify that the program handles
it correctly, but you can't guarantee that you've found every possible type
of bad input.  Unit tests have the additional problem that input which may
be safe for the current unit may become dangerous when interpreted
differently in a different unit of the program (e.g., ' OR 1=1--' is just
text to your web application, but your database may interpret it as code).
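To make that concrete, here is a minimal sqlite3 sketch (my own illustration, not from the original post) showing the same payload treated as code when spliced into the SQL text, and as plain data when passed as a query parameter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("alice",), ("bob",)])

payload = "' OR 1=1--"

# Vulnerable: the payload is spliced into the SQL text, so the database
# parses it as code; 1=1 matches every row and -- comments out the rest.
vulnerable_sql = "SELECT name FROM users WHERE name = '" + payload + "'"
print(conn.execute(vulnerable_sql).fetchall())   # both rows leak

# Safe: a parameterized query keeps the payload as data, and since no
# user is literally named "' OR 1=1--", nothing matches.
safe_rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (payload,)
).fetchall()
print(safe_rows)   # []
```

The unit testing the query layer in isolation could easily pass both versions; only the database ever sees the string as code.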

Code reviews find different bugs than tests do, and they typically find them
faster, so good testing practices are not an excuse to ignore static
analysis and code reviews.  Tests also find different bugs than code reviews
do.  If your static analysis tool doesn't have a rule to detect a particular
class of security bug, it obviously won't find it, but your testers might
have the experience to test for it.

James Walden


Re: [SC-L] Fwd: re-writing college books - erm.. ahm...

2006-11-07 Thread James Walden
On 11/7/06, Gadi Evron [EMAIL PROTECTED] wrote:
Well, I never received any replies here on what's already being done... so now, I am asking for ideas on how we can approach schools. What's needed in order for basic CS classes to have a security orientation?
Most CS professors have little awareness of security in general or secure programming techniques in particular, so I think awareness is the place we need to start. I've been giving workshops in secure programming and software security targeted at CS educators since 2005 and will be giving workshops in both areas in March at the largest annual gathering of CS educators, the ACM SIGCSE Conference (http://www.cs.potsdam.edu/sigcse07/index.html).

Software security awareness is growing these days. I've seen software security and/or secure programming classes appear at a couple dozen security-focused CS departments in the last couple of years, including my own. I teach relevant software security topics in my classes, and I know professors at a few universities who are working on a variety of approaches to introducing secure programming into CS1 and CS2.

I'm currently surveying a variety of introductory CS textbooks in C, C++, and Java to look for security errors in their examples. If you know of any such errors, I'd appreciate getting an e-mail from you with the information about the error. I plan to use the data as part of a paper on teaching secure programming in early CS classes and will acknowledge any contributions in the paper.
James Walden
Assistant Professor, NKU
http://www.nku.edu/~waldenj1/


Re: [SC-L] re-writing college books [was: Re: A banner year for software bugs | Tech News on ZDNet]

2006-10-15 Thread James Walden
On 10/12/06, Craig E. Ward [EMAIL PROTECTED] wrote:
I don't think saying use safer languages is a good way to say it. It would help conditions significantly if greater care were taken to match the choice of programming language to the problem to be solved or application to be created. If a language like C is most appropriate, then use it, just be sure to take the extra steps needed to develop it securely.

I agree that the programming language should be chosen to match the problem, though it's worth pointing out that security is typically part of the problem to be solved. There are safer systems programming languages than C, such as D and Cyclone. If you've considered the alternatives and you really have to use C because it's the only thing that will do, then yes, use it and be sure to use it securely and verify that fact with static analysis tools and code reviews.
James


Re: [SC-L] ComputerWorld interview with Theo de Raadt on Software Security

2004-09-10 Thread James Walden
Kenneth R. van Wyk wrote:
FYI, ComputerWorld is running an interesting interview with Theo de Raadt, on 
the state of software security, and OpenBSD in particular.  See 
http://www.computerworld.com.au/index.php/id;1498222899;fp;16;fpid;0 for the 
complete text.
You can find his presentation on exploit mitigation techniques that was 
mentioned in the article at http://cvs.openbsd.org/papers/auug04/index.html

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/


Re: [SC-L] Programming languages used for security

2004-07-10 Thread James Walden
Wall, Kevin wrote:
My very reason for posing these questions is to see if there is any
type of consensus at all on what mechanisms / features a language
should and should not support WITH RESPECT TO SECURE PROGRAMMING.
For example, you mentioned garbage collection. To that I would add
things like strong static typing, encapsulation that can not be
violated, very restrictive automatic type conversion (if allowed
at all), closed packages or libraries or some other programming
unit, elegant syntax and semantics (oops, said I wouldn't go
there ;-), etc.
In the past few days (actually, all through my career), I've heard a
lot of gripes about what people think is wrong regarding languages,
but little in terms of what they think is valuable.
 

Off the top of my head, I'd like some of the features you mentioned, like
   Garbage collection
   Static typing (with no auto conversions, but with type inferencing)
   Secure encapsulation
I'd also add a rich set of data types, including:
   Numeric types with restrictions as Larry Kilgallen mentioned earlier 
and unlimited precision types
   Strings
   Lists
   Arrays (bounds-checked)
   Associative arrays (aka hashes)
   Unions (as in ocaml, not C, which will also provide enumerated and 
boolean types)
   Functions (first-class functions)
   XML (like Xen)

I also want a taint checking feature like Perl's, as a general purpose 
language has to communicate with external programs which don't share its 
data types, like web servers sending CGI parameter strings or databases 
receiving SQL query strings.
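As a rough sketch of the idea (my own illustration; Perl's actual taint mode is built into the interpreter and is far more thorough), taint tracking can be approximated in a library: external data arrives wrapped in a tainted type, sinks refuse tainted values, and only an explicit sanitizer produces trusted strings.

```python
class Tainted(str):
    """Marks a string as untrusted until it is explicitly sanitized."""

class TaintError(TypeError):
    pass

def from_request(raw):
    # Everything arriving from the outside world starts out tainted.
    return Tainted(raw)

def sanitize(value, allowed="abcdefghijklmnopqrstuvwxyz0123456789_"):
    # Whitelist filter (letters of either case, digits, underscore);
    # returns an ordinary, trusted str.
    return str("".join(ch for ch in value if ch.lower() in allowed))

def run_query(fragment):
    # A sink that refuses tainted data, in the spirit of Perl's taint mode.
    if isinstance(fragment, Tainted):
        raise TaintError("tainted data reached a query sink")
    return "SELECT * FROM t WHERE k = '%s'" % fragment

user_input = from_request("abc'; DROP TABLE t--")
try:
    run_query(user_input)                    # raises TaintError
except TaintError as e:
    print(e)
print(run_query(sanitize(user_input)))       # only whitelisted chars survive
```

A language-level mechanism would also propagate taint through concatenation and formatting, which this sketch does not attempt.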

As for syntax, I want to be able to use functional, imperative, or 
object-oriented techniques as best fit my problem domain.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/
[EMAIL PROTECTED]



Re: [SC-L] Education and security -- another perspective (was ACM Queue - Content)

2004-07-08 Thread James Walden
ljknews wrote:
What is wrong with this picture ?
I see both of you willing to mandate the teaching of C and yet not
mandate the teaching of any of Ada, Pascal, PL/I etc.

This seems like the teaching of making do.
You read more into my post than I wrote, as I did not mandate that the students 
must learn C/C++.  They already know C/C++ by the time they take my course, but 
few have any exposure to the relevant security issues.  It's important that a 
security class cover security issues with the languages that the students have 
already used in their curriculum, unless that's already covered elsewhere.  How 
many people will change their programming language if they don't see what's 
wrong with the one they're currently using?

In summary, I teach the students the security issues (the powers and failures 
of C as Dana put it), not the language itself.  I do offer an overview of the 
features of more secure languages that students haven't used, but I don't have 
time to teach a new language in my security class, which isn't a pure software 
security class.

As for teaching students languages, we traditionally taught software 
engineering in Ada at my university, though we've moved to mostly Java or 
Python since the term project was required to be a web-based system. 
Introductory classes are taught using Java, in part because the AP test is 
Java-based, while computer architecture and assembly is taught using assembly, 
and operating systems is taught using C/C++.  Electives introduce other 
languages, of course.  I like ocaml myself, but its use is restricted to 
certain electives.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/



Re: [SC-L] Education and security -- another perspective (was ACM Queue - Content)

2004-07-07 Thread James Walden
Dana Epp wrote:
I'd be interested to hear what people think of the two approaches 
(separate security courses vs. spreading security all over the curricula).

 Regards.

 Fernando.
I don't think it's an either/or question; we need both approaches.  Students 
should study security wherever it's relevant in the curriculum, but they also 
need a security class towards the end of the degree program to integrate what 
they've learned with a deeper and more theoretical look at security at the 
level of Matt Bishop's Computer Security: Art and Science.

Well, I have been asked to teach a new fourth-year course at the British 
Columbia Institute of Technology (BCIT) this fall on Secure Programming 
(COMP4476).
It looks like a good class, Dana.  My only suggestion would be to present 
secure design principles in unit 2, instead of waiting to bring them up until 
unit 7 (I'm presuming you'll bring up more than least privilege there.)  I 
think you're right to wait until later in the term to bring up buffer 
overflows; it's a more difficult problem for students than I expected it to be 
the first time I gave such an assignment.

I only wish I could make all these books be textbook requirements for 
the curriculum. It should be mandatory reading.
I think they're all good books, but covering fewer topics and books in greater 
depth increases learning, so don't succumb to that temptation.  You can't teach 
them everything in one term, but you can point the way to continue learning 
with those books for students who want to know more about security than one class 
can teach them.

Of course, I also think students should have to take at least one course 
in ASM to really understand how computer instructions work, so they can 
gain a foundation of learning for the heart of computer processing. And
I think they should be taught the powers and failures of C. Since I know 
many of you think I'm nuts for that, you might want to look at this 
outline with the same amount of consideration.
I agree with you on both of those requirements.  You need to have a basic 
understanding of assembly and how C is translated into assembly to understand 
the most common types of buffer overflow attacks.  There are better languages 
for secure programming than C, but students are almost certainly going to have 
to read or write C at some point in their careers, so they need to understand it.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/



Re: [SC-L] Education and security -- another perspective (was ACM Queue - Content)

2004-07-07 Thread James Walden
Crispin Cowan wrote:
Another perspective (overheard at a conference 12 years ago):
   * Scientists build stuff in order to learn stuff.
   * Engineers learn stuff in order to build stuff.
I think that's about as accurate a summary of the distinction as you can make 
in 16 words.  What makes it even more fuzzy in our case is that computer 
science does not fit in any one category, as it's a conglomerate of maths, 
science, and engineering.  That may be why some universities are moving from 
departments of computer or information science to schools of computer or 
information science.

... however, the programming skills that universities teach are usually a 
side-effect of something else they are teaching: topics like algorithms, 
graphics, database, operating systems, networking, etc. They teach you 
the topic, give you a development project in that topic, and expect you 
to pick up the programming skills along the way.

What is broken about all this is security: the above approach teaches 
the kiddies to implement software anyway they can, under a lot of time 
pressure, and with very little QA pressure: graders have no time to 
rigorously test assignment hand-ins, and certainly not time to pen-test 
them.
I agree that programming being taught as an afterthought is one of the major 
sources of the problem with security, and it's related to computer science 
being a conglomerate of disciplines.  CS today feels like we're studying 
natural philosophy in the days before biology, chemistry, and physics became 
their own disciplines.  It worked when computer science was a younger field, 
but there's so much to study today that we can't fit all of it into a four year 
curriculum.

There are only a few solutions to adding security to the curriculum in this 
sutation: 1) remove other material to add security in its place, 2) expand the 
number of required classes and thus time for a degree, or 3) specialize CS into 
multiple disciplines, at least one of which has room for security in its 
curriculum.  I think the third choice is the likely and best long term 
solution, and the first is the most workable short term solution.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/



Re: [SC-L] SPI, Ounce Labs Target Poorly Written Code

2004-06-30 Thread James Walden
Blue Boar wrote:
To clarify, I'm talking about things like passing unfiltered user input 
to a system shell, or a native API, something like that.
True.  In the case of passing a user input string to the shell or a database 
server, you're accepting what's potentially a program as input.  However, if your 
language's type system considers that program to be a string, there's no way 
your compiler can perform relevant security checks.

I've read papers on the topic of adding new data types like relational database 
tables or XML documents to existing languages (as Xen does for C#), expanding 
the type system to deal with such data directly instead of reducing it to a 
string that the compiler can't automatically type check.  However, there are 
always going to be new programs to pass data to, and strings will always be a 
convenient choice of packaging new unknown data types, so I don't see this 
problem going away in the future, though particular attack instances like SQL 
injection may disappear.
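A minimal Python sketch of that point (my illustration, not from the original exchange): when argv is passed as a list, the shell never parses the string, so a "program in a string" stays a single literal argument.

```python
import subprocess

payload = "hello; rm -rf /tmp/x"   # looks like shell code

# Passing argv as a list bypasses the shell entirely, so the payload
# remains one literal argument instead of being split at the ';' and
# executed as a second command.
result = subprocess.run(["echo", payload], capture_output=True, text=True)
print(result.stdout.strip())   # hello; rm -rf /tmp/x

# By contrast, subprocess.run(command_string, shell=True) would hand the
# raw string to /bin/sh, which *would* split on ';' and run the rm.
```

The type system still sees only `str`, but choosing an interface that treats the value as data rather than as a program removes the injection point.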

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/
[EMAIL PROTECTED]



Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread James Walden
Kenneth R. van Wyk wrote:
Overall, I like and agree with much of what Marcus said in the article.  
I don't, however, believe that we can count on completely putting 
security below the radar for developers.  Having strong languages, 
compilers, and run-time environments that actively look out for and 
prevent common problems like buffer overruns are worthy goals, to be 
sure, but counting solely on them presumes that there are no security 
problems at the design, integration, or operations stages of the 
lifecycle.  Even if the run-time environment that Marcus advocates is 
_perfect_ in its protection, these other issues are still problematic 
and require the developers and operations staff to understand the problems.
I agree that you can't solve all security problems with development tools, but 
I think security tools are a worthwhile investment: deploying tools can be 
accomplished much more quickly than educating developers, tools can help 
experienced developers, and tools can raise awareness of software security 
issues.  The article's mention of people creating patches just to silence 
compiler security warnings may indicate that I'm too optimistic about tools 
raising awareness, but I think that some developers will learn from their tools.

> Yup, but in the belt and suspenders approach that I like to advocate, 
> I'd like to see software security in our undergrad curricula as well as 
> professional training that helps developers understand the security 
> touch points throughout the development process -- not just during the 
> implementation phase.
I agree.  Students should see software security in all development phases 
relevant to each software course that they take; software engineering in 
particular should address security topics in all phases of the development 
process.  I think there's an additional need for a class focused purely on 
security to put all the elements of security together.

Peter G. Neumann wrote:
> Gee, some of us have been saying that for 40 years.
I can't deny that, even though I've been reading your comp.risks digest for 
only a little more than a third of that span.  Still, I think the fact that 
today's security problems are directly and indirectly affecting large segments 
of the population has increased awareness of security problems, and, as a 
result, we're seeing a rise in security education.  Many of us like to think 
that computer science changes rapidly, and it does compared to older fields 
like physics, where you have to go to graduate school to study much that was 
developed after the 1930s, but I suspect most people in any field avoid change 
until it's forced upon them.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/
[EMAIL PROTECTED]



Re: [SC-L] Re: White paper: Many Eyes - No Assurance Against Many Spies

2004-04-30 Thread James Walden
Jeremy Epstein wrote:
> I agree with much of what he says about the potential for infiltration of
> bad stuff into Linux, but he's comparing apples and oranges.  He's comparing
> a large, complex open source product to a small, simple closed source
> product.  I claim that if you ignore the open/closed part, the difference in
> trustworthiness comes from the difference between small and large.  That is,
> if security is my concern, I'd choose a small open source product over a
> large closed source, or a small closed source over a large open source... in
> either case, there's some hope that there aren't bad things in there.
He makes three claims for greater security of his embedded OS:
	(1) A carefully controlled process for modifying source code.
	(2) Small size in terms of lines of code.
	(3) Auditing of the object code.
Certainly, a small, well-audited system is more likely to be secure than 
a large, poorly audited one.  Also, since one failed attempt to insert a 
backdoor into the Linux kernel has already been discovered, I agree that 
the potential for further such attacks exists.

However, his claim that Linux can never be made secure because it's too 
large to audit every time it changes is overstated.  He ignores the 
fact that few people (and even fewer in defence) will or should upgrade 
every time the kernel changes.  Widely used Linux distributions rarely 
include the latest kernel, even if your organization is using the latest 
distribution.  He also conflates desktop and embedded Linux systems: 
yes, his embedded OS is small, but an embedded Linux system is going to 
be much smaller than the desktop distributions.

While kernel 2.6.5 may contain 5.46 million lines of code (counting 
blank lines and comments), much of that code is unlikely to be present 
in an embedded system.  After all, 2.72 million lines (49.8%) are 
drivers, 414,243 lines (7.6%) are sound-related, and another 514,262 
lines (9.4%) are filesystem-related.  You're going to 
build your embedded system with the hardware drivers and filesystems 
that you need, not every possible device and obscure filesystem 
available.  The same is true for userspace setuid programs, which I'll 
not count as I'm not sure which ones would be necessary for the types of 
systems under discussion.
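The shares quoted above can be checked with quick arithmetic.  A sketch (the raw counts are the ones cited for kernel 2.6.5, taken from the text rather than re-measured):

```python
# Line counts quoted above for kernel 2.6.5, including blank lines
# and comments (taken from the text, not independently measured).
total   = 5_460_000
drivers = 2_720_000
sound   =   414_243
fs      =   514_262

for name, lines in (("drivers", drivers), ("sound", sound),
                    ("filesystems", fs)):
    print(f"{name}: {lines / total:.1%}")

# Code an embedded build can largely omit:
omittable = drivers + sound + fs
print(f"omittable share: {omittable / total:.1%}")  # roughly two-thirds
```

So roughly two-thirds of the tree is code that a trimmed embedded build never ships, before even considering which kernel versions get installed.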

In summary, both the number of audits required and the amount of source 
code (and object code) that each audit must cover are smaller than he 
claims.  While auditing Linux is a more difficult task than auditing a 
smaller embedded OS, his claims are overblown because he ignores the 
fact that you only need to audit the parts of the kernel (and OS) that 
you actually install and use, and only when you install a new version.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/



Re: [SC-L] virtual server - security

2004-04-01 Thread James Walden
Along the same lines as the suggestion to use VMware, have you thought 
about User Mode Linux?  User Mode Linux lets you run virtual Linux 
machines on your existing Linux system.  If you have enough memory, you 
could give each of your user groups their own virtual Linux machine with 
its own filesystem and virtual hardware, offering much greater isolation 
than chroot.  Each group could also run its own version of 
Apache/mod_perl/PHP, fulfilling potentially different security 
requirements.  Compromise of one virtual machine wouldn't compromise the 
others or the real box that they're running on.
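For concreteness, a UML setup of this kind was typically built and booted along these lines (the root-filesystem image names and memory size below are illustrative, not prescriptive; see the UML documentation for the exact options):

```
# In a kernel source tree, build the UML binary:
make ARCH=um

# Boot one guest per user group, each with its own root filesystem
# image (a ubd "user block device") and memory allotment:
./linux ubd0=group1_rootfs mem=128M
./linux ubd0=group2_rootfs mem=128M
```

Each `./linux` process is an ordinary unprivileged process on the host, which is what gives the isolation described above.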

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/
[EMAIL PROTECTED]