Re: [SC-L] Microsoft SDL report card
On Fri, Apr 15, 2011 at 7:33 AM, Ben Laurie wrote:
> Which is why I am interested in and devoting most of my time now to
> capability systems.

Ben,

Is your work focused on the technical bits of this, or the human-interaction pieces? It seems to me that much of the work on technical implementations of capabilities, fine-grained permissions, MAC, etc. has been worked out repeatedly over time, and we've never come up with very usable systems, or ones that stay usable over time. Try setting the permissions for an application when you install it, or figuring out whether it is asking for more permissions than it really needs.

Thoughts?

- Andy

___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com) as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___
Re: [SC-L] informIT: Modern Malware
On Wed, Mar 23, 2011 at 8:14 AM, Gary McGraw wrote:
> I agree that clueless users who click on whatever pops up lead to many
> infections even when software is in reasonable shape, but I don't see that
> as a reason not to build better software. Presumably, you guys at paypal
> agree. Right?

First, I tend to use my personal email here rather than my work one, so don't assume I speak for them ever, and especially not when I use my own email :)

Second, I totally agree on making endpoints more resilient against malware, increasing software security, etc. I've noticed, however, that we (many of us, especially those with a user-rights bent) end up with two competing goals in this space:

1. Make endpoints resilient against malware.
2. Allow users to have complete control of their own computers, aka no walled gardens.

These two competing desires make defeating malware especially problematic. Lots of malware exploits technical flaws, and improving our software security practices will help defeat those. As those defenses get better, malware moves toward social engineering, and we're ill-equipped to defend against that: there are more and more software distribution channels, and policing gets harder. Hence the traditional AV-signature-based approaches, which are only semi-effective, especially when the rogue-AV software even has a human-staffed helpdesk to help you remove your "actual AV" and replace it with theirs. All the systems we've come up with so far to defeat this involve walled gardens, heuristics looking for bad behavior, etc., and they are all something of a band-aid.

Your article started out saying: "At the same time, software complexity, including the notion of extensibility designed into virtual machines like the Java Virtual Machine (JVM), leads to serious and widespread software vulnerability that lies at the root of the malware problem." It is this statement that I'm wary of, as it doesn't take into account the non-vulnerability aspects of the problem.

If we ignore those and focus only on drive-by malware, we're quickly going to find that the attackers have shifted their focus and our purely technical controls are ineffective. Neil makes a good point on this thread about how Dasient and other providers can help, and there are also some client-side techniques that are useful. So is Apple's curated app store. It isn't perfect, but the curated model, along with swift revocation, is a fairly effective defense against mass infection, though not against targeted infection.

No real conclusions here I suppose, but I thought it useful to highlight some of the inherent tensions.

- Andy
Re: [SC-L] informIT: Modern Malware
On Tue, Mar 22, 2011 at 8:41 AM, Gary McGraw wrote:
> hi sc-l,
>
> The tie between malware (think zeus and stuxnet) and broken software of the
> sort we work hard on fixing is difficult for some parts of the market to
> fathom. I think it's simple: software riddled with bugs and flaws leads
> directly to the malware problem. No, you don't use static analysis to "find
> malware" as the AT&T guys sometimes think...you use it to find the kinds of
> bugs that malware exploits to get a toehold on target servers. One level
> removed, but a clear causal effect.

Gary,

Interestingly, your article only covers malware that gets installed by exploiting a technical vulnerability, not malware that gets installed by exploiting a human vulnerability (social engineering). I've been looking around and haven't found much data on infection rates, percentages, success rates, etc., but "voluntarily" installed malware is a significant and growing concern, and it requires an entirely different approach than malware that exploits a technical vuln.

Thoughts?

- Andy
Re: [SC-L] "Checklist Manifesto" applicability to software security
On Thu, Jan 7, 2010 at 7:11 AM, Jeremy Epstein wrote:
> Greetings,
>
> So as I was listening, I was thinking that many of the same things
> could be said about software developers and problems with software
> security - every piece of software is unique, any non-trivial piece of
> software is amazingly complex, developers tend to consider themselves
> as artists creating unique works, etc.
>
> Has anyone looked into the parallelisms before? If so, I'd be
> interested in chatting (probably offlist) about your thoughts.

I've had exceptionally good luck/results with checklists during the development process, though nothing I could scientifically quantify. That said, I wonder whether any of the academics on the list would be willing to actually do a study: run some trials on defect rates in things like student assignments, where some students go through a checklist to examine their code and others don't. It might be interesting to see exactly what types of checklist items really result in a reduction in bugs...

-- Andy Steingruebl stein...@gmail.com
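[As a sketch of what such a study might automate: one checklist item per pattern, run mechanically over student code. This is a hypothetical, minimal checker; the patterns are illustrative, not a vetted checklist.]

```python
import re

# A toy security-review checklist: each entry pairs a checklist item
# with a regex that flags lines worth a second look.
CHECKLIST = [
    ("avoid eval on untrusted input", re.compile(r"\beval\s*\(")),
    ("avoid unbounded string copies", re.compile(r"\bstrcpy\s*\(")),
    ("avoid building SQL by concatenation", re.compile(r"execute\s*\(.*(\+|%)")),
]

def review(source):
    """Return (line_number, checklist_item) pairs for flagged lines."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for item, pattern in CHECKLIST:
            if pattern.search(line):
                findings.append((lineno, item))
    return findings

sample = 'result = eval(user_input)\ncur.execute("SELECT name FROM t WHERE id=" + uid)\n'
for lineno, item in review(sample):
    print(f"line {lineno}: {item}")
```

Comparing defect rates between students who run something like this and those who don't would at least make the checklist question measurable.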
Re: [SC-L] Genotypes and Phenotypes
On Mon, Oct 12, 2009 at 9:55 AM, Gunnar Peterson wrote:
> Its been awhile since there was a bugs vs flaws debate, so here is a snippet
> from Jaron Lanier
>
> A: No, no, they're not. What's the difference between a bug and a variation
> or an imperfection? If you think about it, if you make a small change to a
> program, it can result in an enormous change in what the program does. If
> nature worked that way, the universe would crash all the time. Certainly
> there wouldn't be any evolution or life. There's something about the way
> complexity builds up in nature so that if you have a small change, it
> results in sufficiently small results; it's possible to have incremental
> evolution. Right now, we have a little bit -- not total -- but a little bit
> of linearity in the connection between genotype and phenotype, if you want
> to speak in those terms. But in software, there's a chaotic relationship
> between the source code (the "genotype") and the observed effects of
> programs -- what you might call the "phenotype" of a program.

Is this really true, though? A small change in libc doesn't change the whole look and feel of a word-processing program. It looks exactly the same, but maybe behaves very slightly differently over a small range of inputs. And, while not an expert in biology, I'm quite certain that there are very minor mutations in certain key places that result in complete system failure or almost entirely fatal diseases, conditions, etc.

Is the complexity and expression of it really the key piece here? Or is it general resilience against failure: complexity spread out so that the common enemies (transcription errors in one place) aren't fatal? The system is designed against different threat models.
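That said, the chaotic genotype-to-phenotype relationship Lanier describes is easy to demonstrate: a one-character change can flip a program's behavior entirely. A contrived illustration:

```python
def contains_sorted(items, target):
    """Binary search over a sorted list."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:           # change this one character ("<=" -> "<") and
        mid = (lo + hi) // 2  # the function silently starts missing values
        if items[mid] == target:
            return True
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

print(contains_sorted([1, 3, 5, 7], 7))  # True; with "<" it would be False
```

A single-character "mutation" here doesn't degrade the program slightly; it changes which answers are correct, which is exactly the nonlinearity in question.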
-- Andy Steingruebl stein...@gmail.com
Re: [SC-L] Where Does Secure Coding Belong In the Curriculum?
On Tue, Aug 25, 2009 at 7:26 AM, Goertzel, Karen [USA] wrote:
> For consistency's sake, I hope you agree that if security is an
> intermediate-to-advanced concept in software development, then all the other
> "-ilities" ("goodness" properties, if you will), such as quality,
> reliability, usability, safety, etc. that go beyond "just get the bloody
> thing to work" are also intermediate-to-advanced concepts.
>
> In other words, teach the "goodness" properties to developers only after
> they've inculcated all the bad habits they possibly can, and then, when they
> are out in the marketplace and never again incentivised to actually unlearn
> those bad habits, TRY desperately to change their minds using nothing but
> F.U.D. and various other psychological means of dubious effectiveness.

Seriously? We're going to teach kids in 5th grade who are just learning what an algorithm is how to protect against malicious inputs, how to make their applications fast, how to handle all exception conditions, etc.?

Maybe we're still having that pupil/student discussion? In engineering disciplines we split courses into different areas of concern but still make everyone take all of the classes, whether beginner or advanced. Or take physics, for example. Or maybe something like music lessons: should we teach all kids vibrato and complex rhythms from day one, or make them study music theory before they have even picked up an instrument?

I'm just having a hard time understanding why we're trying to invent this from scratch when plenty of other disciplines, and the ways people learn other skills, all start from basics and then get more advanced.
-- Andy Steingruebl stein...@gmail.com
Re: [SC-L] Where Does Secure Coding Belong In the Curriculum?
On Tue, Aug 25, 2009 at 4:09 AM, Stephan Neuhaus wrote:
>
> On Aug 25, 2009, at 02:35, Benjamin Tomhave wrote:
>
>> First, security in the software development concept is at least an
>> intermediate concept, if not advanced.
>
> Not at all. That would be like saying that correctness is also an advanced
> concept, because it gets in the way of coding. Security is about exploiting
> assumptions (often hidden) that we make when we write and deploy software. I
> see no reason why teaching to think about assumptions should be deferred.
> You teach math students how to do proofs right from the beginning for
> essentially the same reasons :-)

Really? First graders are learning to do math proofs instead of basic addition? I'm quite surprised by this.

I think we're missing the point I raised earlier. Not everyone learns to program in high school or college. And even learning the basics of what an algorithm is is tricky, much less learning defensive programming, etc. So, yes, it is an "advanced" concept for the majority of beginning programmers.

-- Andy Steingruebl stein...@gmail.com
Re: [SC-L] Where Does Secure Coding Belong In the Curriculum?
On Wed, Aug 19, 2009 at 2:15 PM, Neil Matatall wrote:
> Inspired by the "What is the size of this list?" discussion, I decided I
> won't be a lurker :)
>
> A question prompted by
> http://michael-coates.blogspot.com/2009/04/universities-web-app-security.html
> and the OWASP podcast mentions
>
> So where does secure coding belong in the curriculum?
>
> Higher Ed? High School?
>
> Undergrad? Grad? Extension?

Does it help at all to consider how and where most people actually learn to program/develop? I don't have percentages handy for how many people with a job title or informal role of "programmer" or "developer" actually had any formal education in it. If we're just trying to reach the group of developers who went through formal training, then we've seen some pretty good answers in this thread already. If we want to cover the others, though, we need to look elsewhere.

Let's look at a couple of other fields where safety is important and yet the work is often done by both professionals and amateurs: plumbing and electrical work. My own view is that much software development is actually a lot closer to the work of the amateur electrician than the professional electrician. That is, unlike fields like engineering, architecture, law, or accounting, we don't rely on professional standards, degrees, certifications, etc. for most programmers. (I'm leaving aside for a moment whether we can or should, and just pointing out that it is the case.) With the amateur electrician you'll find a wide variety in their knowledge of safety concerns, adherence to code, etc. They probably know enough not to electrocute themselves while they are working (though not always), but don't necessarily know enough to put in wiring that won't burn the house down in a few years.

I think our real question isn't just how to reach the "professional" programmer trained via formal programs, but also how to reach the "amateur" programmer trained via books, trial and error, etc.

In those cases the best bet is to make sure that the general training manuals, how-to guides, etc. have a lot of safety/security information included in them, and that the books people use to learn actually show them safe examples. Obviously there are variations in code requirements per location and such, but basic safety rules will probably be mostly universal.

- Andy
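[For instance, the kind of "safe example" a beginner's book could show by default, here in Python with the stdlib sqlite3 module. The table and data are made up for illustration.]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "x' OR '1'='1"

# Unsafe pattern many tutorials still teach: string concatenation
# lets the input rewrite the query itself.
unsafe = "SELECT role FROM users WHERE name = '" + attacker_input + "'"
print(conn.execute(unsafe).fetchall())  # [('admin',)] -- injection succeeded

# Safe pattern: a parameterized query keeps the input as pure data.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())  # [] -- no match
```

If the safe form is the only form a learner ever sees, the amateur programmer gets basic safety for free, much as the amateur electrician gets it from a well-written how-to guide.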
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Wed, Mar 25, 2009 at 10:18 AM, ljknews wrote:
>
> Worry about enforcement by the hardware architecture after
> you have squeezed out all errors that can be addressed by
> software techniques.

Larry,

Given the focus we've seen from Microsoft on protecting developers from mistakes through things like DEP, ASLR, SEH, etc., why do you think these can't be done in parallel? I mean, we used to not have virtual memory or real MMUs, and the developer had to make sure they didn't step on other people's pages. Hardware support for protection on pages has helped with a lot of things, right?

I'm not saying I'm holding out hope for hardware to solve all our problems (that would be silly), but I do think it can be fairly useful for some classes of problems and a lot more scalable/repeatable. Practical right now? No. But we're already in the realm of fantasy in this discussion if we think the general mass of people writing software is going to switch languages because certain ones are more reliable.

- Andy
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Ok, so your point then is that a desire for type safety influenced the hardware architecture of these machines. Fair enough, though I don't know enough of the history of these machines to know how accurate that is. But how can I doubt you, Gary? :)

I was mainly reflecting in my comments that the programming language and the hardware architecture are coupled in terms of the resulting security model. Or they can be, anyway.

On Wed, Mar 25, 2009 at 8:42 AM, Gary McGraw wrote:
> Hi Andy,
>
> The code/data mix is certainly a problem. Also a problem is the way stacks
> grow on many particular machines, especially with common C/C++ compilers.
> You noted a Burroughs where things were done better. There are many
> others. C is usually just a sloppy mess by default.
>
> Language choice can sometimes make up for bad machine architecture, but
> ultimately at some level of computational abstraction they come to be the
> same thing. You may recall that I am a scheme guy. TI made a scheme
> machine that never caught on some years back (around the same time as the
> LISP machine...like emacs only even more bindings at least on the Symbolics
> <http://en.wikipedia.org/wiki/Lisp_machine>). Those machines had a
> fundamentally different architecture at the processor level.
>
> In any case, type safety is at the root of these decisions and makes a HUGE
> difference. Go back and read your lambda calculus, think about closure,
> symbolic representation, continuations, and first class objects and I think
> you'll see what I mean. http://en.wikipedia.org/wiki/Lambda_calculus
>
> gem
> (supposedly still on vacation, but it is a rainy day)
>
> http://www.cigital.com/~gem
>
> On 3/24/09 2:50 PM, "Andy Steingruebl" wrote:
>
> On Mon, Mar 23, 2009 at 7:22 AM, Gary McGraw wrote:
> hi guys,
>
> I think there is a bit of confusion here WRT "root" problems. In C, the
> main problem is not simply strings and string representation, but rather
> that the "sea of bits" can be recast to represent most anything. The
> technical term for the problem is the problem of type safety. C is not type
> safe.
>
> Really? It isn't that the standard von Neumann architecture doesn't
> differentiate between data and code? We've gone over this ground before
> with stack-machines like the Burroughs B5500 series which were not
> susceptible to buffer overflows that changed control flow because code and
> data were truly distinct chunks of memory.
>
> Sure its a different programming/hardware model, but if you want to fix the
> root cause you'll have to go deeper than language choice right? You might
> have other tradeoffs but the core problem here isn't just type safety.
>
> Just like in the HTML example. The core problem is that the
> language/format mixes code and data with no way to differentiate between
> them.
>
> Or is my brain working too slowly today?

-- Andy Steingruebl stein...@gmail.com
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Mon, Mar 23, 2009 at 7:22 AM, Gary McGraw wrote:
> hi guys,
>
> I think there is a bit of confusion here WRT "root" problems. In C, the
> main problem is not simply strings and string representation, but rather
> that the "sea of bits" can be recast to represent most anything. The
> technical term for the problem is the problem of type safety. C is not type
> safe.

Really? Isn't it that the standard von Neumann architecture doesn't differentiate between data and code? We've gone over this ground before with stack machines like the Burroughs B5500 series, which were not susceptible to buffer overflows that changed control flow, because code and data were truly distinct chunks of memory.

Sure, it's a different programming/hardware model, but if you want to fix the root cause you'll have to go deeper than language choice, right? You might have other tradeoffs, but the core problem here isn't just type safety.

Just like in the HTML example: the core problem is that the language/format mixes code and data with no way to differentiate between them.

Or is my brain working too slowly today?

-- Andy Steingruebl stein...@gmail.com
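[The HTML case is easy to show concretely: without escaping, nothing marks where data ends and code begins. A minimal sketch using Python's standard library; `steal_cookies` is just a stand-in payload.]

```python
from html import escape

user_input = "<script>steal_cookies()</script>"

# Data spliced straight into the markup becomes code: the browser
# cannot tell the attacker's <script> from the page's own.
unsafe_page = "<p>Hello, " + user_input + "</p>"
print(unsafe_page)

# Escaping re-marks the input as data; the page's structure is no
# longer controlled by the input.
safe_page = "<p>Hello, " + escape(user_input) + "</p>"
print(safe_page)  # <p>Hello, &lt;script&gt;steal_cookies()&lt;/script&gt;</p>
```

The escaping step is the software analogue of the Burroughs machines' hardware tagging: an explicit marker separating code from data.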
Re: [SC-L] Security in QA is more than exploits
On Wed, Feb 4, 2009 at 7:26 PM, Paco Hope wrote:
>
> Andy also said "I think we lose something when we start saying 'everything
> is relative.'" I think we lose something more important if we try to impose
> absolutes: we lose the connection to the business. No business operates on
> absolutes and blind imperatives. Few, if any, profit-focused businesses
> dogmatically fix all remotely exploitable SQL injections. Every business
> looks pragmatically at these things. Fixing the bug might cause the release
> of the product to slip by 6 weeks or a major customer to buy a competitor's
> product this quarter instead of waiting for the release. It's always a
> judgment call by the business. Even if their goal and their track record is
> fixing 100% of sev 1 issues before release, you know that each sev 1 issue
> was considered in terms of its cost, impact, schedule delay and so on.

The point here, though, is that repeatable processes do matter. Having a standard for what constitutes a given severity of bug, written down in a policy statement, is a good thing. Sure, that is hard, as every application is different, but you need a starting place. So while my standards don't say "XSS always equals P1," they do say "XSS that can be discovered in an externally facing application," or something only slightly more generic than that. My bug priority matrix does talk about business impact, because that is what matters, but I still have to give real-world examples to folks who aren't expert security testers of how to handle a bug when they come across it. And we need to provide clear guidance in standards, because every single bug shouldn't require an ad-hoc triage process.

> It is an outstanding idea for infosec guys to provide security test cases,
> or the framework for them, to QA. That beats the heck out of what they
> usually do. However, a bunch of test cases for XSS, CSRF, SQL injection and
> so on will not map easily to requirements or to the QA workflow.
> At what priority do they execute? When the business (inevitably) squeezes
> testing to try to claw back a day or two on the slipped schedule, can any
> of these security tests be left out? Why or why not? Without hanging them
> into the QA workflow with clear traceability, QA will struggle to
> prioritize them correctly and maintain them. Security requirements would
> make that priority and maintenance straightforward. At this point I'm not
> disagreeing with you, but taking your good approach and extending it a
> step farther.

I understand this, but handing security requirements to QA folks without a set of repeatable test cases isn't going to help much in most organizations. James Whittaker doesn't work for me :). If you're developing web applications you're probably going to have some set of standardized testing you do. You need a repository of test cases for certain things, and testing for certain types of attacks is probably a decent starting point. Sure, you want QA to own those, but if you're worried about buffer overflows you're going to have a bunch of standard test cases, test scenarios, and test data (long input strings, inputs with null bytes in them, etc.) that you reuse many times, so that each tester isn't starting from scratch when they see the security requirement "Application must handle input properly and not crash."

I don't think we're far apart here, but repeatability is key. Leaving the interface with QA at the level of security requirements in a functional spec isn't going to cut it. And you're probably going to have some standardized set of security requirements for a whole swath of your applications that you might not want to repeat ad nauseam in every single product/feature spec. This is the place for standards, policies, and testing guidelines, so that this becomes just part of the regular QA cycle.
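[A reusable input-robustness suite of the kind I mean might look like this. This is a hypothetical harness; `sanitize_name` stands in for whatever input-handling function is under test.]

```python
# Reusable security test data: the same corpus gets run against every
# input-handling routine, so testers don't start from scratch each time.
HOSTILE_INPUTS = [
    "A" * 10_000,                     # very long string
    "name\x00hidden",                 # embedded null byte
    "<script>alert(1)</script>",      # markup injection attempt
    "' OR '1'='1",                    # SQL injection attempt
    "",                               # empty input
]

def sanitize_name(value):
    """Hypothetical function under test: keep a bounded, printable name."""
    cleaned = "".join(ch for ch in value if ch.isprintable() and ch not in "<>'\"")
    return cleaned[:64]

def run_robustness_suite(func):
    """Every hostile input must come back bounded and free of null bytes."""
    for payload in HOSTILE_INPUTS:
        result = func(payload)   # must not raise
        assert len(result) <= 64
        assert "\x00" not in result
    return "all inputs handled"

print(run_robustness_suite(sanitize_name))
```

The point is that the corpus and the assertions, not the individual tester's ingenuity, carry the security knowledge.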
-- Andy Steingruebl stein...@gmail.com
Re: [SC-L] Security in QA is more than exploits
On Wed, Feb 4, 2009 at 11:17 AM, Paco Hope wrote:
> Before anyone talks about vulnerabilities to test for, we have to figure
> out what the business cares about and why. What could go wrong? Who cares?
> What would the impact be? Answers to those questions drive our testing
> strategy, and ultimately our test plans and test cases.

Paco,

I don't really read what Robert wrote this way. I think what this general "risk management" approach misses is that certain things are always going to be defects, bugs, etc. Sure, there are differences per business and per application; all bugs aren't created equal. But I think we lose something when we start saying "everything is relative." Each application, each business, each org needs a testing plan, a strategy, and a definition of what it cares about. At the same time, there are common types of tests that everyone performs. All Robert is pointing out is that if certain classes of vulnerabilities are important to you, then you want a common testing process for them.

> Bias #3 is the idea that a bunch of web vulnerabilities are equivalent in
> impact to the business. That is, you just toss as many as you can into your
> test plan and test for as much as you can. This isn't how testing is
> prioritized.

Again, I don't think he's saying this at all. Where I work, every XSS is absolutely critical, and we get them fixed immediately. This might not be the case elsewhere; some folks don't really worry about XSS that much. But just because I can find differences doesn't mean that everything is relative. Authentication bypass, SQL injection: these types of things tend to rate HIGH/P1/Major for almost everyone, I think.

> You don't organize testing based on which top X vulnerabilities are likely
> to affect your organization (as the blog suggests). Likelihood is one part
> of the puzzle. Business impact is the part that is missing.
> You prioritize security tests by risk severity—that marriage of likelihood
> and impact to the business. If I have a whole pile of very likely attacks
> that are all low or negligible impact, and I have a few moderately likely
> attacks that have high impact, I should prioritize my testing effort around
> the ones with greater impact to my business.

Again, fair enough. But at the same time you also prioritize around the effort to test and to avoid, right?

> Bias #4 is the treatment of testers like second class citizens. In the blog
> article, developers are "detail oriented" have a "deep understanding of
> flows." Contrast this with QA who merely understand "what is provided to
> them." They sound impotent, as if all they can do is what they're told.
> Software testing, despite whatever firsthand experience the author may have,
> is a mature discipline. It is older and more formalized than "security" as a
> discipline. Software testing is older than the Internet or the web. If
> software testing as a discipline has adopted security too slowly, given
> security's rise to the forefront in the marketplace, that might be a
> legitimate criticism. But I don't approve of slandering QA by implying
> that they just take what's given them and execute it. QA is hard and there
> are some really bright minds working in that field.

I don't think Robert's comments were about the general field/discipline of QA. His commentary was more about the types of QA organizations he has come across. My own experience (albeit limited as well) has been a relative lack of highly skilled QA folks, too. There are people responsible for quality at the level you're talking about, but I'd still bet they are more the exception than the rule. Most QA organizations are staffed with people writing relatively simple tests, running through positive functional testing, etc. I think the point is that you have to tailor expectations to the organization you have.
Much in the same way that if you have mostly junior programmers who are lucky to get their code to compile, you're probably not going to have much luck training them on formal proofs, rigorous design, etc.

-- Andy Steingruebl stein...@gmail.com
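[The likelihood-and-impact prioritization Paco describes can be sketched as a tiny scoring matrix. The thresholds and ratings here are invented for illustration.]

```python
# Toy risk-severity matrix: priority is driven by the combination of
# likelihood and business impact, not by likelihood alone.
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"negligible": 1, "low": 2, "moderate": 3, "high": 4}

def priority(likelihood, impact):
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score >= 8:
        return "P1"
    if score >= 4:
        return "P2"
    return "P3"

# A very likely but negligible-impact issue ranks below a moderately
# likely, high-impact one -- matching the argument in the thread.
print(priority("high", "negligible"))  # P3
print(priority("medium", "high"))      # P1
```

Writing the matrix down, whatever its exact numbers, is what makes the triage repeatable rather than ad hoc.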
Re: [SC-L] Unclassified NSA document on .NET 2.0 Framework Security
On Tue, Nov 25, 2008 at 9:48 AM, Gunnar Peterson <[EMAIL PROTECTED]> wrote:
>
> but actually the main point of my post and the one i would like to
> hear people's thoughts on - is to say that attempting to apply
> principle of least privilege in the real world often leads to drilling
> dry wells. i am not blaming any group in particular i am saying i
> think it is in the "too hard" pile for now and we as software security
> people should not be advocating for it until or unless we can find
> cost effective ways to implement it.

I'd love to hear someone from Microsoft talk about the creation of the default, ready-to-ship service security profiles for Server 2008. Windows has lots of services and lots of privileges that can be configured. Every paper I've seen on the subject is about reverse engineering least privilege: reducing privileges, checking whether the software still functions, looking for access violations, and then increasing the privileges until things start working. A lot like this Calvin and Hobbes comic:

CALVIN: How do they know the load limit on bridges, Dad?
DAD: They drive bigger and bigger trucks over the bridge until it breaks. Then they weigh the last truck and rebuild the bridge.

This is what we do with least privilege, but without ever knowing whether we've really gotten to the least privileges or not. Hell, in a modern operating system, how do you figure this out anyway?

- Andy
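[That truck-over-the-bridge loop is easy to caricature in code. A toy model; the privilege names and the "app" are made up.]

```python
# Toy model of least-privilege-by-trial-and-error: strip privileges one
# at a time and keep each removal only if the app still appears to run.
ALL_PRIVILEGES = {"net", "filesystem", "registry", "debug", "clock"}

def app_runs(granted):
    """Stand-in for 'does the service still work?' -- this app secretly
    needs only net and filesystem access."""
    return {"net", "filesystem"} <= granted

def shrink_privileges(privileges, runs):
    granted = set(privileges)
    for priv in sorted(privileges):
        trial = granted - {priv}
        if runs(trial):        # removal didn't break anything observable...
            granted = trial    # ...so keep it removed
    return granted

print(sorted(shrink_privileges(ALL_PRIVILEGES, app_runs)))
# ['filesystem', 'net'] -- but only for the code paths we exercised;
# a rarely used feature could still break later.
```

The catch is exactly the one in the comic: `runs()` only covers the behavior we thought to test, so we never know whether we have found the true minimum or just the minimum for last Tuesday's workload.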
Re: [SC-L] No general-purpose computer, or everything under surveillance?
On Tue, May 13, 2008 at 1:51 PM, David A. Wheeler <[EMAIL PROTECTED]> wrote:
> If you interpret the definition of these terms of "general purpose" and
> "surveillance" differently, i.e., "limit applications to least
> privilege, and locally monitor their behavior", then I'd agree. But
> this is another way of saying "we need to implement least privilege and
> local monitoring", which are well-established security principles. And
> it's already happening, e.g.:

That's fine in principle. Have we ever seen a usable system based on these principles that users didn't reject or hate? Look at the general press and people's perceptions of the security of Leopard vs. Vista. We can complain all we want about UAC and the constant "nagging", but as Apple's commercials so clearly pointed out, people hate it when their computer explicitly and publicly tries to keep them safe.

> * Deployment is already moving away from general-purpose privileges.
> SELinux lets people define very fine-grained privileges, so that a
> program does NOT have arbitrary rights. OLPC goes even further; its
> security model is remarkable and worth learning from.

That's great, but all of these schemes rely on:

- Expert users to configure a policy for new software
- Each piece of software shipping with a correct least-privilege configuration (how do we get the malware authors to do this?)
- A user who doesn't choose to override the default security settings so they can see the dancing hamsters

> * Observing behavior (and making decisions based on them) is ALREADY
> what some systems and network systems do.

Same here. We're still light years away from being able to do this in practice. We can't tell that the new financial management software you just downloaded is "supposed" to ask for your bank password, and that the game you just downloaded shouldn't. And users aren't generally informed enough to make these kinds of decisions either, especially given the user interfaces we typically give them.
Don't forget all of the wonderful fun we've had over the years getting people not to open executables sent via email, not to visit sites with a self-signed SSL certificate, to check for the lock icon in their browser, to make sure that their wireless settings don't let them connect to random wireless access points, etc.

> But the difference is who is in final control. In the end, the users of
> computers should be in final control, not their makers, or we have given
> up essential liberty.

I don't think you're fundamentally wrong, in that I'm not (and I can't speak for others) in favor of removing the controls completely. But we ought to be shipping systems whose fundamental defaults are easier to use, more secure, and really hard to override. Compare IE6/FF2 to IE7/FF3 on this front. Sure, you can still visit the site with the self-signed certificate, and you can still visit a site that they've categorized as a phishing site. But it isn't quite as easy as it used to be, and I'd say that's a good thing.

If you own a tablesaw, it comes with a blade guard. It's probably a good idea that it does. If you really want to, you can remove it, and I don't really feel the need to stop you. Unless I'm paying for your insurance, that is.

Your car also comes with pollution controls. These pollution controls often inhibit your max speed, acceleration, etc. They are really hard, or impossible, to disable. They also make our environment cleaner.

Which is the right analogy for the personal computer?

-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] Microsoft's message at RSA
On Fri, May 9, 2008 at 3:42 PM, Gary McGraw <[EMAIL PROTECTED]> wrote:
> Hi andy (and everybody),
>
> Indeed. I vote for personal computer liberty over guaranteed iron clad
> security any day. For amusing and shocking rants on this subject google up
> some classic Ross Anderson. Or heck, I'll do it for you:
> http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html

I've heard this point for years, and yet when we actually look at ways of solving the persistent problems of software security, we always come back to tamper-proofing and restricted rights as a pretty reasonable starting point.

I don't know whether this mailing list is really the place for me to advocate about this, but every time we get into a discussion of high reliability (electronic voting, for example), people are up in arms that we haven't followed strict practices to make sure the machines can't be hacked, even by experts: hardened hardware, trusted computing bases, etc. But if you want to apply those same engineering principles to protecting an individual's assets, such as their home computer or bank account credentials, then you're trampling on their freedom. I don't really see how we can viably have both. Sure, we're looking at all sorts of things like sandboxing and whatnot, but given multi-purpose computing and the conflicting goals of absolute freedom and defense against highly motivated attackers, we're going to have to make some choices, aren't we?

I don't disagree that all of these technologies can be misused. Most can. We've all read the Risks columns for years about ways to screw things up. At the same time, individual computers don't exist in isolation. They are generally part of an ecosystem (the internet), and as such your polluting car causes my acid rain and lung cancer. Strict liability isn't the right solution to this sort of public policy problem; regulation is. That regulation and control can take many forms, some good, some bad.
I don't see the problem getting fixed, though, without some substantial reworking of the ecosystem. Some degree of freedom may well be a casualty. Please don't think I'm actually supporting a general decrease in liberty overall. At the same time, I'm pretty sure that traffic laws and speed limits are a good idea, even though they restrict individual freedoms. In the computing space I'm ok allowing people to opt out, but only if in doing so they don't pose a manifest danger to others.

Balancing the freedom vs. the restriction isn't easy, of course, and I'm not suggesting it is. I'm merely suggesting that none of the research we've ever done in the area points to our current model (relying on users to make choices about what software to use) being promising. How to make this happen without it turning into a debacle is of course the tricky part.

-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] Microsoft's message at RSA
On Mon, May 5, 2008 at 10:24 AM, Gary McGraw <[EMAIL PROTECTED]> wrote:
> hi sc-l,
>
> Here's an article about Mundie's keynote at RSA. It's worth a read from a
> software security perspective. Somehow I ended up playing the foil in this
> article...go figure.
>
> http://reddevnews.com/features/article.aspx?editorialsid=2470
>
> So what do you guys think? Is this end-to-end trusted computing stuff going
> to fly with developers?

I think you're both right. I'm working on a longer writeup of the ideas in the end-to-end paper, but I think you've captured part of the problem at the heart of things. We're going to have to trade some fundamental computing liberties to get the kind of security required to actually have trusted relationships via computers. Good or bad, I don't want to comment on right now.

If you've read "Code and Other Laws of Cyberspace" by Lessig, you'll see some of the same ideas, albeit from a more regulatory perspective than a purely technical one. The updated "Code 2.0" book captures a lot of these same ideas. I think Charny is missing the mark ever so slightly when he says the security goals can be achieved without compromise on the part of privacy or functionality. As Lessig clearly points out, the rules of the networks, computers, etc. aren't real rules in any sense. It's not as if they're physical laws; the rules are determined by code. This code, and the policy behind it, can change.

I think the real question isn't whether this is going to fly with developers; it's whether it's going to fly with the public at large. Are people (and their proxies - governments) going to finally demand a change in the rules of the game?
-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] quick question - SXSW
On Wed, Mar 12, 2008 at 4:30 PM, Gary McGraw <[EMAIL PROTECTED]> wrote:
> Hey andy,
>
> You mean AJAX one? Last time I went there was zero interest and even less
> clue about security among attendees. The only shining light was a long
> conversation I had with bill joy about security critical decisions those guys
> screwed up with Java (especially with regards to closure).
>
> A decade of evangelism only goes so far! Do help!

Fair enough :) I was looking at the program for the just-finished SD West, and the security track actually looks to have been pretty good. One thing I think we're missing there is more emphasis on the actual SDL process, rather than a focus on individual items within it: activities like how to form a steering group within a company, how to bootstrap some of the practices, etc.

Do folks here have suggestions of conferences we ought to be targeting with these sorts of presentations, papers, etc.? JavaOne seems like it might have been a good place to target. There are some smaller developer conferences out there, some general security conferences, and there has been discussion here and within OWASP as well of how we can start better targeting these forums for our evangelizing...

Thoughts?

-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] quick question - SXSW
On Tue, Mar 11, 2008 at 6:43 AM, Benjamin Tomhave <[EMAIL PROTECTED]> wrote:
> I had just a quick query for everyone out there, with an attached thought.
>
> How many security and/or secure coding professionals are prevalently
> involved with the SXSW conference this week? I know, I know... it's a big
> party for developers - particularly the Web 2.0 clique - but I'm just
> curious.

On a related note, a quick perusal of the JavaOne conference tracks doesn't show a lot of content in this area either. Is this due to a lack of interest, or to people in the security world not pitching talks to development conference organizers?

-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] Darkreading: Getting Started
On Jan 9, 2008 4:48 PM, Gary McGraw <[EMAIL PROTECTED]> wrote:
> hi sc-l,
>
> One of the biggest hurdles facing software security is the problem of how to
> get started, especially when faced with an enterprise-level challenge. My
> first darkreading column for 2008 is about how to get started in software
> security. In the article, I describe four approaches:
> 1. the top-down framework;
> 2. portfolio risk;
> 3. training first; and
> 4. leading with a tool.

Gary,

I had success with #4, but not using the tools we usually think of for bootstrapping a program, namely static analysis or testing tools. When I took the position, they had already settled on using Netegrity's Siteminder product for a common authentication and authorization scheme across all of the applications. I managed to get them to settle on doing a quasi-RBAC with Siteminder, using it almost as an identity service as well. Settling on one common, high-quality authentication and authorization tool/framework had three effects:

1. It removed these services from the realm of development. Teams just had to integrate with it, and didn't have to figure out all of the corner cases (password changes, etc.) that so often crop up, and that people mess up, in homegrown approaches.

2. It convinced developers to build clean interfaces in their code for things like authorization, calling out externally and/or having the data provided to them in a standard fashion. Settling on RBAC also helped a lot with the role and permission modeling that did need to happen in the app.

3. In a shop that usually wanted to do everything itself, it broke that cycle, and people got used to not having to write everything from scratch.

It was a bit of a non-standard way to use a tool to bootstrap a security program. They essentially got sold Netegrity originally for the wrong reasons, but they picked it, and in implementing it correctly did themselves a huge service.
Just one data point on leading with a tool that focused more on architecture and design than on finding defects.

-- Andy Steingruebl [EMAIL PROTECTED]

___
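The clean-interface effect the message describes (applications asking one external RBAC service "may this user do this?" rather than embedding auth logic everywhere) can be sketched roughly as below. This is not Siteminder's actual API; the class and role names are hypothetical.

```python
# Illustrative sketch of externalized authorization: the app only models
# which role an action requires; who holds which role lives entirely in
# an external service. Names here are invented, not a real product API.

from typing import Protocol

class AuthzService(Protocol):
    def is_allowed(self, user: str, role_needed: str) -> bool: ...

class StubAuthz:
    """Stand-in for the external RBAC service; maps users to roles."""
    def __init__(self, role_map):
        self._roles = role_map

    def is_allowed(self, user, role_needed):
        return role_needed in self._roles.get(user, set())

def view_report(authz: AuthzService, user: str):
    # The application never inspects passwords or group tables; it just
    # asks the injected service about the one role this action needs.
    if not authz.is_allowed(user, "report_viewer"):
        raise PermissionError(f"{user} lacks role report_viewer")
    return "report contents"

authz = StubAuthz({"alice": {"report_viewer"}, "bob": set()})
print(view_report(authz, "alice"))  # report contents
```

Because the service is injected behind a one-method interface, swapping the stub for a real enterprise authz product changes no application code, which is the cycle-breaking effect described in point 3.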
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 6:07 PM, Blue Boar <[EMAIL PROTECTED]> wrote:
> Andy Steingruebl wrote:
> > I like contractual approaches to this problem myself. People buying
> > large quantities of software (large enterprises, governments) should
> > get contracts with vendors that specify money-back for each patch they
> > have to apply where the root cause is of a given type. For example, I
> > get money back every time the vendor has a vulnerability and patch
> > related to a buffer overflow.
>
> That changes the incentive to hide security bugs and not patch them or
> to slipstream them.

Any regulatory regime that deals with security issues is subject to the same thing. Whether it's PCI and eluding auditors, or SOX 404 and documenting controls, you'll always have people who want to try to game the system. I'm not suggesting that this is the only solution, but from an economics and motivation perspective, SLAs related to software and security features are more likely to work, and to incur lower overhead, than a regulatory regime that is centrally administered.

Sure, there are going to be pieces of software this scheme won't work for, or where there aren't very many bulk purchasers, only one-off purchasers: things like video games, for example, where there aren't large institutional purchases. That said, I think contracts between large consumers and software producers would be a good start on the problem.

-- Andy Steingruebl [EMAIL PROTECTED]

___
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 2:47 PM, Kenneth Van Wyk <[EMAIL PROTECTED]> wrote:
> The article quotes David Rice, who has a book out called
> "Geekconomics: The Real Cost of Insecure Software". In it, he tried
> to quantify how much insecure software costs the public and, more
> controversially, proposes a "vulnerability tax" on software
> developers. He believes such a tax would result in more secure
> software.

I like contractual approaches to this problem myself. People buying large quantities of software (large enterprises, governments) should get contracts with vendors that specify money back for each patch they have to apply where the root cause is of a given type. For example, I get money back every time the vendor has a vulnerability and patch related to a buffer overflow.

I wrote a small piece about this: http://securityretentive.blogspot.com/2007/09/buffer-overflows-are-like-hospital.html

It turns out that the federal government isn't paying for avoidable outcomes anymore. Certain things fall into the rough category of "negligence" and so aren't covered. We ought to just do this for software via a contracts mechanism. I'm not sure we want to start with a big-bang public-policy approach on this issue; we'd want to know a lot more about how the economics work out on a small scale before applying it to all software.

-- Andy Steingruebl [EMAIL PROTECTED]

___
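The money-back-per-patch contract idea above amounts to a simple schedule over root-cause classes. A toy illustration (all dollar figures and patch records are hypothetical, invented for the example):

```python
# Toy model of the contractual SLA idea: the buyer is refunded for each
# patch whose root cause falls into an "avoidable" class named in the
# contract, e.g. buffer overflows. Numbers are made up for illustration.

REFUND_PER_PATCH = {          # hypothetical contract schedule, in dollars
    "buffer_overflow": 10_000,
    "sql_injection": 8_000,
}

def refund_owed(patches):
    """Sum refunds for patches whose root-cause class the contract covers."""
    return sum(REFUND_PER_PATCH.get(p["root_cause"], 0) for p in patches)

patches = [
    {"id": "P-001", "root_cause": "buffer_overflow"},
    {"id": "P-002", "root_cause": "design_flaw"},     # not a covered class
    {"id": "P-003", "root_cause": "buffer_overflow"},
]
print(refund_owed(patches))  # 20000
```

The interesting policy questions sit outside the arithmetic: who classifies a patch's root cause, and whether the schedule is large enough to change vendor behavior without creating the bug-hiding incentive the thread discusses.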