Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Dec 3, 2007 8:34 AM, silky <[EMAIL PROTECTED]> wrote:
> how does anyone know how to hire anyone for a job that they themselves
> aren't qualified for? well, you pay professionals to do it.
> recruitment agents. this should be part of their role. and absolutely
> agreed; most certification is useless, secure programming is no
> different.

Um, have you ever dealt with a recruitment agent? How are they going to tell? The guy had secure coding on his CV? Ok.

A few points in general:

1 - I have yet to meet a programmer who intentionally creates security problems in production code. Most developers I meet are very much interested in secure coding, so in that respect things are a lot better than they were 5 years ago, when very few people knew and even fewer cared. Penalizing developers for writing insecure code is not the answer, because, as others have pointed out, all it will do is encourage people to cover things up and never talk about security vulnerabilities. You have to take into account the environment in which they work, which is most likely not conducive to producing quality output, and also that even the best people will make mistakes. I've heard of some companies taking the attitude that code-level security issues are OK, because it means they didn't waste too much money on higher-quality outsourced developers ... and from a security vendor no less, whoda thunk ;-)

2 - Source code scanners still have a long way to go. I realize there are a lot of vested interests on this list, but based on my recent experiences with commercial scanners, it is pure folly to rely on them to secure your applications. They are useful tools with a real place, and better than previous generations, but overpriced and still of limited value. That they are sold as "quality tools" rather than "security tools" is telling.
Running code through 3 different scanners is great, but a) who has the time, b) who can justify 3 different tools to management, c) who's going to wield the rod, and d) why do you think anyone would actually care about the rod?

3 - Taxes, government bodies, penalties, etc.: all bullshit for now. When it's possible to prove a program correct, OK, but until then it's way too fuzzy and wobbly to start throwing bureaucracy at. It would be good to see some form of self-regulation, ideally from a credible independent source, not a cert merchant or security services vendor.

Yours in brevity,
Pete

___ Secure Coding mailing list (SC-L) SC-L@securecoding.org List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l List charter available at - http://www.securecoding.org/list/charter.php SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com) as a free, non-commercial service to the software security community. ___
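[Editor's illustration of the scanner-limitation point above: the thread contrasts code-level flaws that scanners can flag with design weaknesses that "go blissfully unnoticed". This is a minimal hypothetical sketch (function names, table, and queries are invented, not from the thread):]

```python
# Why scanners catch code-level flaws but miss design weaknesses.
# (Hypothetical example; names and queries invented for illustration.)

import sqlite3

def find_user_unsafe(conn, username):
    # A taint-tracking scanner can flag this: user input concatenated
    # straight into SQL -- a classic injection sink.
    return conn.execute(
        "SELECT id, role FROM users WHERE name = '" + username + "'"
    ).fetchall()

def find_user_safe(conn, username):
    # Parameterized query -- the pattern scanners recommend.
    return conn.execute(
        "SELECT id, role FROM users WHERE name = ?", (username,)
    ).fetchall()

def delete_account(conn, account_id):
    # No scanner will flag this: the code is "clean", but the design
    # omits any authorization check -- any caller can delete any account.
    # That kind of weakness only surfaces in a design review.
    conn.execute("DELETE FROM users WHERE id = ?", (account_id,))
```

The unsafe variant lets a crafted username like `x' OR '1'='1` return every row; the design flaw in `delete_account` produces no warning from any code-level tool at all.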
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Dec 1, 2007 7:59 AM, Steven M. Christey <[EMAIL PROTECTED]> wrote:
> On Fri, 30 Nov 2007, silky wrote:
> > i still think all these ideas are wrong and the model is simple: don't
> > employ people who write and generate insecure code. it's just part of
> > programming. you wouldn't hire a doctor to be a gardener. don't hire
> > an idiot to program your apps.
>
> How does a manager who hasn't written code in the last 10 years (if ever)
> know how to distinguish the idiots from the experts? Secure programming
> certification and education is, at best, in its infancy.

how does anyone know how to hire anyone for a job that they themselves aren't qualified for? well, you pay professionals to do it. recruitment agents. this should be part of their role. and absolutely agreed; most certification is useless, secure programming is no different.

-- mike
http://lets.coozi.com.au/
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 30, 2007 8:15 AM, Kenneth Van Wyk <[EMAIL PROTECTED]> wrote:
> But the real problem with it, as I said, is metrics. Should it be
> based on (say) defect density per thousand lines of code as reported
> by (say) 3 independent static code analyzers? What about design
> weaknesses that go blissfully unnoticed by code scanners? (At least
> the field experience concept could begin to address these over time,
> perhaps.)

You, sir, are on the right track! Secure design inspection and [secure] code review results can give these metrics. I think that code that is high-percent CAPEC-free during design inspection and high-percent CWE-free during code review should be given a higher software security assurance rating than inspections/reviews that turn up many CAPEC/CWE issues.

Extended criteria would involve designs/code that are formally specified or verified (using formal methods/logic). Think Orange Book (TCSEC, ITSEC, Common Criteria, et al.) and you'll know what I mean by this. The interesting thing about the Orange Book is that even the most assured/functional division, A1, only requires that the formal top-level specification (i.e. the TCB design, aka security policy model, aka access-control matrix) be formally verified. Criteria that go beyond A1 would formally specify and/or verify actual source code. The NHTSA NCAP five-star rating system (Chapter 2 in Geekonomics) used in automobile safety utilizes crash test dummies in a similar way.

When you say 3 independent scanners, I really like what you are talking about here. I often see this more as a three-step process: first start with secure design inspection, move to binary/bytecode analysis, and then move to manual code review augmented by static source code analysis. Each step can feed information (e.g. generated test cases) into the next to improve the reviews.
I always thought that a software security assurance five-star rating system would work in a similar way to the Food Safety Network - by using samples to identify "diseased/vulnerable" product. This would mean that we could take the most core, critical components in any application (the ones that require the most security), then take samples of that code (choosing the ones that "smell the worst / look the most diseased") by combining code coverage with cyclomatic complexity. There is a project that combines these two metrics called Crap4J. Code that has already been covered by testing (and that has shown the highest "cluster" of bugs) can be included. Untested areas of code can also be included. Cyclomatic complexity metrics can slightly augment the process, although from my perspective, less complex code can contain vulnerabilities just the same.

Let's say that Cigital (on track to be CWE-Compatible/Effective) performs a secure design inspection (note: I wonder if there is going to be a CAPEC-Compatible program?) and hands their report to Veracode (currently CWE-Compatible). Veracode can then perform binary/bytecode analysis and hand their report back to Cigital for the secure code review. Using a mix of CWE-Compatible tools, Cigital inspectors can then build the last of three reports. All three reports could be used to score an application using a standardized five-star rating system.

While applications should be tested at every revision, some applications may only be tested on some sort of scheduled basis. Companies who refuse to provide timely designs and code samples should be penalized by heavy fines. I'm not sure who is going to enforce this, but I see it as a government function. However, this isn't regulation - it's simply assurance-level reporting, made available to the public (think equivalents to safercar.gov, Consumer Reports, and automobile price stickers).
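[Editor's illustration of the "coverage plus complexity" sampling idea above. The scoring function follows Crap4J's published formula; the method names and numbers are invented for illustration:]

```python
# Crap4J-style triage: rank methods by CRAP(m) = comp(m)^2 * (1 - cov(m))^3 + comp(m),
# where comp is cyclomatic complexity and cov is test coverage as a fraction 0..1.
# High scores = complex and poorly tested, i.e. the code to sample first.

def crap_score(complexity: int, coverage: float) -> float:
    """Higher score = riskier: complex and poorly tested."""
    return complexity ** 2 * (1.0 - coverage) ** 3 + complexity

def triage(methods):
    """Sort methods so the 'worst smelling' code gets sampled first."""
    return sorted(methods, key=lambda m: crap_score(m[1], m[2]), reverse=True)

methods = [
    # (name, cyclomatic complexity, test coverage) -- hypothetical values
    ("parse_request", 12, 0.10),
    ("render_page",    5, 0.95),
    ("auth_check",     9, 0.00),
]

for name, comp, cov in triage(methods):
    print(f"{name}: {crap_score(comp, cov):.1f}")
```

Note the caveat in the message still applies: a low score does not mean the code is vulnerability-free, only that it is a lower priority for sampling.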
Imagine flipping through the pages of a tech magazine and seeing secure software assurance five-star ratings listed at the top of product reviews, right next to or below the name of the product. Of course, open-source software has readily available code, but there should be penalties for OSS developers who do not provide timely software designs. Clearly, these cannot be directly monetary - but instead the rating board can fine software vendors who use the open-source software as a third-party component in their applications (or ship it along with their products). This will likely provide the necessary pushback on open-source developers to provide software designs.

Note that the five-star rating system I suggest is only half of a secure software initiative - the assurance part. According to the Orange Book, there have to be functional measures that address the inherent problems with trusting the TCB: object reuse and covert channels. The Orange Book specifies that the TCB requires a security kernel, a security perimeter, and trusted paths to/from the users and the TCB (input validation is referred to as "secure attention keys").
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 30, 2007 1:59 PM, Steven M. Christey <[EMAIL PROTECTED]> wrote:
> > i still think all these ideas are wrong and the model is simple: don't
> > employ people who write and generate insecure code. it's just part of
> > programming. you wouldn't hire a doctor to be a gardener. don't hire
> > an idiot to program your apps.
>
> How does a manager who hasn't written code in the last 10 years (if ever)
> know how to distinguish the idiots from the experts? Secure programming
> certification and education is, at best, in its infancy.

Felix Lindner said it best in his recent presentation, "Security and Attack Surface of Modern Applications": commercial software doubles in size every 18 months. How are we going to train developers and security professionals fast enough to keep up with that pace?

Cheers, Andre

(I swear this is the last one for now - sorry for splitting this into so many messages)
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 5:13 PM, Andy Steingruebl <[EMAIL PROTECTED]> wrote:
> I like contractual approaches to this problem myself. People buying
> large quantities of software (large enterprises, governments) should
> get contracts with vendors that specify money-back for each patch they
> have to apply where the root cause is of a given type. For example, I
> get money back every time the vendor has a vulnerability and patch
> related to a buffer overflow.
>
> I wrote a small piece about this:
> http://securityretentive.blogspot.com/2007/09/buffer-overflows-are-like-hospital.html

If you read Geekonomics, you'll find out why this may never happen. Because of existing software contracts, this is impossible today. David Rice dedicates chapter five to a discussion of this, but it is also sprinkled throughout the book.

Cheers, Andre
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 30, 2007 1:37 PM, Steven M. Christey <[EMAIL PROTECTED]> wrote:
> > Software vendors will need a 3 tier approach to software security: Dev
> > training and certification, internal source testing, external
> > independent audit and rating.
>
> I don't think I've seen enough emphasis on this latter item. A
> sufficiently vibrant set of independent testing organizations that follows
> some established procedures would be one way for customers to get an
> independent guarantee of software's (relative) security. This in turn
> could put pressure on other vendors to follow suit.

PCI PA-DSS, ISECOM OSSTMM v3, and the OWASP Secure Software Contract Annexes (combined with the OWASP Web Security Certification Framework) will be available for use in the near-immediate future. Many other similar efforts will likely follow.

> The challenges would be defining what those procedures should be,
> maintaining them in a way so that they remain relevant, convincing
> existing research organizations to participate, and handling the problem
> of free (as in beer) software.
>
> A gazillion years ago, John Tan of the L0pht proposed an "Underwriters
> Laboratories" for software, and maybe its time is almost upon us.

I thought that document was more about using FIPS 140-1 to verify hardware-based cryptographic systems (we now have FIPS 140-2 to do this for software crypto systems), while providing metrics of how long it takes to break said crypto via brute force (in the same way a safe is rated by how long it takes a safe-cracker to bust it open)? It's also interesting to note that the FIPS 140-n standards have four levels of verification.

Cheers, Andre
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 3:47 PM, Kenneth Van Wyk <[EMAIL PROTECTED]> wrote:
> The article quotes David Rice, who has a book out called
> "Geekonomics: The Real Cost of Insecure Software". In it, he tried
> to quantify how much insecure software costs the public and, more
> controversially, proposes a "vulnerability tax" on software
> developers. He believes such a tax would result in more secure
> software.

I read Geekonomics a few weeks ago, when it became available on SafariBooksOnline. I have mixed feelings about the author, the book, and the subject matter. The discussions in the book are great - especially in the first four chapters. However, I find that the solutions and conclusions he comes to in the last chapter (including all this "tax" business) leave a lot to be desired.

My primary reason for disliking this "vulnerability tax" is that it takes into account neither web applications nor Software-as-a-Service. Not surprisingly, the book fails to cover both of these topics. I'm not sure if David Rice does this on purpose, because he does touch on open-source software issues (dedicating an entire chapter to the topic, and sprinkling it throughout the book).

BTW - I think David Rice brought in the idea of a "vulnerability tax" because it was the first analogy that popped into his head from the research and discussion brought about in his book. On page 157 (Chapter 4), he discusses the incentives put forward by the ISAlliance in the form of cyber insurance discounts - http://www.isalliance.org/content/view/29/71/ Quote: "AIG, the world's largest provider of cyber insurance, agreed to provide premium credits of up to 15% for companies that join the ISAlliance and subscribe to these best practices. For many companies, the cash value of this discount may be worth more than the entire cost of ISAlliance membership".
More details are in the Market Incentives legislative whitepaper here - http://www.isalliance.org/content/view/92/229/

In the last chapter of Geekonomics, David Rice discusses many solutions and incentives besides the "vulnerability tax", but none are quite as coherent (or controversial). I suggest reading the entire book regardless of what you think about what amounts to a very small section/topic.

> IMHO, if all developers paid the tax, then I can't see it resulting in
> anything other than more expensive software... Perhaps I'm just
> missing something, though.

David Rice does propose the tax for both software vendors (not sure if this includes SaaS) and consumers, which is stated more clearly in the book. The way he proposes all this doesn't seem like a solution - many vendors will turn this around on governments and force consumers to, again, eat the cost of any such effort. Does anyone expect that software vendors or open-source software makers will really be able to produce more secure software because of a "vulnerability tax"? Personally, I don't think this gets very close to the root cause of software vulnerabilities.

Cheers, Andre
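[Editor's illustration of the incentive arithmetic behind the quoted AIG/ISAlliance claim that "the cash value of this discount may be worth more than the entire cost of ISAlliance membership". The 15% credit rate is from the quote; all dollar figures are invented for illustration:]

```python
# When does a 15% cyber-insurance premium credit exceed the cost of
# membership? Premium and fee values below are hypothetical.

def discount_value(annual_premium: float, credit_rate: float = 0.15) -> float:
    """Cash value of the premium credit per year."""
    return annual_premium * credit_rate

annual_premium = 50_000.0   # hypothetical cyber-insurance premium
membership_fee = 5_000.0    # hypothetical membership cost

savings = discount_value(annual_premium)   # 15% of the premium
net = savings - membership_fee             # positive = membership pays for itself
print(f"credit: ${savings:,.0f}, net benefit: ${net:,.0f}")
```

Under these assumed numbers the credit is worth $7,500 a year against a $5,000 fee, which is the kind of carrot-rather-than-stick incentive the message contrasts with the tax proposal.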
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Fri, 30 Nov 2007, silky wrote:
> i still think all these ideas are wrong and the model is simple: don't
> employ people who write and generate insecure code. it's just part of
> programming. you wouldn't hire a doctor to be a gardener. don't hire
> an idiot to program your apps.

How does a manager who hasn't written code in the last 10 years (if ever) know how to distinguish the idiots from the experts? Secure programming certification and education is, at best, in its infancy.

- Steve
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Fri, 30 Nov 2007, Shea, Brian A wrote:
> Software vendors will need a 3 tier approach to software security: Dev
> training and certification, internal source testing, external
> independent audit and rating.

I don't think I've seen enough emphasis on this latter item. A sufficiently vibrant set of independent testing organizations that follows some established procedures would be one way for customers to get an independent guarantee of software's (relative) security. This in turn could put pressure on other vendors to follow suit.

The challenges would be defining what those procedures should be, maintaining them so that they remain relevant, convincing existing research organizations to participate, and handling the problem of free (as in beer) software.

A gazillion years ago, John Tan of the L0pht proposed an "Underwriters Laboratories" for software, and maybe its time is almost upon us.

- Steve
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
IMO the path to changing the dynamics of secure coding will reside in the market, the courts, and the capacity of the software industry to measure and test itself and to demonstrate the desired properties of security, quality, and suitability for purpose. In today's market we do well on suitability for purpose (aka marketing, then testing, pilot, and purchase), but I believe we do poorly on security and quality.

Rather than trying to "tax" software for being bad, it will likely be more successful to reward vendors for being good, in the form of market support and sales. That dynamic will probably work better, and will empower the companies and individuals (who choose to get this involved) to weigh security and quality against cost or convenience. The punishment for bad software is lost sales and the eventual loss of the use of that product within the market.

How? Software vendors will need a 3 tier approach to software security: Dev training and certification, internal source testing, external independent audit and rating. The open source version of this can be the same, but applied more individually or at the "derivative product" level (i.e. if I make a Linux-based appliance from open source, I become the owner of the issues in that Linux derivative).

The legal side will need to move the EULA away from a "hold harmless" model to one where vendors and software buyers can assert that the software is expected to perform at certain security or quality levels, with known backing on the legal recourse side. Companies that make/sell software and can be sued offer more recourse than open source, but that's not better or worse - just different. The buyer can choose the degree of security and quality rating (based on the audit etc.) and the legal recourse they want, via the SLA or the choice to use open source.
For a high-assurance system one might choose a software vendor with a contract and SLA, while choosing open source for lower-assurance efforts that come with less recourse but less cost too. This opens a market for companies who choose to "resell" open source but provide the support and assurances. It also allows anyone to choose which factors are important, and buy / use accordingly.

If a choice results in "downstream" impacts, then the deployer of the software is initially accountable, and they must determine whether they have recourse via SLA or contract to their provider. If they accepted the risk of a non-supported software package, then their deployment and the ensuing harm are their responsibility. If they have recourse, as the saying goes, "they pass the savings on".

Home users are also empowered if they choose to be, but overall they gain in two major ways: the market will be driven to more secure software over time without their direct involvement (as companies and governments choose to require software to be more secure), and they can benefit from any legal recourse that is available for notable security failures or quality gaps.

Using these factors, anyone could make decisions based on the need for recourse (courts), assurance (market), and quality (industry ratings and standards for security and quality) and come away with software that meets their needs in each area, without excluding open or closed source or leaving corporate / consumer customers unprotected.

DISCLAIMER: Views are my own, and not those of my employer, and were generated over a cup of coffee in a fairly stream-of-consciousness kind of way. Grains of salt not supplied, but are recommended when consuming the contents.
:)

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Leichter, Jerry
Sent: Friday, November 30, 2007 6:28 AM
To: der Mouse
Cc: SC-L@securecoding.org
Subject: Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
>>> Just as a traditional manufacturer would pay less tax by
>>> becoming "greener," the software manufacturer would pay less
>>> tax for producing "cleaner" code, [...]

>> And all of this completely ignores the $0 software "market". Who
>> gets hit with tax when a bug is found in, say, the Linux kernel?

> [I]f we grant that [the idea is] appropriate for for-fee software,
> it's easy to decide what happens with free software - though you won't
> like the answer: The user of the software pays anyway.

Well, it's full of enforcement issues; with for-pay software, the point of sale is a comparatively well-regulated point at which taxes can be applied, but there is no such convenient choke-point for gratuit software. How would you even *find* all the relevant users? Also, in the open-source world, people mix and match software to a degree not seen in the closed-source world. Does everyone who ever downloaded a copy pay? Everyone who's still running it? Everyone who ever ran it? What about people who fixed the relevant bug themselves? The only answers I can see are (1) to completely forbid software sharing between end users, even when it's not against copyright law, or (2) a massive DRM-style invasion of everyone's machines, so as to report exactly what software they're running to some enforcement authority. I can't see either one flying.

And, incidentally, why would you think I wouldn't like that answer? As far as I know I'm not under any jurisdiction considering such a stupid idea (yes, I consider it stupid), and if some other jurisdiction wants to break their software industry that badly, it's their lookout.

> The argument the author is making is that security problems impose
> costs on *everyone*, not just on the party running the software.
> [...externalities...]
> Imposing a tax is the classic economic answer to such a market
> failure. The tax's purpose is (theoretically) to transfer the
> externalized costs back to those who are in a position to respond.
> In theory, the cost for security problems - real or simply possible;
> we have to go with the latter because by the time we know about the
> former it's very late in the game -

So? Why is that a problem? It seems to me that someone who runs, say, Windows, with all its horrible security record, in such a way as to not cause a problem (this is not a hypothetical case), should not be taxed, because that user is not imposing any externalized costs on the world at large. There's a problem finding everyone who's offended, but it's no worse than the problem of finding all users of a piece of gratuit software.

> should be borne by those who develop the buggy code, and by those who
> choose to use it.

I can argue both ways wrt imposing it on the developers. Often enough, the bugs are not bugs, but rather an end user misapplying software. I've often enough written software that was perfectly fine in its intended application but, if misapplied, could be a risk.

/~\ The ASCII der Mouse
\ / Ribbon Campaign
 X  Against HTML [EMAIL PROTECTED]
/ \ Email! 7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007, at 6:35 PM, Leichter, Jerry wrote:
> So he's not completely naive, though the history of security metrics
> and standards - which tend to produce code that satisfies the
> standards without being any more secure - should certainly give one
> pause. One could, I suppose, give rebates based on actual field
> experience: Look at the number of security problems reported per year
> over a two-year period and give rebates to sellers who have low rates.

Right, so this is where I believe the entire idea would fall apart. I don't think we have adequate metrics today to measure products fairly. Basing the tax on field experience would also be problematic to measure well, although I could see this leading to development organizations getting some sort of actuarial score.

But the real problem with it, as I said, is metrics. Should it be based on (say) defect density per thousand lines of code as reported by (say) 3 independent static code analyzers? What about design weaknesses that go blissfully unnoticed by code scanners? (At least the field experience concept could begin to address these over time, perhaps.)

I do think that software developers who produce bad (security) code should be penalized, but at least for now, I still think the best way of doing this is market pressure. I don't think we're ready for more, on the whole, FWIW. But _consumers_ wield more power than they probably realize in most cases.

Cheers, Ken

-
Kenneth R. van Wyk
SC-L Moderator
KRvW Associates, LLC
http://www.KRvW.com
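[Editor's illustration of the "defect density per thousand lines of code as reported by 3 independent static code analyzers" metric floated above. The scanner findings and the agreement threshold are invented assumptions; a real scheme would also need to de-duplicate and validate findings:]

```python
# Defects-per-KLOC from multiple scanners, counting only findings that
# at least `require_agreement` independent tools report at the same site.
# (Hypothetical sketch; scanner outputs below are invented.)

from collections import Counter

def defect_density(findings_per_scanner, kloc, require_agreement=2):
    """Confirmed findings per thousand lines of code."""
    votes = Counter()
    for findings in findings_per_scanner:
        for location in set(findings):   # each scanner votes once per site
            votes[location] += 1
    confirmed = [loc for loc, n in votes.items() if n >= require_agreement]
    return len(confirmed) / kloc

# Hypothetical results: (file, line) sites flagged by each of 3 tools.
scanner_a = [("auth.c", 120), ("net.c", 88), ("ui.c", 42)]
scanner_b = [("auth.c", 120), ("net.c", 88)]
scanner_c = [("auth.c", 120), ("log.c", 7)]

density = defect_density([scanner_a, scanner_b, scanner_c], kloc=50)
print(density)  # confirmed defect sites per KLOC
```

Even this toy version shows the metric's blind spot Ken points out: a design weakness no scanner models contributes zero to the density, however serious it is.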
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
| > Just as a traditional manufacturer would pay less tax by
| > becoming "greener," the software manufacturer would pay less tax
| > for producing "cleaner" code, [...]
|
| > One could, I suppose, give rebates based on actual field experience:
| > Look at the number of security problems reported per year over a
| > two-year period and give rebates to sellers who have low rates.
|
| And all of this completely ignores the $0 software "market". (I'm
| carefully not saying "free", since that has too many other meanings,
| some of which have been perverted in recent years to mean just about
| the opposite of what they should.) Who gets hit with tax when a bug
| is found in, say, the Linux kernel? Why?

I'll answer this along my understanding of the lines of the proposal at hand. I have my doubts about the whole idea, for a number of reasons, but if we grant that it's appropriate for for-fee software, it's easy to decide what happens with free software - though you won't like the answer: The user of the software pays anyway. The cost is computed in some other way than as a percentage of the price - it's not clear exactly how. Most likely, it would be the same tax as is paid by competing non-free software with a similar security record. (What you do when there is no such software to compare against is an interesting problem for some economist to work on.)

The argument the author is making is that security problems impose costs on *everyone*, not just on the party running the software. This is a classic economic problem: If a party can externalize its costs - i.e., dump them on other parties - its incentives become skewed. Right now, the costs of security problems for most vendors are externalized. Where do they go? We usually think of them as borne by that vendor's customers. To the degree that's so, the customers will have an incentive to push costs back onto the vendor, and eventually market mechanisms will clean things up.
To some degree, that's happened to Microsoft: however effective or ineffective their security efforts, it's impossible to deny that they are pouring large sums of money into the problem. To the degree that the vendors' customers can further externalize the costs onto the general public, however, they have no incentive to push back either, and the market fails. This is pretty much the case for personal, as opposed to corporate, users of Microsoft's software.

Imposing a tax is the classic economic answer to such a market failure. The tax's purpose is (theoretically) to transfer the externalized costs back to those who are in a position to respond. In theory, the cost of security problems - real or merely possible; we have to go with the latter because by the time we know about the former it's very late in the game - should be borne by those who develop the buggy code, and by those who choose to use it. A tax on the code itself takes directly from the users of the code, and indirectly from the vendors, because they will find it more difficult to compete with vendors who pay lower tax rates by virtue of having written better code. It's much harder to impose the costs directly on the vendors. (One way is to require them to carry insurance - something we do with, say, trucking companies.)

In any case, these arguments apply to free software in exactly the same way they do to for-fee software. If I give cars away for free, should I be absolved of any of the costs involved if they pollute, or cause accidents? If I'm absolved, should the recipients of those cars also be absolved? If you decide the answer is "yes", what you've just decided is that *everyone* should pay a hidden tax to cover those costs. In what way is *that* fair?
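The externality argument above can be made concrete with a toy calculation. All numbers and vendor names below are invented purely for illustration; this is only a sketch of how a Pigouvian-style tax would fold the cost dumped on the public back into each product's price:

```python
# Toy illustration of taxing externalized security costs.
# Two hypothetical vendors sell equivalent software at the same sticker
# price; VendorB writes better code, so the expected security cost its
# product dumps on the public (the externality) is lower.
public_cost_per_copy = {"VendorA": 40.0, "VendorB": 10.0}
base_price = 100.0

# A tax equal to the externalized cost moves that cost back into the
# price, so buyers see the true cost of each option.
for vendor, ext_cost in public_cost_per_copy.items():
    taxed_price = base_price + ext_cost
    print(f"{vendor}: sticker ${base_price:.0f}, "
          f"price with tax ${taxed_price:.0f}")

# Without the tax, both products cost $100 and buyers have no reason to
# prefer the more secure one; with it, VendorB's product is $30 cheaper,
# and the market rewards the better code.
```

The same arithmetic shows why free software doesn't escape the logic: a $0 sticker price with a $40 externality still costs someone $40 per copy; the tax merely decides who pays it.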
--
Jerry

___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
___
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
your plan would simply result in vendors denying the existence of bugs.

i still think all these ideas are wrong and the model is simple: don't employ people who write insecure code. secure coding is just part of programming. you wouldn't hire a doctor to be a gardener. don't hire an idiot to program your apps.

On 11/30/07, Andy Steingruebl <[EMAIL PROTECTED]> wrote:
> On Nov 29, 2007 2:47 PM, Kenneth Van Wyk <[EMAIL PROTECTED]> wrote:
> >
> > The article quotes David Rice, who has a book out called
> > "Geekconomics: The Real Cost of Insecure Software". In it, he tried
> > to quantify how much insecure software costs the public and, more
> > controversially, proposes a "vulnerability tax" on software
> > developers. He believes such a tax would result in more secure
> > software.
>
> I like contractual approaches to this problem myself. People buying
> large quantities of software (large enterprises, governments) should
> get contracts with vendors that specify money back for each patch they
> have to apply where the root cause is of a given type. For example, I
> get money back every time the vendor has a vulnerability and patch
> related to a buffer overflow.
>
> I wrote a small piece about this:
> http://securityretentive.blogspot.com/2007/09/buffer-overflows-are-like-hospital.html
>
> Turns out that the federal government isn't paying for avoidable
> outcomes anymore. Certain things fall into the rough category of
> "negligence" and so aren't covered. We ought to just do this for
> software via a contracts mechanism. I'm not sure we want to start out
> with a big-bang public-policy approach on this issue. We'd want to
> know a lot more about how the economics work out on a small scale
> before applying it to all software.
>
> --
> Andy Steingruebl
> [EMAIL PROTECTED]

--
mike
http://lets.coozi.com.au/
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 6:07 PM, Blue Boar <[EMAIL PROTECTED]> wrote:
> Andy Steingruebl wrote:
> > I like contractual approaches to this problem myself. People buying
> > large quantities of software (large enterprises, governments) should
> > get contracts with vendors that specify money-back for each patch they
> > have to apply where the root cause is of a given type. For example, I
> > get money back every time the vendor has a vulnerability and patch
> > related to a buffer overflow.
>
> That changes the incentive to hide security bugs and not patch them or
> to slipstream them.

Any regulatory regime that deals with security issues is subject to the same thing. Whether it's PCI and eluding auditors, or SOX-404 and documenting controls, you'll always have people who want to try to game the system.

I'm not suggesting that this is the only solution, but from an economics and motivation perspective, SLAs related to software and security features are more likely to work, and to incur lower overhead, than a regulatory regime that is centrally administered. Sure, there are going to be pieces of software this scheme won't work for, or where there aren't very many bulk purchasers, only one-off purchasers - things like video games, where there aren't large institutional purchases. That said, I think contracts between large consumers and software producers would be a good start on the problem.

--
Andy Steingruebl
[EMAIL PROTECTED]
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
Andy Steingruebl wrote:
> I like contractual approaches to this problem myself. People buying
> large quantities of software (large enterprises, governments) should
> get contracts with vendors that specify money-back for each patch they
> have to apply where the root cause is of a given type. For example, I
> get money back every time the vendor has a vulnerability and patch
> related to a buffer overflow.

That changes the incentive: to hide security bugs and not patch them, or to slipstream the fixes.

BB
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
> Just as a traditional manufacturer would pay less tax by
> becoming "greener," the software manufacturer would pay less
> tax for producing "cleaner" code, [...]

> One could, I suppose, give rebates based on actual field experience:
> Look at the number of security problems reported per year over a
> two-year period and give rebates to sellers who have low rates.

And all of this completely ignores the $0 software "market". (I'm carefully not saying "free", since that has too many other meanings, some of which have been perverted in recent years to mean just about the opposite of what they should.) Who gets hit with tax when a bug is found in, say, the Linux kernel? Why?

/~\ The ASCII                      der Mouse
\ / Ribbon Campaign
 X  Against HTML            [EMAIL PROTECTED]
/ \ Email!     7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
On Nov 29, 2007 2:47 PM, Kenneth Van Wyk <[EMAIL PROTECTED]> wrote:
>
> The article quotes David Rice, who has a book out called
> "Geekconomics: The Real Cost of Insecure Software". In it, he tried
> to quantify how much insecure software costs the public and, more
> controversially, proposes a "vulnerability tax" on software
> developers. He believes such a tax would result in more secure
> software.

I like contractual approaches to this problem myself. People buying large quantities of software (large enterprises, governments) should get contracts with vendors that specify money back for each patch they have to apply where the root cause is of a given type. For example, I get money back every time the vendor has a vulnerability and patch related to a buffer overflow.

I wrote a small piece about this:
http://securityretentive.blogspot.com/2007/09/buffer-overflows-are-like-hospital.html

It turns out that the federal government isn't paying for avoidable outcomes anymore. Certain things fall into the rough category of "negligence" and so aren't covered. We ought to just do this for software via a contracts mechanism. I'm not sure we want to start out with a big-bang public-policy approach on this issue. We'd want to know a lot more about how the economics work out on a small scale before applying it to all software.

--
Andy Steingruebl
[EMAIL PROTECTED]
Re: [SC-L] Insecure Software Costs US $180B per Year - Application and Perimeter Security News Analysis - Dark Reading
| FYI, there's a provocative article over on Dark Reading today.
| http://www.darkreading.com/document.asp?doc_id=140184
|
| The article quotes David Rice, who has a book out called
| "Geekconomics: The Real Cost of Insecure Software". In it, he tried
| to quantify how much insecure software costs the public and, more
| controversially, proposes a "vulnerability tax" on software
| developers. He believes such a tax would result in more secure
| software.
|
| IMHO, if all developers paid the tax, then I can't see it resulting in
| anything other than more expensive software... Perhaps I'm just
| missing something, though.

The answer to this is right in the article:

	Just as a traditional manufacturer would pay less tax by becoming
	"greener," the software manufacturer would pay less tax for
	producing "cleaner" code, he says. "Those software manufacturers
	would pay less tax and pass on less expense to the consumer, just
	as a regular manufacturing company would pass on less carbon tax
	to their customers," he says.

He does go on to say:

	It's not clear how the software quality would be measured ... but
	the idea would be for a software maker to get tax breaks for
	writing code with fewer security vulnerabilities. And the consumer
	ideally would pay less for more secure software because tax
	penalties wouldn't get passed on, he says. Rice says this taxation
	model is just one of many possible solutions, and would likely
	work in concert with tort law or tighter governmental regulations.

So he's not completely naive, though the history of security metrics and standards - which tend to produce code that satisfies the standards without being any more secure - should certainly give one pause.

One could, I suppose, give rebates based on actual field experience: look at the number of security problems reported per year over a two-year period and give rebates to sellers who have low rates.
There are many problems with this, of course - not least that it puts new developers in a tough position, since they effectively have to lend the money for the tax for a couple of years in the hope that they'll get rebates later, once their code is proven to be good.

--
Jerry