Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?
Regarding the code snippet -- it does depend on the environment -- point well taken. But in this case (from what I can tell), unless you actually have the file_exists() function *disabled* in php.ini, this is vulnerable to XSS.

- Greg

Sebastian Schinzel wrote, On 04/28/2010 04:03 AM:

On Apr 28, 2010, at 7:10 AM, SneakySimian wrote:

<?php
$file = $_GET['file'];
if (file_exists($file)) {
    echo $file;
} else {
    echo 'File not found. :(';
}

Ignoring the other blatant issues with that code snippet, is that vulnerable to XSS? No? Are you sure? Yes? Can you prove it? As it turns out, it depends on a configuration setting in php.ini. The only real way to know whether it is an issue is to run it in the environment it is meant to run in. Now, that's not to say that the developer who wrote that code shouldn't be told to fix it in a source code analysis, but the point is, some issues are wholly dependent on the environment and may or may not get caught during code analysis. Other issues, such as code branches that do or don't execute in certain environments, can be problematic to spot during normal source code analysis.

So you suggest actually performing n black-box tests, where n is the set of all possible permutations of all variables in php.ini (hint: n will be very large)? This is certainly not feasible.

Your code shows a very simple data flow, which may or may not be exploitable. But that is not the point. The point of software security is to increase the reliability of the software when under attack. Reliable software performs output encoding when user input is printed to HTML. This code does not perform output encoding and should therefore be fixed. The discussion about whether or not this is exploitable on which platforms is a waste of time. In many cases, you will spend a lot of time trying to get a working exploit, whereas the actual fix for the code takes a fraction of the time.
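A minimal sketch of the fix Sebastian is describing -- output encoding with PHP's htmlspecialchars() -- might look like this (my illustration, not code from the thread):

```php
<?php
// Encode user input before it reaches HTML, so exploitability
// no longer depends on any php.ini setting.
$file = isset($_GET['file']) ? $_GET['file'] : '';
if (file_exists($file)) {
    echo htmlspecialchars($file, ENT_QUOTES, 'UTF-8');
} else {
    echo 'File not found. :(';
}
```

With this change, a payload like `?file=<script>...` is rendered as inert text rather than markup, whatever the environment.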
For me, penetration testing is solely a method to raise awareness and to gather new security requirements FOR a customer application FROM security researchers. Knowledge transfer from security researchers to the business is key here. It helps find actual attacks but does not help the customer write better code. Code audits (whether automated or manual) are the way to go to improve reliability by pointing out dangerous coding patterns.

My 0.02€...

Cheers
Sebastian

___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com) as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___
Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?
I am working on a collaborative effort trying to blog daily about a different software security bug. I am looking for comments on my blog on how to make it better. Maybe eventually we can turn this into an OWASP project. I am really just doing this because at the moment all I am doing is web penetration testing, and I want to make sure that I don't lose my code review skills. Any comments, positive or negative, would be very helpful. http://parsonsisconsulting.blogspot.com/

Thanks,
Matt Parsons, CISSP, MSM

On Wed, Apr 28, 2010 at 12:10 AM, SneakySimian sneaky.sim...@gmail.com wrote:

I couldn't let this one go. Having done both source code analysis and black-box testing, I see merits in both. The failure that was the Debian SSL bug is a prime example of why I prefer black-box testing. That's not to say things can't go wrong in black-box testing, because they do, but not all code behaves the same way in the same environment, so if you actually test it in the environment it is running in, you can then understand why the code behaves the way it does. Oversimplified example:

<?php
$file = $_GET['file'];
if (file_exists($file)) {
    echo $file;
} else {
    echo 'File not found. :(';
}

Ignoring the other blatant issues with that code snippet, is that vulnerable to XSS? No? Are you sure? Yes? Can you prove it? As it turns out, it depends on a configuration setting in php.ini. The only real way to know whether it is an issue is to run it in the environment it is meant to run in. Now, that's not to say that the developer who wrote that code shouldn't be told to fix it in a source code analysis, but the point is, some issues are wholly dependent on the environment and may or may not get caught during code analysis. Other issues, such as code branches that do or don't execute in certain environments, can be problematic to spot during normal source code analysis. That all said, I do enjoy reading code, especially the comments from other developers.
:P

On Tue, Apr 27, 2010 at 2:29 PM, Andre Gironda and...@gmail.com wrote:
On Tue, Apr 27, 2010 at 4:08 PM, Arian J. Evans arian.ev...@anachronic.com wrote:

I think everyone would agree that you definitely want to apply additional (deeper?) degrees of analysis and defensive compensating controls to high-value and high-risk assets. The tough question is what ruler you use to justify the degree of security investment against the degree of potential risk/loss.

That requires information sharing and trend analysis, something that our classic vulnerability management programs have also not solved.

Join us on IRC: irc.freenode.net #webappsec
Have a question? Search The Web Security Mailing List Archives: http://www.webappsec.org/lists/websecurity/archive/
Subscribe via RSS: http://www.webappsec.org/rss/websecurity.rss [RSS Feed]
Join WASC on LinkedIn http://www.linkedin.com/e/gis/83336/4B20E4374DBA

--
Matt Parsons, CISSP
315-559-3588 Blackberry
817-238-3325 Home Office
mparsons1...@gmail.com
www.parsonsisconsulting.com
Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?
On Apr 28, 2010, at 7:10 AM, SneakySimian wrote:

<?php
$file = $_GET['file'];
if (file_exists($file)) {
    echo $file;
} else {
    echo 'File not found. :(';
}

Ignoring the other blatant issues with that code snippet, is that vulnerable to XSS? No? Are you sure? Yes? Can you prove it? As it turns out, it depends on a configuration setting in php.ini. The only real way to know whether it is an issue is to run it in the environment it is meant to run in. Now, that's not to say that the developer who wrote that code shouldn't be told to fix it in a source code analysis, but the point is, some issues are wholly dependent on the environment and may or may not get caught during code analysis. Other issues, such as code branches that do or don't execute in certain environments, can be problematic to spot during normal source code analysis.

So you suggest actually performing n black-box tests, where n is the set of all possible permutations of all variables in php.ini (hint: n will be very large)? This is certainly not feasible.

Your code shows a very simple data flow, which may or may not be exploitable. But that is not the point. The point of software security is to increase the reliability of the software when under attack. Reliable software performs output encoding when user input is printed to HTML. This code does not perform output encoding and should therefore be fixed. The discussion about whether or not this is exploitable on which platforms is a waste of time. In many cases, you will spend a lot of time trying to get a working exploit, whereas the actual fix for the code takes a fraction of the time.

For me, penetration testing is solely a method to raise awareness and to gather new security requirements FOR a customer application FROM security researchers. Knowledge transfer from security researchers to the business is key here. It helps find actual attacks but does not help the customer write better code.
Code audits (whether automated or manual) are the way to go to improve reliability by pointing out dangerous coding patterns.

My 0.02€...

Cheers
Sebastian
Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?
There is no reason the php.ini and other framework or app server configuration files can't be taken into account in a static analysis. Veracode performs static analysis of an application in its final executable form: for compiled languages that is a binary executable, for managed languages it is the compiled bytecode, and for interpreted languages it is the source code. If there are standard configuration files that are part of the runtime environment for frameworks or app servers, they are considered part of the final executable version of the app. If the configuration files are missing, then a worst-case analysis is performed. Of course, the php.ini submitted to the static analyzer might not match the one running in production, but you can have the same issue doing dynamic testing on a staging environment. It can be a goal for both static and dynamic testing to have the analysis come as close as possible to what will be in production.

-Chris

-----Original Message-----
From: SneakySimian [mailto:sneaky.sim...@gmail.com]
Sent: Wednesday, April 28, 2010 1:10 AM
To: Andre Gironda
Cc: websecurity; Secure Coding; Adam Muntner; Arian J. Evans
Subject: Re: [WEB SECURITY] Re: [SC-L] What do you like better Web penetration testing or static code analysis?

I couldn't let this one go. Having done both source code analysis and black-box testing, I see merits in both. The failure that was the Debian SSL bug is a prime example of why I prefer black-box testing. That's not to say things can't go wrong in black-box testing, because they do, but not all code behaves the same way in the same environment, so if you actually test it in the environment it is running in, you can then understand why the code behaves the way it does. Oversimplified example:

<?php
$file = $_GET['file'];
if (file_exists($file)) {
    echo $file;
} else {
    echo 'File not found. :(';
}

Ignoring the other blatant issues with that code snippet, is that vulnerable to XSS? No? Are you sure? Yes? Can you prove it?
As it turns out, it depends on a configuration setting in php.ini. The only real way to know whether it is an issue is to run it in the environment it is meant to run in. Now, that's not to say that the developer who wrote that code shouldn't be told to fix it in a source code analysis, but the point is, some issues are wholly dependent on the environment and may or may not get caught during code analysis. Other issues, such as code branches that do or don't execute in certain environments, can be problematic to spot during normal source code analysis. That all said, I do enjoy reading code, especially the comments from other developers. :P

On Tue, Apr 27, 2010 at 2:29 PM, Andre Gironda and...@gmail.com wrote:
On Tue, Apr 27, 2010 at 4:08 PM, Arian J. Evans arian.ev...@anachronic.com wrote:

I think everyone would agree that you definitely want to apply additional (deeper?) degrees of analysis and defensive compensating controls to high-value and high-risk assets. The tough question is what ruler you use to justify the degree of security investment against the degree of potential risk/loss.

That requires information sharing and trend analysis, something that our classic vulnerability management programs have also not solved.
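For context: the thread never names the php.ini setting SneakySimian alludes to. One plausible candidate (my assumption, not stated anywhere in the thread) is the filter extension's default input filter, which can transparently sanitize all GPC input before a script runs:

; php.ini fragment (assumes ext/filter is loaded)
; With filter.default = special_chars, the < and > of an injected
; payload arrive in $_GET already HTML-encoded, so "echo $file" is
; not exploitable -- while the default, unsafe_raw, leaves the
; snippet wide open. Same code, different outcome per environment.
filter.default = special_chars

This is exactly the kind of environment dependence that a static analyzer can only see if the configuration file is included in the analysis, as Chris describes above.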
Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?
So - just to make sure I understand - you are saying you don't actually perform all of these activities for clients to help them secure their web software today? I think that will be a relief to some on the list. I know a few people called me, concerned that they were never going to have time to sleep again with all that to do!

Overall, you do make interesting points with your ideas. I definitely agree with your assertion that automation alone has significant limitations. This is definitely the right forum to bounce around your ideas about what types of secure-coding/analysis activities may work, what activities we might want to try out, and what the best books to read are, to help us figure out how to secure the bazillions of web applications that exist today.

Ciao,
--- Arian Evans

On Tue, Apr 27, 2010 at 12:52 PM, Andre Gironda and...@gmail.com wrote:
On Tue, Apr 27, 2010 at 11:52 AM, Arian J. Evans arian.ev...@anachronic.com wrote:

So to be clear - you are saying that you do all of the below when you are analyzing hundreds to thousands of websites to help your customers identify weaknesses that hackers could exploit? How do you find the time?

Not me personally, but the industry as a whole does provide most of these types of coverage. Everyone sees it a different way, and probably everybody is right. What I do find wrong is assuming that you can automate everything (especially risk and vulnerability decisions -- is this a vuln; is this a risk?). What I also find wrong is that the tools which attempt to automate finding vulnerabilities and assigning risk (but can't deliver on either) cost $60k/assessor for a code scan or $2k/app for a runtime scan.
A team (it doesn't have to be security people, but it should probably include at least one) should instead use a free tool such as Netsparker Community Edition; crawl a target app through a few proxies (a few crawl runs) such as Burp Suite Professional, Casaba Watcher, and Google Ratproxy; do a few other things, such as tracking the actions a browser would take (in XPath expression format) and plotting a graph of dynamic-page/HTML/CSS/JS/SWF/form/parameter/etc. objects (to show the complexity of the target app); and provide a data corpus (not just a database or list of findings) to allow the reviewers to make more informed decisions about what has been tested, what should be tested, and what will provide the most value. Combine the results with FindBugs, CAT.NET, VS PREfix/PREfast, cppcheck, or other static-analysis reports in order to generate more value in making informed decisions. Perhaps cross-correlate and map URLs to source code. I call this Peripheral Security Testing.

Then, if time allows (or the risk to the target app is seen as great enough), add in a threat model and allow the team to perform penetration testing based on those informed decisions. I call this latter part Adversarial Security Testing.

Does manual testing take more time, or does it instead find the right things and allow the team to make almost fully informed decisions, thus saving time? I will leave that as a straw-man argument that you can all debate.

dre