
Security Assessment Techniques: Code Review v Pen Testing

by Srini Penchikala on Dec 06, 2010

Web application security testing and assessment should include both security code review and penetration testing techniques. Dave Wichers, an OWASP board member, spoke at the recent AppSec DC 2010 Conference about the pros and cons of code review and pen testing approaches in finding security vulnerabilities in web applications.

Dave said that both code review and pen testing can leverage automated analysis tools to assist security engineers in the process. The code review should cover all custom-developed code as well as the configuration files of the application, its libraries and frameworks, and the server where the application is deployed.

He compared the strengths and weaknesses of each approach. The strengths of pen testing are that it requires less specialized expertise, is easier to set up and perform, exercises the entire application infrastructure, and proves that the vulnerabilities are real. Code review, on the other hand, makes it easier to find all of the application's content, to find all instances of a given type of flaw, and to verify both that security controls are implemented correctly and that they are used in all the required places.

Dave also showed how these two techniques stack up against the OWASP Top 10 list of security vulnerabilities. Code review is the better approach for finding vulnerabilities in the following categories:

  • Injection flaws
  • Cross-Site Scripting (XSS)
  • Direct Object References
  • Cross-Site Request Forgery (CSRF)
  • URL Based Access Control
  • Crypto Storage
  • Redirect/Forward flaws
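To illustrate why injection flaws top the code-review column, here is a minimal, hypothetical sketch (not from the talk) using Python's built-in sqlite3 module. To a black-box scanner the two variants behave identically on benign input, but a reviewer reading the source can distinguish them at a glance:

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # The flaw a reviewer flags on sight: user input concatenated into SQL.
    cursor = conn.execute(
        "SELECT id FROM users WHERE name = '" + username + "'")
    return cursor.fetchall()

def find_user_safe(conn, username):
    # The control a reviewer verifies: input bound as a parameter,
    # never parsed as SQL.
    cursor = conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,))
    return cursor.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")
conn.execute("INSERT INTO users VALUES (2, 'bob')")

payload = "' OR '1'='1"
print(find_user_vulnerable(conn, payload))  # → [(1,), (2,)]  every row leaks
print(find_user_safe(conn, payload))        # → []  payload treated as a name
```

A pen tester has to guess the right payload and observe the difference from the outside; the reviewer simply greps for string concatenation in query construction.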

And penetration testing is the winner in finding security flaws in the following categories:

  • Configuration flaws
  • Transport Security flaws
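These two categories favor pen testing because they are properties of the running deployment, not the source code; a scanner observes them directly from the outside, for example by inspecting HTTP response headers. A simplified, hypothetical sketch of such a check (the header list is illustrative, not exhaustive):

```python
def missing_security_headers(headers):
    # A few headers a transport/configuration scan typically checks for
    # (illustrative list, not exhaustive).
    expected = ["Strict-Transport-Security",
                "X-Content-Type-Options",
                "X-Frame-Options"]
    present = {name.lower() for name in headers}
    return [h for h in expected if h.lower() not in present]

# Headers as captured from a live response during a scan.
resp = {"Content-Type": "text/html", "X-Frame-Options": "DENY"}
print(missing_security_headers(resp))
# → ['Strict-Transport-Security', 'X-Content-Type-Options']
```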

Other issues, such as authentication and session management flaws, can be discovered with either technique. Both provide value in finding authentication-related issues such as Account Lockout, Strong Credential Requirements, Authentication Event Logging, Proper Invalidation of Sessions on Logout, and Sufficiently Random Session Tokens.
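For instance, "Sufficiently Random Session Tokens" is a one-line check in code review (where does the token come from?) but a statistical sampling exercise in a pen test. A hypothetical sketch of what the reviewer looks for, using Python's standard library:

```python
import random
import secrets

def insecure_token():
    # What a reviewer rejects: random.getrandbits() uses the Mersenne
    # Twister PRNG, whose output is predictable from prior observations.
    return "%016x" % random.getrandbits(64)

def secure_token():
    # What a reviewer accepts: secrets draws from the OS CSPRNG.
    return secrets.token_urlsafe(32)  # 32 random bytes, URL-safe base64

print(len(secure_token()))  # → 43 (32 bytes base64-encoded, padding stripped)
```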

Security teams should use both techniques in their security testing efforts, but the advantage of code review grows significantly with the size of the application portfolio and the level of rigor of the assessment. Dave concluded by saying that it is a myth that code review is far more expensive: with people who have the right skills, it is actually faster and more effective.
 


Web app scans vs Manual Code Reviews by Jeevak Kasarkod

Almost all the common security vulnerabilities, such as SQL injection, cross-site scripting, and CSRF, can be uncovered by web app/web service scanning tools, and these should comprise the majority of the code review and penetration testing coverage. There are also static code analysis tools, which reduce the cost of fixing code later in the software lifecycle. These tools produce metrics-based OWASP Top 10 reports by categorizing vulnerabilities into the appropriate buckets. The challenge is building more intelligence into static code analysis tools so they can discover more vulnerabilities upfront.

Re: Web app scans vs Manual Code Reviews by HP Rules

Yes, every HP product should comprise the majority of the code review and penetration testing coverage! WebInspect is flawless in its false positive reporting. Fortify is a point-and-click static code analyzer that has 100% coverage in all scenarios! And stop using well-trained analysts to assist in the review. Just use HP professional services to run the tools. Your vulnerability count will go to 0 in no time at all. Please contact an HP rep today for all your lulz.

Re: Web app scans vs Manual Code Reviews by Dave Wichers

Thanks Srini for the nice post about my talk at the OWASP conference.

I believe that Jeevak is from HP, so he might be a little biased about the capabilities of automated analysis tools. Such tools have their value, but they also have serious limitations, and they require far more expertise to use effectively than most people understand.

The sarcastic reply from 'HP Rules', whoever that is, is just pointing out that the previous poster is most likely from a product vendor. I'm sure HP isn't promising anything as wildly fantastic as this sarcastic reply, but customers of such tools do need to clearly understand the strengths and weaknesses of such tools and the level of expertise, integration, and customization required to get good value from any investment in automated analysis tools.

Aspect Security has worked with numerous customers who first invested large amounts of money in tools but didn't invest much in training, integration, or customization, or in making sure they had reasonably skilled staff using these tools. It's certainly appealing to imagine that tools can handle the bulk of the effort required to discover vulnerabilities and determine the severity of the risks they represent. In practice, the rate of false positives and false negatives is still so high that significant expert involvement is required to triage what the tools found, to find the flaws the tools can't find, and to prioritize all of it so that the most important risks rise to the top, rather than simply the risks that are easiest to identify with automation. After we worked with these customers to integrate and customize these tools for their environment, they started to receive much more value from their investment in automation than they were getting before.

One other thing I want to point out is that finding flaws, which was the focus of my talk, is still the least expensive part of the application security problem, and yet it is getting the most focus today. The expensive part is building secure applications in the first place, or remediating the issues found through application security assessments. Training developers, arming them with reusable security controls, introducing security architecture reviews and threat modeling, and establishing secure coding guidelines is where you will get the most bang for your buck. Yet automated analysis gets the lion's share of many naive organizations' attention, because it is so appealing to imagine that it will solve their problem, or at least be the best first place to invest.

In summary, I think automation clearly has a useful role in both code analysis and application penetration testing, but it requires far more expert human investment to get significant value out of these tools than most people realize, and they are simply unable to discover some of the highest-risk issues, such as flaws in authentication, access control, and context-aware input validation and output encoding.
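[As an editorial illustration of that last point, not part of the reply above: "context-aware output encoding" means choosing the encoding for the exact context the data lands in. A minimal sketch of the simplest case, the HTML body context, using Python's standard html module; attribute, JavaScript, and URL contexts each need different rules:]

```python
import html

def render_comment(comment):
    # HTML-body context: escape &, <, >, and quotes before interpolating
    # untrusted text into the markup.
    return "<p>" + html.escape(comment, quote=True) + "</p>"

print(render_comment("<script>alert(1)</script>"))
# → <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```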
