Rolling Review Wrap-up: Web Application Scanners

Companies making heavy use of Ajax that don't want to expose themselves to attack should hire some protection, whether a skilled in-house Web security tester, a SaaS capable of handling Ajax, or the most expensive option, a consultant. With the exception of IBM's AppScan, automated Web application scanners are simply not yet up to the task of finding security flaws in Ajax code. And it's not like we made it hard on them: The Ajax applications we used in testing were relatively simple. None of the vulnerabilities we expected our scanners to find was advanced or required complex analysis of client-side code. Rather, they were traditional Web application security vulnerabilities, just exposed through an updated Ajax interface. As long as the scanners under test could navigate the application, identifying the vulnerabilities should have been a walk in the park.

Instead, most ended up unable to automatically crawl the applications, requiring a human to surf through the site to teach the scanner where to prod and poke.
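The root of the crawling problem is that a traditional spider harvests static `href` links from the HTML, while an Ajax application often builds its requests in JavaScript at runtime. The minimal sketch below illustrates the gap; the page snippets, the `/search` endpoint, and the `naive_crawl` helper are all hypothetical, invented for illustration rather than taken from the applications we tested.

```python
import re

# Hypothetical traditional page: the endpoint appears as a static link.
TRADITIONAL_PAGE = '<html><body><a href="/search?q=test">Search</a></body></html>'

# Hypothetical Ajax page: the same endpoint is reachable only through
# script that runs in the browser, so it never appears as an href.
AJAX_PAGE = """
<html><body>
<div id="results"></div>
<script>
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/search?q=" + encodeURIComponent(term));
  xhr.send();
</script>
</body></html>
"""

def naive_crawl(html):
    """Collect URLs the way a simple spider does: static href attributes only."""
    return re.findall(r'href="([^"]+)"', html)

print(naive_crawl(TRADITIONAL_PAGE))  # finds ['/search?q=test']
print(naive_crawl(AJAX_PAGE))         # finds [] -- endpoint invisible without executing the script
```

A scanner that only crawls this way never even sees the Ajax endpoint, let alone tests it for injection flaws, which is why the products in this review needed a human to drive the browser for them.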

This article is part of NWC's Rolling Review of Web Application Scanners; all the features and reviews in the series are available on the Rolling Reviews home page.

As we wrap up our four-month Rolling Review series, we do want to award some partial credit. While only IBM's Watchfire AppScan automatically handled our Ajax applications, Acunetix Web Vulnerability Scanner, Cenzic Hailstorm and Hewlett-Packard WebInspect (post-update) were capable of analyzing and detecting vulnerabilities in the Ajax application, albeit only when we manually walked them through the relevant bits.

Unfortunately, that's just not good enough. Much of the value of a scanner is that it's a repeatable, exhaustive crawler. Requiring a human to replace the automated spider reduces the code coverage, and thus the effectiveness, of the scanner. So while we don't give those products a complete failing grade, they have a ways to go before they can claim to be truly Ajax-capable. Until then, expect to dig into code manually.

Long Strange Trip

When we kicked off this Rolling Review, in the May 14 issue of Network Computing, we knew a few things: That browsers are insecure, and rich Internet applications (RIAs) make us nervous. That Ajax is a wildly popular development environment yet very challenging from a security standpoint. And that few developers are committed to secure coding.
