RSA, part 2: static analysis

A continuation of RSA: “It feels like something’s missing”

RSA’s a tough show for static analysis companies, but several were there. Ounce had the largest booth and an excellent message (“listen to your code”); Veracode, Armorize, and Fortify had a smaller presence. However, I didn’t actually spend much time at the booths or looking at the details of any specific technology, instead talking with various folks I ran into about the strategic possibilities.

John Viega (formerly of Secure Software and Symantec, and still on the Fortify advisory board) made a very interesting point about the market: he pegged the total size at somewhere around $700 million/year, but the vast majority of that is for services — the tools market proper is still probably less than $100 million. This is a pretty accurate reflection of the limitations and usability challenges of today’s tools, and of the difficulty of getting them broadly deployed. More positively, it also highlights the huge upside for tools and solutions companies that can crack these problems.

Chris Wysopal of Veracode and I briefly chatted (at a Microsoft-sponsored community event, of all places) about the possibilities of combining multiple analyses, including binary-based and source-code-based. As he pointed out, most tools have at least some areas where they’re the best — particular defects, languages, environments, or even coding styles.

From the customer perspective, being able to examine all their outputs together has huge advantages — and this is also great for penetration testers and other companies that offer security services. From a tools perspective, the “toolbox” framework Scott Page uses in his work on cognitive diversity highlights that for some problems, combining the results of multiple diverse analyses is likely to give better results than any single analysis. [One obvious example: the intersection of defects produced by two or more independent high-noise analyses may be a low-noise subset.] Of course, all the individual tools already do this to some extent, for example by combining lighter-weight global analyses with more detailed local analyses; still, I came away thinking that this could be an area where significant improvements are possible.
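To make that intersection idea concrete, here’s a minimal sketch in Python; the Finding record and the tool outputs are made up for illustration, not any real tool’s format:

    from collections import namedtuple

    # A hypothetical normalized finding: file, line, and defect class (CWE id).
    Finding = namedtuple("Finding", ["file", "line", "cwe"])

    def low_noise_subset(*tool_reports):
        """Findings flagged by every tool. If each tool's false positives
        are roughly independent, the intersection should be much lower-noise
        than any single report."""
        return set.intersection(*(set(report) for report in tool_reports))

    # Toy example: two noisy tools agree on only one finding.
    tool_a = [Finding("auth.c", 42, "CWE-89"), Finding("util.c", 7, "CWE-120")]
    tool_b = [Finding("auth.c", 42, "CWE-89"), Finding("io.c", 130, "CWE-78")]
    print(low_noise_subset(tool_a, tool_b))

In practice, of course, deciding when two tools’ reports refer to the same underlying defect is itself a hard matching problem.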

To enable any kind of investigation of these cross-tool effects, there needs to be some kind of common format for the output. Somewhat surprisingly, there isn’t yet a broadly accepted XML definition for code-level defect information — although the format for the SAMATE project’s upcoming Static Analysis Tools Exposition (SATE) is a good start, so things are heading in the right direction. [And encouragingly, it includes a field for CWE id, so it ties in well with efforts on that front.]
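As a sketch of what consuming such a format might look like, here’s a short Python example; the element and attribute names are invented for illustration and are not SATE’s actual schema:

    import xml.etree.ElementTree as ET

    # A hypothetical common defect format; element and attribute names
    # are invented, not taken from any real schema.
    SAMPLE = """
    <findings tool="ExampleAnalyzer">
      <finding cwe="CWE-89" severity="high">
        <location file="auth.c" line="42"/>
        <message>SQL query built from unsanitized input</message>
      </finding>
    </findings>
    """

    root = ET.fromstring(SAMPLE)
    for finding in root.iter("finding"):
        loc = finding.find("location")
        print(finding.get("cwe"), loc.get("file"), loc.get("line"),
              finding.findtext("message").strip())

With a shared schema along these lines, every tool’s output drops into the same pipeline, and cross-tool experiments like the intersection above become straightforward.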

At the iSec forum/social on Thursday night, I ran into Sondra Schneider of Security University and brought up a topic Hugh Thompson of People Security and I had discussed earlier: the potential synergies between analysis tools and training. Sondra made the interesting point that in a lot of situations, people don’t think of these tools as defect detectors but rather as a way of making code review more effective.* I’ve wondered for quite a while whether there’s an opportunity for a tool focused primarily on code review, providing annotation and issue tracking as well as good visualization — like SourceInsight (a great code understanding tool) but with the much richer information you get from detailed analysis.** So there are a couple of interesting possibilities to follow up on here as well.

If you see a theme emerging here, you’re right. It ties back to the suggestion I made in my “Strategy, security, and static analysis: what’s next for me” post: the conditions appear perfect for a “team of rivals” strategy, where the interests of tools providers, consulting and services organizations, customers, and the security industry as a whole align.

jon

* and sure enough Armorize’s site leads with “Still struggling with manual code review? See how Armorize can help!”

** Zynamics’ BinNavi is probably the closest thing out there, although it focuses specifically on the much smaller and more technical reverse engineering market; and it’s not integrated with any issue tracking systems.


Comments

  1. Great post(s) Jon – some spot on insights and I’ve linked to both posts.

    I wholeheartedly agree with your “emerging trends” thoughts, but feel it would be very hard to get all the players talking and agreeing on anything – although I believe that making things more secure is in (most) people’s interests, there’s a lot of “turf guarding” going on.

    However, if this does happen and you want someone from the consulting side to offer their (free) services, then I’d love to be part of that discussion.

  2. Thanks, Mike!

    it would be very hard to get all the players talking and agreeing on anything – although I believe that making things more secure is in (most) people’s interests, there’s a lot of “turf guarding” going on.

    For sure. One key will be to find some neutral turf, which is part of why I’m so attracted to the defect format — as far as I know, nobody views this as a significant competitive advantage. Another is to make sure that each of the early players has some incentive: ways in which the shared effort increases the value of what they see as their turf.

    However, if this does happen and you want someone from the consulting side to offer their (free) services, then I’d love to be part of that discussion.

    Free? Free? Did you say free???? Sure, that’d be great! Details TBD 🙂

    Seriously, though, if this does go forward I’ll certainly take you up on this. Consultants are a big part of the industry; it’s important to make sure that their interests are reflected as well.
