Eyes Above The Waves

Robert O'Callahan. Christian. Repatriate Kiwi. Hacker.

Friday 23 June 2017

Rising Tolerance For Static Analysis False Positives?

When I was a young graduate student working on static analysis tools, the conventional wisdom was that a static analysis tool needed a low false-positive rate or no-one would use it. Minimizing false positives was a large chunk of the effort in every static analysis project.

It seems that times have changed: there are now communities of developers willing to tolerate high false-positive rates, at least in some domains. For example:

"It will also miss really obvious bugs apparently at random, and flag non-bugs equally randomly. The price of the tool is astronomical, but it's still worthwhile if it catches bugs that human developers miss."
Indeed, I've noticed people in various projects ploughing through reams of false positives in case any real issues are hiding among them; the sketch below shows the sort of report they have to triage.
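
To make the triage burden concrete, here is a minimal sketch (in C, with hypothetical names invented for illustration) of a classic false positive: a checker that cannot track a cross-field invariant warns about a dereference that is actually safe.

    #include <stddef.h>

    /* Hypothetical struct, for illustration only. The program maintains
       the invariant that buf is NULL only when len == 0. */
    struct message {
        char *buf;
        size_t len;
    };

    size_t first_byte_sum(const struct message *msgs, size_t n) {
        size_t sum = 0;
        for (size_t i = 0; i < n; i++) {
            if (msgs[i].len == 0)
                continue;  /* buf may be NULL here, so skip it */
            /* A checker that reasons about buf and len independently may
               flag the next line as a possible NULL dereference: a false
               positive a human has to read, understand, and dismiss. */
            sum += (unsigned char)msgs[i].buf[0];
        }
        return sum;
    }

Nothing here is wrong, but a tool that cannot see the invariant reports it anyway, and every such report costs reviewer time.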

I'm not sure what changed here. Perhaps people have just become more appreciative of the value of subtle bugs caught by static analysis tools. Maybe it's a good time to be a developer of such tools.

Comments

Mark Erikson
I would venture a guess that the name of the tool starts with "F" and ends with "ortify", in which case I can completely agree with the assessment of its behavior and capabilities. As for the cost/benefit tradeoff... well, let's just say it's "the bane of my existence" and leave it at that.