
Update: Facebook awards $50K Internet Defense Prize for Work on Securing Web Apps

Saying that research dollars for cyber security are disproportionately devoted to work on “offensive” techniques (like hacking), social media giant Facebook has awarded two researchers a $50,000 prize for their work on cyber defense.

Facebook awarded a $50,000 prize to researchers from Germany who developed a new method for discovering so-called “second order” security holes in web applications.

The company announced on Wednesday that it had awarded the prize to Johannes Dahse and Thorsten Holz, both of Ruhr-Universität Bochum in Germany, for their work on a method for making software less prone to being hacked.

The two developed a method for detecting so-called “second-order” vulnerabilities in Web applications using automated static code analysis. Their paper (PDF here) was presented at the 23rd USENIX Security Symposium in San Diego.

In a blog post announcing the prize, John Flynn, a security engineering manager at Facebook, said the Internet Defense Prize recognizes “superior quality research that combines a working prototype with significant contributions to the security of the Internet—particularly in the areas of protection and defense.”

Dahse and Holz’s work was chosen by a panel to receive the prize both on its technical merit and because panelists “could see a clear path for applying the award funds to push the research to the next level,” Flynn wrote.

Second-order vulnerabilities are distinct from first-order security holes like SQL injection and cross-site scripting. They allow an attacker to slip a malicious payload past a web application’s defenses and store it on the web server. That payload, which may sit in a shared resource on the application server such as a database or a file, is only triggered later, when the application reads it back, and can then be used to target all users of the application.
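To make that concrete, here is a minimal, hypothetical sketch (in Python rather than PHP, and not taken from the paper) of a second-order SQL injection: the malicious value is stored safely at first, and the flaw only fires when the application later reads it back and concatenates it into a query.

    import sqlite3

    # Toy, hypothetical example of a second-order SQL injection (not from the paper).
    # The malicious value is stored safely first; the flaw fires only when it is read back.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("CREATE TABLE notes (owner TEXT, body TEXT)")

    def register(name):
        # The write itself looks safe: it is properly parameterized.
        conn.execute("INSERT INTO users (name) VALUES (?)", (name,))

    def notes_for_latest_user():
        # Second-order flaw: the stored name is treated as trusted and concatenated into SQL.
        name = conn.execute("SELECT name FROM users ORDER BY rowid DESC LIMIT 1").fetchone()[0]
        return conn.execute("SELECT body FROM notes WHERE owner = '" + name + "'").fetchall()

    conn.execute("INSERT INTO notes (owner, body) VALUES (?, ?)", ("bob", "private note"))
    register("alice' OR '1'='1")      # payload is stored today...
    print(notes_for_latest_user())    # ...and leaks every user's notes when read back later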

Existing code analysis techniques have not been effective at identifying second-order vulnerabilities. Dynamic analysis (“fuzzing”) often misses them entirely, since nothing malicious happens at the moment the payload is submitted.

Existing static analysis techniques have also done a poor job of uncovering second-order vulnerabilities because of the way most modern web applications store data in external resources (such as a database or file). That data can later be recalled and used by the application, but may escape the notice of code analysis tools that merely follow the flow of untrusted data through the application, the researchers wrote in their paper.

The award-winning approach that the two developed improves analysis of untrusted data flows within web applications. The researchers built an automated method for collecting “all locations in persistent stores that are written to and can be controlled (tainted) by an adversary.” In essence, their analysis tool provides comprehensive auditing of data flows within a web application, but defers decisions about whether particular data flows are malicious until all “taintable writings to persistent stores are known.”
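A rough sketch of that two-phase idea, heavily simplified and not the authors’ actual tool (which analyzes PHP source code), might look like this: phase one collects every persistent-store location an attacker-controlled value can reach, and only then does phase two decide which reads and sinks to flag.

    # Heavily simplified sketch of deferred, two-phase taint analysis over a toy
    # program representation; the researchers' tool works on real PHP code.
    program = [
        ("input", "v1", "GET[name]"),           # v1 arrives from the request (tainted)
        ("write", "db.users.name", "v1"),       # tainted value written to a persistent store
        ("write", "db.config.title", "const"),  # constant write, not attacker-controlled
        ("read",  "v2", "db.users.name"),       # later read back from the store
        ("read",  "v3", "db.config.title"),
        ("sink",  "sql_query", "v2"),           # used in a security-sensitive sink
        ("sink",  "html_echo", "v3"),
    ]

    def analyze(program):
        tainted_vars, tainted_stores, findings = set(), set(), []

        # Phase 1: collect every persistent-store location that a tainted value can reach.
        for kind, target, source in program:
            if kind == "input":
                tainted_vars.add(target)
            elif kind == "write" and source in tainted_vars:
                tainted_stores.add(target)

        # Phase 2: only now, with all taintable writes known, decide which reads are dangerous.
        for kind, target, source in program:
            if kind == "read" and source in tainted_stores:
                tainted_vars.add(target)
            elif kind == "sink" and source in tainted_vars:
                findings.append((target, source))
        return findings

    print(analyze(program))  # [('sql_query', 'v2')] -- the config-backed sink is not flagged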

As one measure of its efficacy, the researchers applied their tool to six popular applications written in PHP, including OpenConf, HotCRP and osCommerce. They claim to have found and reported 159 previously unknown second-order vulnerabilities in the applications, including remote code execution holes affecting osCommerce and OpenConf.

The award does not give Facebook ownership of the researchers’ work, but Flynn said the company would get an update from the researchers on their progress developing the technology going forward.

In a post on the Secure Medicine blog, Kevin Fu of the University of Michigan, the program chair of the USENIX Security Symposium, said he was “delighted” that Facebook selected the USENIX Conference as a forum to search for groundbreaking defensive work. Fu wrote that Facebook intends to make this an annual prize, and may even increase the prize amount.

Chris Eng, the Vice President of Security Research at Veracode, said that second-order vulnerabilities are a tricky problem. “Most static analyzers simplify how they model Persistent Data Stores,” he wrote. For example, a static analysis tool trying to spot cross-site scripting vulnerabilities might decide to treat all database reads made by an application as ‘untrusted’ and potentially malicious.

“In this paper they attempt to differentiate between trusted and untrusted database reads, session variables, etc. In theory this should result in more accurate findings,” he said. But Eng anticipated that adapting the tool to work on typical enterprise applications would be a challenge, given their size and complexity.

“I would expect the approach to be less effective on enterprise code. In particular, they allude to the challenges in dealing with dynamically-generated SQL queries, which will be rampant in enterprise code bases,” Eng wrote. “Also in terms of complexity, the largest open source app they looked at was 66 KLOC; enterprise apps are often well into the millions of LOC and rely on numerous frameworks that also need to be modeled,” he said.
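To illustrate the kind of dynamically generated SQL Eng is referring to, here is a hypothetical example (not from the paper or his comments) in which the final query text depends on runtime state, so a static analyzer cannot easily tell which tables and columns are actually read:

    # Hypothetical example of dynamically generated SQL: the query string is assembled
    # at runtime, so a static analyzer struggles to enumerate the store locations involved.
    def build_report_query(filters, sort_field=None):
        clauses, params = ["SELECT * FROM orders"], []
        if filters:
            conditions = []
            for column, value in filters.items():
                conditions.append(column + " = ?")    # column names chosen at runtime
                params.append(value)
            clauses.append("WHERE " + " AND ".join(conditions))
        if sort_field:
            clauses.append("ORDER BY " + sort_field)  # sort column not known statically
        return " ".join(clauses), params

    print(build_report_query({"status": "open", "region": "EU"}, sort_field="created_at"))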

Read more on Facebook’s blog here.

 
