Talk by Joseph Greathouse (hosted by Andy Warfield)

Date

Title: Sampling Dynamic Dataflow Analyses

Abstract:

Dynamic dataflow analyses, such as taint checking, cause large runtime slowdowns. While they are useful for finding errors, their overheads stymie adoption and limit the number of tests run on any particular program. This work describes ways of sampling these analyses: that is, letting each user perform the analysis on only a small portion of the dynamic dataflows in the program. If users control the proportion of dataflows they observe at runtime, they can control the slowdowns they suffer and are therefore more likely to run the analyses. An individual user may miss many program errors, but across a large population, the analyses performed by some users will detect the problems. These users can then report potential software errors to developers, making the software more robust for the entire population.
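The core idea can be sketched in a few lines. This is an illustrative toy, not the talk's actual mechanism: the class and method names are invented, and a real system would make the sampling decision in hardware or a runtime, not in application code. The sketch simply shows how deciding once, at a taint source, whether to follow a dataflow lets a user with a low sample rate skip nearly all of the propagation work.

```python
import random

class SampledTaintTracker:
    """Toy sketch of sampled taint tracking (names are assumptions)."""

    def __init__(self, sample_rate):
        # Fraction of new dataflows this user agrees to analyze (0.0 to 1.0).
        self.sample_rate = sample_rate
        self.tainted = set()

    def taint_source(self, value_id):
        # At a taint source (e.g. untrusted input), decide once whether to
        # follow this dataflow; skipping it avoids all later tracking cost.
        if random.random() < self.sample_rate:
            self.tainted.add(value_id)

    def propagate(self, src_id, dst_id):
        # Propagation work is only paid for dataflows chosen at the source.
        if src_id in self.tainted:
            self.tainted.add(dst_id)

    def check_sink(self, value_id):
        # Flag a potential error if a sampled tainted value reaches a sink.
        return value_id in self.tainted
```

A user running with a sample rate of 1.0 sees every error on a tainted path but pays full overhead; a user at 0.01 pays almost nothing and catches errors only occasionally. Aggregated over many users, the population still covers most dataflows.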

Bio:

Joseph Greathouse is a PhD candidate in the Department of Electrical Engineering and Computer Science at the University of Michigan. He researches architectural and software mechanisms that allow developers to dynamically analyze programs and find errors with little runtime overhead. He received his undergraduate degree from the University of Illinois at Urbana-Champaign.