Most Influential POPL Paper Award


Presented annually to the author(s) of a paper presented at the POPL held 10 years prior to the award year. The award includes a prize of $1,000 to be split among the authors of the winning paper. The papers are judged by their influence over the past decade.

Selection Committee

The award given in year N is for the most influential paper presented at the conference held in year N-10. The selection committee consists of the following members:

  • the current SIGPLAN Chair, ex officio,
  • a member of the SIGPLAN EC appointed as committee Chair by the SIGPLAN Chair,
  • the General Chair and Program Chair for POPL N-10,
  • the General Chair and Program Chair for POPL N-1, and
  • a member of the SIGPLAN EC appointed by the committee Chair.

The SIGPLAN Chair shall adjudicate conflicts of interest, appointing substitutes to the committee as necessary.

Recipients

2024 Ramana Kumar, Magnus Myreen, Michael Norrish, Scott Owens

(for 2014): CakeML: A Verified Implementation of ML

Citation:

This paper provided exciting evidence that the idea of machine-checked verified compilation can generalize beyond CompCert-style optimization verification to end-to-end correctness statements in the context of ML. A particularly novel feature of CakeML, distinguishing it from contemporary efforts like CompCert, was its support for bootstrapping, which facilitates verification of the compiler implementation along with the runtime system. Because all of this functionality is fully expressed and verified within HOL4, the correctness theorem naturally extends to the machine-code implementation generated by bootstrapping, essentially for free. The paper has spurred a long line of productive follow-on work that has broadened our understanding of verified compilation, particularly through the new techniques and methods it spearheaded.

2023 Thomas Dinsdale-Young, Lars Birkedal, Philippa Gardner, Matthew Parkinson, Hongseok Yang

(for 2013): Views: Compositional reasoning for concurrent programs

Citation:

Concurrency remains one of the most challenging aspects of modern-day programming, and verification techniques for ensuring the correctness of concurrent programs are of utmost importance. This paper makes significant theoretical advances in reasoning about concurrency by presenting an elegant meta-theory of concurrent reasoning principles. At the heart of the proposed theory lies a compositional abstraction called “views”: abstract knowledge about the machine state together with a thread’s rights to change it. The paper shows that the proposed Concurrent Views Framework unifies most prior attempts at reasoning about concurrency, including both type systems and a variety of program logics. Owing to its foundational and unifying nature, the paper has had significant influence on work on the topic since its appearance.
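
For instance, the familiar rule for composing the proofs of two threads is validated once and for all at the level of views, for any instantiation of the framework (here the ∗ is the composition operation on views):

    \frac{\{p_1\}\; C_1\; \{q_1\} \qquad \{p_2\}\; C_2\; \{q_2\}}{\{p_1 \ast p_2\}\; C_1 \parallel C_2\; \{q_1 \ast q_2\}}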

2022 Thomas H. Austin and Cormac Flanagan

(for 2012): Multiple facets for dynamic information flow

Citation:

The paper proposed to track information flow dynamically through JavaScript programs using a run-time monitor, at a time when folklore that dynamic monitors were unsound had led others to focus on static techniques. Its key new idea was to blend multiple executions into a single one via “faceted values.” This idea generalized the prior whole-program technique of “secure multi-execution” (SME), and inspired follow-on work in JavaScript security, information flow analysis in other languages (e.g., Python, Haskell, Java), and other problem domains, notably “variability-aware execution” and “dynamic determinacy analysis.” It even influenced follow-on improvements to SME itself. The variety of authors in the most-cited citing papers, and the variety of their areas (most top venues in PL, SE, and Security are represented), speak to its influence. The paper is exemplary in seeding a simple core idea that has proliferated throughout its core area and has also influenced solutions to problems outside this area.
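
To make the core construct concrete, a faceted value pairs a private and a public facet behind a principal k, written roughly as in the paper:

    \langle k \;?\; v_{H} : v_{L} \rangle

Observers authorized to see k’s private data observe the private facet v_H, while all other observers see the public facet v_L; evaluation proceeds on both facets within a single run, which is how one execution simulates the multiple runs of SME.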

2021 Sumit Gulwani

(for 2011): Automating string processing in spreadsheets using input-output examples

2020 Saurabh Srivastava, Sumit Gulwani, Jeffrey Foster

(for 2010): From program verification to program synthesis

Citation:

The paper greatly advanced our ability to synthesize programs from logical specifications. It was based on the insight that much of the work carried out by a program verifier could be repurposed not just for checking that code matches a specification, but also for synthesizing code that does. The user specifies the input-output behavior as a logical formula, and also provides structural and resource constraints, which describe a template language for the space of possible programs. In later work, such domain-specific languages (DSLs) became the norm for restricting the search space to make synthesis tractable. The verifier then scaffolds the synthesized code onto this structure, adhering to the prescribed resources. The authors were able to synthesize a range of clever algorithms, which served as inspiration for some of the massive effort on verification-based program synthesis over the past decade.
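
One way to read the approach (the notation below is schematic, not the paper’s): where a verifier discharges a condition stating that, for every input, the precondition implies the postcondition of a given program P, the synthesizer instead solves

    \exists P \in \mathcal{T}.\;\; \forall x.\; \mathrm{pre}(x) \Rightarrow \mathrm{post}(x,\, P(x))

where the space of candidate programs (written \mathcal{T} above) is carved out by the user-supplied structural and resource constraints, so that the same verification machinery searches for the program rather than merely checking it.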

2019 Cristiano Calcagno, Dino Distefano, Peter W. O'Hearn, Hongseok Yang

(for 2009): Compositional shape analysis by means of bi-abduction

Citation:

Shape analysis is the problem of precisely predicting, at compile time, the contents of mutable, pointer-based data structures. It holds the key to identifying, and hence eliminating, a wide variety of software defects, from null-dereference errors to use-after-free bugs, memory leaks, and the subtle data races that complicate concurrency.

This paper showed how to use compositionality to scale shape analysis up to large code bases, by introducing bi-abduction: a new technique for analyzing each procedure in isolation, independently of its callers.

Bi-abduction allows an analysis to determine separation logic preconditions under which a procedure executes safely, together with corresponding postconditions that describe how the procedure mutates the heap when executed from a configuration satisfying the precondition. Separation logic can then be used to compose the pre- and postconditions of different functions, yielding the first precise shape analyses capable of running on millions of lines of code.
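
Concretely, at a call site the analysis solves an entailment with two unknowns, the anti-frame ?A and the frame ?F (the notation here follows standard separation-logic presentations rather than the paper verbatim):

    H \ast\, ?A \;\vdash\; P \ast\, ?F

where H is the symbolic heap at the call and P is the callee’s precondition; the abduced anti-frame ?A records the resources the caller is missing (and flows back into the caller’s inferred precondition), while the frame ?F records the resources left untouched by the call.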

This work directly led to the development of the Infer static analyzer, which has been used to find and fix tens of thousands of defects across multiple industrial code bases. Further, it rekindled interest in the notion of abduction, leading to new techniques for interactive exploration of static-analysis results and for compositional program synthesis.

2018 Kohei Honda, Nobuko Yoshida, Marco Carbone

(for 2008): Multiparty asynchronous session types

Citation:

Session types are a type-based framework for codifying communication structures and verifying protocols in concurrent, message-passing programs. Previously, session types could only model binary (two-party) protocols. This paper generalizes the theory to the multiparty case with asynchronous communications, preventing deadlock and communication errors in more sophisticated communication protocols involving any number (two or more) of participants. The central idea was to introduce global types, which describe multiparty conversations from a global perspective and provide a means to check protocol compliance. This work has inspired numerous authors to build on its pioneering foundations in the session types community and has initiated many applications of multiparty session types in programming languages and tools. It has also influenced other areas of research, such as software contracts, runtime verification and hardware specifications.
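
To give a flavor of the central construct: a global type describes, from a bird’s-eye view, who sends what to whom and in what order. Schematically (participant names are illustrative, and the syntax is simplified from the paper’s, which also tracks channels):

    G \;=\; \mathsf{Buyer} \to \mathsf{Seller} : \langle \mathsf{string} \rangle.\;\; \mathsf{Seller} \to \mathsf{Buyer} : \langle \mathsf{int} \rangle.\;\; \mathsf{end}

Each participant’s local type is obtained by projecting G onto that role, and checking every process against its projection is what guarantees compliance with the protocol.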

2017 Dachuan Yu, Ajay Chander, Nayeem Islam, Igor Serikov

(for 2007): JavaScript Instrumentation for Browser Security

Citation:

The selected paper presents one of the earliest models of the interaction between the browser and JavaScript. It uses this model to work out the formalization and dynamic enforcement of rich security policies. Since then, people have routinely discovered additional, pernicious security problems based on this model. Eliminating these problems remains an important challenge to this day.

Looking back, the selected paper made a prescient, and influential, contribution to understanding these JavaScript-based security problems. The authors chose a formal, semantic approach to model these problems and potential solutions, while remaining true to the complicated characteristics that make both JavaScript and the browser real-world artifacts.

2016 Xavier Leroy

(for 2006) Formal certification of a compiler back-end or: programming a compiler with a proof assistant

Citation:

This paper presents the CompCert compiler, a landmark effort in program verification. The paper was (and still is) groundbreaking in that it demonstrates the feasibility of using an interactive theorem prover – specifically, Coq – to both program and formally verify a realistic compiler – specifically, the backend of an optimizing compiler for a significant subset of C. The verification ensures that properties proved about C source code continue to hold of the assembly output of the compiler, thereby filling a major gap in the development of high-assurance software and facilitating a great deal of follow-on work on verified software toolchains. Moreover, CompCert made a convincing case that theorem-proving technology is mature enough to be applied to the full functional verification of realistic systems, and in so doing heralded a new age of “big verification”.
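
In simplified form (the paper’s precise statement is phrased in terms of observable behaviors and excludes source programs that “go wrong”; the names below are schematic), the semantic-preservation guarantee reads:

    \mathrm{Compile}(S) = \mathsf{OK}(C) \;\wedge\; S \Downarrow B \;\wedge\; B \notin \mathrm{Wrong} \;\Longrightarrow\; C \Downarrow B

that is, whenever compilation succeeds, any non-erroneous behavior B established for the source S is also a behavior of the compiled code C, so properties proved of S transfer to C.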

2015 Nate Foster, Michael B. Greenwald, Jonathan T. Moore, Benjamin C. Pierce, Alan Schmitt

(for 2005) Combinators for Bidirectional Tree Transformations: A Linguistic Approach to the View Update Problem

Citation:

The “view-update problem” is a classic problem in databases: when operating on a partial view of a data structure, how should updates to the view be propagated to the original data structure? This POPL 2005 paper was instrumental in bringing the view-update problem to the attention of the programming languages community and demonstrating the broad relevance of the problem beyond databases. The immediate contributions of the paper were (1) to introduce a general mathematical space of well-behaved bidirectional transformations called “lenses,” and (2) to develop a specific instantiation of this framework in the form of a domain-specific language of combinators for tree transformations, which served as the basis for subsequent tools for editing XML and HTML. More broadly, the paper sparked a great deal of follow-on work in the area of BX (“bidirectional transformations”), leading to a fruitful collaboration between the worlds of databases, programming languages, and software engineering.
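
Concretely, a lens between a concrete (source) structure and an abstract view consists of a get component and a put component, and “well-behaved” means that the round-trip laws hold (written here in generic get/put notation rather than the paper’s arrow notation, and eliding definedness conditions):

    \mathit{get}\,(\mathit{put}\,(v, s)) = v \qquad\qquad \mathit{put}\,(\mathit{get}\,(s),\, s) = s

The first law (PutGet) says that a view written with put is read back unchanged by get; the second (GetPut) says that writing back an unmodified view leaves the source unchanged.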

2014 Thomas Henzinger, Ranjit Jhala, Rupak Majumdar, Kenneth McMillan

(for 2004) Abstractions from proofs

Citation:

Interpolation can be used to construct an inductive invariant for a transition system that is strong enough to prove a given property. Prior to Henzinger et al.’s paper, interpolation had only been applied to model checking of finite-state systems. In this paper, the authors demonstrated a fundamental generalization of Craig interpolation to program analysis by predicate abstraction, opening the door for interpolation to be applied to abstraction refinement of infinite-state systems. This work showed how interpolation offers a principled way to explain abstraction refinement in a logical framework, and it has led to many extensions that increase the power of abstraction in program analysis.
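
For reference, a Craig interpolant for an unsatisfiable conjunction A ∧ B is a formula I such that

    A \Rightarrow I, \qquad I \wedge B \ \text{is unsatisfiable}, \qquad I \ \text{mentions only symbols common to } A \text{ and } B.

In the refinement loop, A encodes a prefix of an infeasible counterexample path and B the corresponding suffix, so the interpolant computed at each cut point yields predicates, local to that program point, sufficient to rule the spurious counterexample out.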

2013 David F. Bacon, Perry Cheng, VT Rajan

(for 2003) A real-time garbage collector with low overhead and consistent utilization

Citation:

Languages with automatic memory management offer well-known advantages in terms of software engineering, but the use of garbage collection seemed for a long time to place such languages off-limits for hard real-time and embedded systems. Indeed, prior to this highly influential paper, the existing approaches to real-time garbage collection suffered from a number of practical problems: a lack of hard bounds on pause times in the face of fragmentation or large data structures; high space overhead due to the use of copying collection to avoid fragmentation; and uneven mutator utilization due to the use of work-based scheduling algorithms. Bacon, Cheng & Rajan’s paper showed how to overcome all these problems in the design of a genuinely real-time garbage collector for Java. The collector, which was implemented and tested in the Jikes RVM, employed an impressive variety of techniques, such as time-based scheduling and “mostly non-moving” collection, in order to provide guaranteed pause times, consistent mutator utilization, and low space overhead. In short, the paper marked a major advance toward the important goal of turning hard real-time garbage collection into a reality.

2012 George C. Necula, Scott McPeak, and Westley Weimer

(for 2002) CCured: Type-Safe Retrofitting of Legacy Code

Citation:

This paper showed a path to making legacy C code safer by adding type-safety guarantees to existing C programs. It uses a combination of static and dynamic techniques to let programmers perform low-level operations while guaranteeing safety properties for pointers and keeping overhead low. For many C programs, CCured’s type inference can establish statically that most or all of the pointers are used type-safely. The paper demonstrates the value of combining dynamic and static types, influencing the design of languages such as C# and inspiring continuing research on static and dynamic analysis.

2011 Samin Ishtiaq and Peter W. O'Hearn

(for 2001) BI as an Assertion Language for Mutable Data Structures

Citation:

In applying the logic of bunched implications (BI) to mutable data structures, this paper laid the groundwork for the flowering of separation logic (heap models of separation logic are specific BI models), which allows reasoning about programs with dynamically allocated (and deallocated) memory. The paper introduced the ‘frame rule’, which supports local reasoning in separation logic: specifications and proofs concentrate on just those memory cells manipulated by a program component rather than on the entire global program state. Separation logic underlies continuing research on local reasoning, with diverse applications in domains such as program verification and automated parallelization.
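
The frame rule itself is compact:

    \frac{\{P\}\; C\; \{Q\}}{\{P \ast R\}\; C\; \{Q \ast R\}} \qquad \text{(provided } C \text{ modifies no variable free in } R\text{)}

Because the separating conjunction ∗ guarantees that the frame R describes memory disjoint from that used by C, a proof about C need only mention C’s own footprint and still remains valid in any larger heap.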

2010 Luca Cardelli and Andrew D. Gordon

(for 2000) Anytime, Anywhere: Modal Logics for Mobile Ambients

Citation:

“Anytime, Anywhere: Modal Logics for Mobile Ambients” by Luca Cardelli and Andrew D. Gordon helped spur a flowering of work in the area of process calculi that continues today. The paper focused on modal logics for reasoning about both temporal and spatial modalities of ambient behaviours, demonstrating techniques that also apply to other process calculi (even those without an explicit notion of location), thereby contributing to excitement in an area that was growing at the time and continues to grow. The work has led to applications of concurrency theory in fields as diverse as security, safety-critical applications, query languages for semistructured data, and systems biology.

2009 Andrew C. Myers

(for 1999) JFlow: Practical Mostly-Static Information Flow Control

Citation:

“JFlow: Practical Mostly-Static Information Flow Control” by Andrew C. Myers demonstrated the practicality of using static information flow analysis to protect privacy and preserve integrity by giving an efficient information flow type checker for an extension of the widely-used Java language. The work has had a significant impact both within and beyond the programming language community. In particular, subsequent work for other languages has largely followed the path laid out in this paper, and the compiler infrastructure developed for JFlow (now called Jif) is widely used as a research platform. Furthermore, using the JFlow work as a basis, several major research initiatives are investigating the challenges of building complex, real-world systems with confidentiality guarantees.

2008 Greg Morrisett, David Walker, Karl Crary, and Neal Glew

(for 1998) From System F to Typed Assembly Language

Citation:

“From System F to Typed Assembly Language” by Greg Morrisett, David Walker, Karl Crary, and Neal Glew began a major development in the application of type system ideas to low level programming. The paper shows how to compile a high-level, statically typed language into TAL, a typed assembly language defined by the authors. The type system for the assembly language ensures that source-level abstractions like closures and polymorphic functions are enforced at the machine-code level while permitting aggressive, low-level optimizations such as register allocation and instruction scheduling. This infrastructure provides the basis for ensuring the safety of untrusted low-level code artifacts, regardless of their source. A large body of subsequent work has drawn on the ideas in this paper, including work on proof-carrying code and certifying compilers.

2007 George Necula

(for 1997) Proof-Carrying Code

2006 Bjarne Steensgaard

(for 1996) Points-to Analysis in Almost Linear Time

2005 Luca Cardelli

(for 1995) A Language with Distributed Scope

2004 Mads Tofte and Jean-Pierre Talpin

(for 1994) Implementation of the Typed Call-by-Value lambda-calculus using a Stack of Regions

2003 Simon Peyton Jones and Philip Wadler

(for 1993) Imperative functional programming