USA - 2020
For fundamental algorithms and theory underlying programming language implementation and for synthesizing these results and those of others in their highly influential books, which educated generations of computer scientists.
Computer software powers almost every piece of technology we interact with. Virtually every program running our world - from those on our phones or in our cars to programs running on giant server farms inside big web companies - is written by humans in a higher-level programming language and then compiled into lower-level code for execution. Under the hood, much of the technology for performing this translation for modern programming languages owes its beginnings to this year's Turing Award winners.
Alfred V. Aho is the Lawrence Gussman Professor Emeritus of Computer Science at Columbia University and Jeffrey D. Ullman is the Stanford W. Ascherman Professor Emeritus of Computer Science at Stanford University. Aho and Ullman began their productive and creative collaboration in 1967 at Bell Labs, which they both joined after finishing their PhDs at Princeton University. While at Bell Labs, they developed efficient algorithms for analyzing and translating programming languages, and they continued their close collaboration after Ullman moved to academia in 1969. Their joint and individual research contributions over the span of several decades have shaped the foundations of programming language theory and implementation, as well as algorithm design and analysis.
Aho and Ullman made broad and fundamental contributions to the field of programming language compilers through their technical work and influential textbooks. They contributed to formal language theory and invented efficient algorithms for lexical analysis, syntax analysis, code generation, and code optimization. For example, in an early paper, Aho and Ullman showed how to make Knuth's LR(k) parsing algorithm work with simple grammars that technically did not meet the requirements of an LR(k) grammar; this idea led to the YACC parser generator, which for many years was the preferred method for quickly building a parser. They developed efficient algorithms for data-flow analysis that exploited the structure of "gotoless" programs, which were at the time just becoming the norm. As the field of relational databases was opening up, their paper (with Catriel Beeri) on "lossless joins" explained what the normal forms defined by Codd, Fagin, and others were attempting to achieve in the design of databases, and it gave a methodology for determining when a relation can be broken into smaller components without losing information.
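The lossless-join property mentioned above can be seen in a small example. The sketch below (a hypothetical illustration, not code from any of their papers; the relation, schema, and helper functions are invented for this purpose) decomposes a relation R(A, B, C) satisfying the functional dependency A → B into projections on (A, B) and (A, C), then shows that their natural join recovers R exactly:

```python
# Relations are modeled as sets of tuples; schemas as tuples of attribute names.

def project(rel, schema, attrs):
    """Project a relation onto a subset of its attributes."""
    idx = [schema.index(a) for a in attrs]
    return {tuple(row[i] for i in idx) for row in rel}

def natural_join(r1, s1, r2, s2):
    """Natural join of two relations on their shared attribute names."""
    common = [a for a in s1 if a in s2]
    out = set()
    for t1 in r1:
        for t2 in r2:
            if all(t1[s1.index(a)] == t2[s2.index(a)] for a in common):
                extra = tuple(t2[s2.index(a)] for a in s2 if a not in s1)
                out.add(tuple(t1) + extra)
    return out

# R(A, B, C) with the functional dependency A -> B: the decomposition
# into R1(A, B) and R2(A, C) is lossless.
schema = ("A", "B", "C")
R = {(1, "x", 10), (1, "x", 20), (2, "y", 10)}

R1 = project(R, schema, ("A", "B"))
R2 = project(R, schema, ("A", "C"))
joined = natural_join(R1, ("A", "B"), R2, ("A", "C"))
assert joined == R  # joining the projections introduces no spurious tuples
```

Had the dependency not held (say, decomposing on (A, B) and (B, C) without B determining anything), the join could contain spurious tuples not in R, which is exactly the failure mode their methodology detects.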
Their research, along with that of others, formed the basis of their two-volume series on parsing and compiling, which led to their definitive book on compiler technology, Principles of Compiler Design, published in 1977. This "Dragon Book" (nicknamed after its colorful cover) and its subsequent editions with Ravi Sethi, and later with Sethi and Monica Lam, became the bibles of compiler design. These books lucidly lay out the phases in translating a high-level programming language to machine code, modularizing the entire enterprise of compiler construction. Their fundamental algorithms have been incorporated into commercial compilers and popular compiler construction tools, such as Lex and YACC, which are routinely used by millions worldwide to automate building components of programming language compilers.
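The modular phases the Dragon Book lays out can be sketched in miniature. The toy compiler below (an invented illustration in the book's spirit, not an excerpt from it) runs a lexical-analysis phase over an arithmetic expression, then a syntax-analysis phase that emits code for a small stack machine:

```python
import re

# Phase 1 - lexical analysis: break the source string into tokens.
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    tokens = []
    for num, op in TOKEN_RE.findall(src):
        tokens.append(("NUM", int(num)) if num else ("OP", op))
    tokens.append(("EOF", None))
    return tokens

# Phase 2 - syntax analysis and code generation: a recursive-descent
# parser for the grammar
#   expr -> term (('+' | '-') term)*
#   term -> NUM
# that emits stack-machine instructions as it recognizes the input.
def compile_expr(src):
    tokens, code = tokenize(src), []
    pos = 0

    def term():
        nonlocal pos
        kind, val = tokens[pos]
        assert kind == "NUM", "expected a number"
        code.append(("PUSH", val))
        pos += 1

    term()
    while tokens[pos] in (("OP", "+"), ("OP", "-")):
        op = tokens[pos][1]
        pos += 1
        term()
        code.append(("ADD",) if op == "+" else ("SUB",))
    return code

# A trivial back end: execute the generated stack-machine code.
def run(code):
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if instr[0] == "ADD" else a - b)
    return stack[0]

assert run(compile_expr("10 + 2 - 3")) == 9
```

A production compiler adds further phases (semantic analysis, optimization, register allocation), but the pipeline shape - source to tokens to syntax-directed translation to target code - is the one the Dragon Book made standard.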
The impact of their work since the 1970s is so profound and so pervasive that it is easily taken for granted. Anyone who has programmed in C or any of its derivatives; anyone who has used Unix tools, such as AWK, fgrep, Lex, and YACC; anyone who has taken a compiler course; or anyone who has built a compiler by just reading the Dragon Book has been touched by the work of Aho and Ullman.
Their early joint work in algorithm design and analysis techniques contributed crucial approaches to the theoretical core of computer science that emerged in this period. In its focus on basic methods for processing graphs, strings, and sequences, this algorithmic work was tightly integrated with their research on programming languages as well. Their foundational book with John Hopcroft, The Design and Analysis of Computer Algorithms, published in 1974, created the conceptual framework not just for teaching algorithms in the standard computer science curriculum for decades, but also for presenting and analyzing new algorithms developed by the research community. In addition to incorporating their own results, the book codified a range of disparate algorithms into a set of general design methods, including divide-and-conquer, recursion, dynamic programming, and others that have long since entered the standard toolbox of computer scientists.
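As a small instance of one method the book codified, the sketch below computes edit distance between two strings by dynamic programming, filling a table of subproblem results instead of recomputing them (a standard textbook formulation, shown here for illustration rather than taken from the book itself):

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b."""
    m, n = len(a), len(b)
    # d[i][j] = edits needed to turn a[:i] into b[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # delete a[i-1]
                          d[i][j - 1] + 1,        # insert b[j-1]
                          d[i - 1][j - 1] + cost)  # match or substitute
    return d[m][n]

assert edit_distance("kitten", "sitting") == 3
```

Each table entry is computed once from three previously solved subproblems, giving O(mn) time where naive recursion would be exponential - the essence of the dynamic-programming method.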
The Dragon Book and their foundational algorithms book, along with their subsequent books on data structures and algorithms, and on theoretical foundations of computer science, sit on the shelves of all researchers and practitioners of computing to this day.
For fundamental algorithms and theory underlying programming language implementation, and for synthesizing those results and other state-of-the-art results in their highly influential books, which educated generations of computer scientists, Aho and Ullman have been selected to receive the recognition conferred by the ACM A.M. Turing Award.