USA - 2010
For transformative contributions to the theory of computation, including the theory of probably approximately correct (PAC) learning, the complexity of enumeration and of algebraic computation, and the theory of parallel and distributed computing.
Over the past 30 years, Leslie G. Valiant has made fundamental contributions to many aspects of theoretical computer science. His work has opened new frontiers, introduced ingenious new concepts, and presented results of great originality, depth, and beauty. Time and again, Valiant's work has defined or transformed the computer science research landscape.
Valiant's greatest single contribution may be his 1984 paper "A Theory of the Learnable," which laid the foundations of computational learning theory. He introduced a general framework as well as concrete computational models for studying the learning process, including the famous "probably approximately correct" (PAC) model of machine learning. Computational learning theory has since developed into a vibrant research area and has had enormous influence on machine learning, artificial intelligence, and many areas of computing practice, such as natural language processing, handwriting recognition, and computer vision.
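The flavor of the PAC model can be conveyed by a textbook example (not from the citation itself): learning an unknown interval on the real line from labeled samples. The learner outputs the tightest interval containing the positive examples; with enough samples, its error is at most ε with probability at least 1 − δ. The target interval and sample sizes below are illustrative choices, not anything specified by Valiant's paper.

```python
import random

def learn_interval(samples):
    """Return the tightest interval [lo, hi] containing all positive examples.

    This is the standard PAC-learning illustration for the concept class of
    intervals: the hypothesis is consistent with the sample, and the PAC
    guarantee bounds its error on fresh data drawn from the same distribution.
    """
    positives = [x for x, label in samples if label]
    if not positives:
        return None  # hypothesis that rejects everything
    return (min(positives), max(positives))

def classify(hypothesis, x):
    if hypothesis is None:
        return False
    lo, hi = hypothesis
    return lo <= x <= hi

# Hypothetical target concept [0.3, 0.7]; examples drawn uniformly from [0, 1].
target = lambda x: 0.3 <= x <= 0.7
random.seed(0)
samples = [(x, target(x)) for x in (random.random() for _ in range(500))]
h = learn_interval(samples)

# Estimate the generalization error on fresh data; the PAC framework says
# this is small with high probability once the sample is large enough.
test_points = [random.random() for _ in range(10000)]
error = sum(classify(h, x) != target(x) for x in test_points) / len(test_points)
```

Because the tightest consistent interval can only err on the two thin strips between its endpoints and the target's, a sample of a few hundred points already drives the error well below 1%.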
Valiant has made many seminal contributions to computational complexity. He introduced the notion of the complexity of enumeration, captured by the complexity class #P. The most surprising consequence of this study was that natural enumeration problems can be intractable even when the corresponding decision problems are tractable: counting the perfect matchings of a bipartite graph (equivalently, computing the permanent of a 0-1 matrix) is #P-complete, even though deciding whether a perfect matching exists takes only polynomial time. Another fundamental contribution to computational complexity was Valiant's theory of algebraic computation, in which he established a framework for understanding which algebraic formulas can be evaluated efficiently.
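The matching example can be made concrete in a few lines. Below is an illustrative sketch (not code from any of Valiant's papers): counting perfect matchings amounts to computing the permanent of the biadjacency matrix, for which brute force over all permutations is essentially the best we expect given #P-completeness, while the existence question yields to a simple polynomial-time augmenting-path algorithm.

```python
from itertools import permutations

def count_perfect_matchings(adj):
    """Number of perfect matchings = permanent of the 0-1 biadjacency matrix.

    Brute force over permutations; the problem is #P-complete (Valiant, 1979),
    so no polynomial-time algorithm is expected.
    """
    n = len(adj)
    return sum(all(adj[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def has_perfect_matching(adj):
    """Decide existence of a perfect matching in polynomial time
    via augmenting paths (Hungarian-style search)."""
    n = len(adj)
    match = [-1] * n  # match[j] = left vertex currently matched to right vertex j

    def augment(u, seen):
        for v in range(n):
            if adj[u][v] and v not in seen:
                seen.add(v)
                if match[v] == -1 or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return all(augment(u, set()) for u in range(n))
```

For example, the bipartite graph with biadjacency matrix `[[1,1,0],[0,1,1],[1,0,1]]` has exactly two perfect matchings; the decision procedure confirms one exists without enumerating them.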
A third broad area in which Valiant has made important contributions is the theory of parallel and distributed computing. His design of randomized routing strategies laid the groundwork for a rich body of research that exposed how randomization can be used to offset congestion effects in communication networks. He proposed the bulk synchronous parallel (BSP) model of computation. He also posed a number of influential challenges leading to the construction of parallel algorithms for seemingly inherently sequential problems. Finally, the superconcentrators constructed by Valiant in the context of computational complexity established the fundamental role of expander graphs in computation.
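The core idea of Valiant's randomized routing can be sketched briefly. On a hypercube network, deterministic bit-fixing routing suffers badly on adversarial permutations; Valiant's two-phase scheme first sends each packet to a uniformly random intermediate node and only then to its true destination, which breaks up such worst cases with high probability. The code below is a simplified illustration of the path construction, not a full congestion simulation.

```python
import random

def bit_fixing_path(src, dst, dim):
    """Path in a dim-dimensional hypercube that fixes differing
    address bits from the most significant bit down."""
    path, cur = [src], src
    for i in reversed(range(dim)):
        if (cur ^ dst) >> i & 1:
            cur ^= 1 << i
            path.append(cur)
    return path

def two_phase_route(src, dst, dim, rng):
    """Valiant's trick: route to a random intermediate node first, then to
    the real destination. Each phase is oblivious bit-fixing, but the random
    detour spreads traffic so no permutation concentrates load on few links."""
    mid = rng.randrange(1 << dim)
    return bit_fixing_path(src, mid, dim) + bit_fixing_path(mid, dst, dim)[1:]
```

Each hop in the returned path flips exactly one address bit, so the route is a valid walk along hypercube edges of length at most twice the dimension.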