Abstract: A central question in lower bound theory is whether a compound (computational) problem that encodes N independent instances of a simple problem is N times as hard as the simple problem. A statement of this kind, when true, is called a direct sum theorem. Direct sum theorems are not always easy to prove, and sometimes, contrary to intuition, they are false!
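For concreteness, one common way to formalize the question (the notation here is illustrative, not taken from the abstract) is the following: writing $f^N$ for the problem of solving $N$ independent instances of a problem $f$, and $C(\cdot)$ for a cost measure such as communication complexity, a direct sum theorem asserts that
\[
  C\bigl(f^N\bigr) \;=\; \Omega\bigl(N \cdot C(f)\bigr).
\]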
In this talk, I shall describe a unified framework that I call the "information complexity paradigm" for proving various direct sum theorems. Central to the paradigm is a way of quantifying the work of an algorithm (typically, a communication protocol) using the information-theoretic notions of entropy and mutual information, and then exploiting mathematical properties of these quantities to analyze the algorithm.
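As an illustrative sketch (one standard formalization; the talk's definitions may differ in detail): for a two-party protocol $\Pi$ run on inputs $(X, Y)$ drawn from a distribution $\mu$, one may charge the protocol its external information cost $I(XY ; \Pi)$, the amount the transcript reveals about the inputs to an outside observer, or its internal information cost
\[
  \mathrm{IC}_\mu(\Pi) \;=\; I(X ; \Pi \mid Y) \;+\; I(Y ; \Pi \mid X),
\]
the amount each party learns about the other's input. A protocol must communicate at least as many bits as it reveals, so lower bounds on these quantities yield communication lower bounds, and information cost tends to add up across independent instances, which is what drives direct sum arguments.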
I shall survey the large number of results that have made use of the paradigm since its formal introduction at FOCS 2001. I shall also briefly discuss the applications of these results, which range widely, touching upon high-dimensional nearest neighbor searching, algorithms for massive data streams, randomized decision trees, and Boolean circuits.
This talk will be accessible to graduate students.