The lectures will be given in Cortona, Italy, July 18-29, 2005. Notes will be posted.
Summary. In computer science, there are some important problems for which randomized algorithms are simpler and faster than the best known deterministic ones. Similarly, in combinatorics, there are some objects (e.g., expander graphs, Ramsey graphs, error-correcting codes, and many others) whose existence is easy to prove using the probabilistic method but for which we only have difficult and sub-optimal explicit constructions.
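For concreteness, here is a sketch of the classical probabilistic argument (due to Erdős) for the existence of Ramsey graphs: color each edge of the complete graph on $n$ vertices red or blue independently with probability $1/2$; the expected number of monochromatic $k$-vertex sets is

\[
  \binom{n}{k}\, 2^{1-\binom{k}{2}}
  \;\le\; \frac{n^k}{k!}\, 2^{1-\binom{k}{2}}
  \;\le\; \frac{2^{1+k/2}}{k!}
  \;<\; 1
  \qquad\text{for } n \le 2^{k/2},\ k \ge 3,
\]

so some coloring has no monochromatic $k$-set, i.e. $R(k,k) > 2^{k/2}$; by contrast, no explicit construction achieving a bound of this quality is known.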
Perhaps surprisingly, work over the past 20+ years on "hardness versus randomness" suggests that every probabilistic algorithm can be simulated deterministically with comparable efficiency. In particular, this is true under certain plausible complexity-theoretic assumptions, such as the assumption that random integers are hard to factor (though much weaker assumptions suffice). Under the same assumptions, every object whose existence can be proved with the probabilistic method has an efficient construction (provided that one can efficiently verify that a candidate object has the desired property).
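To make the shape of such a simulation concrete: given a sufficiently strong pseudorandom generator, a probabilistic algorithm can be derandomized by enumerating all short seeds and taking a majority vote over the resulting runs. The Python sketch below is only an illustration of this reduction; `prg` and `randomized_alg` are hypothetical placeholders, not any specific construction from the course.

from itertools import product

def derandomized(randomized_alg, x, prg, seed_len):
    """Deterministic simulation of a randomized algorithm via a
    (hypothetical) pseudorandom generator `prg`.

    `randomized_alg(x, bits)` runs the algorithm with `bits` in place of
    its coin flips; `prg(seed)` stretches a short seed into enough
    pseudorandom bits.  Enumerating all 2**seed_len seeds is feasible
    when seed_len = O(log n), and a majority vote over all seeds gives
    the right answer whenever the generator fools the algorithm."""
    accept = 0
    for seed in product((0, 1), repeat=seed_len):
        if randomized_alg(x, prg(seed)):
            accept += 1
    return 2 * accept > 2 ** seed_len   # majority vote over all seeds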
These predictions have recently been confirmed by the Agrawal-Kayal-Saxena deterministic polynomial-time algorithm for testing primality, by Reingold's deterministic algorithm for solving connectivity problems in undirected graphs using logarithmic memory, and by progress on combinatorial constructions. While the primality testing algorithm is somewhat independent of the work on "hardness versus randomness," that work is deeply connected to Reingold's algorithm and to some recent combinatorial constructions.
In this two-week course we will begin by reviewing some examples of randomized algorithms (checking polynomial identities, random walks on graphs) and of probabilistic constructions (Ramsey graphs and expander graphs), and we will see a first instantiation of the hardness-versus-randomness approach: the Blum-Micali-Yao construction of pseudorandom generators from "one-way" permutations. We will then see the Nisan-Wigderson construction of a pseudorandom generator under considerably weaker complexity assumptions, and then show how to further weaken those assumptions by proving a connection between the average-case complexity and the worst-case complexity of certain problems. Moving on to combinatorial constructions, we will define randomness extractors, discuss their relation to expander graphs, and give a construction of randomness extractors based on the Nisan-Wigderson pseudorandom generator construction. Time permitting, we will see the zig-zag product approach to the construction of expanders and how it is used back in complexity theory to give constructions of pseudorandom walks on graphs and to do "depth-first search without the stack" in Reingold's algorithm.
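As a small taste of the first topic, here is a minimal Python sketch (not from the lecture notes) of randomized polynomial identity testing based on the Schwartz-Zippel lemma; `p` and `q` are assumed to be black boxes that evaluate the two polynomials at a given point.

import random

def probably_equal(p, q, num_vars, degree_bound, trials=20, prime=2**61 - 1):
    """Randomized polynomial identity test (Schwartz-Zippel).

    `p(point)` and `q(point)` evaluate the two polynomials at a tuple of
    field elements, taken modulo `prime`.  If the polynomials are
    identical the test always accepts; if they differ, a single random
    evaluation catches the difference with probability at least
    1 - degree_bound/prime, so `trials` independent evaluations make the
    error probability negligible."""
    for _ in range(trials):
        point = tuple(random.randrange(prime) for _ in range(num_vars))
        if p(point) % prime != q(point) % prime:
            return False   # found a point where the polynomials disagree
    return True            # no disagreement found: almost certainly equal

# Example: verify the identity (x + y)^2 = x^2 + 2xy + y^2.
lhs = lambda v: (v[0] + v[1]) ** 2
rhs = lambda v: v[0] ** 2 + 2 * v[0] * v[1] + v[1] ** 2
print(probably_equal(lhs, rhs, num_vars=2, degree_bound=2))   # prints True

Note that the test needs only black-box evaluations of the two polynomials, whereas expanding and comparing them symbolically can take exponential time; no deterministic algorithm of comparable generality and efficiency is known.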
Subject to change
Some excellent survey papers: