# COMP 760 (Winter 2014): Introduction to communication and information complexity

Instructor's contact: See my homepage.
Time and Location: Mondays and Wednesdays, 10:05-11:25, in McConnell 103.

What is it about? This course is intended for graduate students in theoretical computer science or mathematics. It is a topics course in communication complexity and information complexity. The setting is that there are two players with unlimited computational power. Each player receives an n-bit input, say x and y respectively; neither knows the other's input, and they wish to collaboratively compute a given function f(x,y) by following an agreed-upon communication protocol. Communication complexity is concerned with minimizing the number of bits communicated by the players for the worst-case choice of inputs x and y. Information complexity is concerned with the amount of information that the communicated bits reveal about the inputs x and y, either to the other player or to an external observer. Communication and information complexity are currently very active fields of research in theoretical computer science, and they employ a surprisingly vast range of techniques and tools from other areas of mathematics.
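
As a toy illustration of the model (a sketch, not course material), consider the trivial protocol in which Alice simply sends her entire input: it computes any f at a cost of n bits, and the subject of the course is when and how one can do much better.

```python
# Toy simulation of the two-party model: Alice holds x, Bob holds y,
# and we count the bits that cross the channel.

def trivial_protocol(x: str, y: str, f):
    """Alice sends all of x bit by bit; Bob, who knows y, outputs f(x, y)."""
    transcript = list(x)                  # every bit of x is communicated
    answer = f("".join(transcript), y)
    return answer, len(transcript)        # (value of f, bits communicated)

# Equality: f(x, y) = 1 iff x == y.
equality = lambda x, y: int(x == y)

ans, cost = trivial_protocol("1011", "1011", equality)
print(ans, cost)  # -> 1 4 : correct answer at a cost of n = 4 bits
```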

What are the prerequisites? The main prerequisites are mathematical maturity and familiarity with basic linear algebra, mathematical analysis, and probability theory.

Evaluation? It will be based on assignments and a final presentation.

### Projects

• Shachar Lovett, Communication is bounded by root of rank; and Thomas Rothvoss, A direct proof for Lovett's bound on the communication complexity of low rank matrices. (Costin)
• A. A. Sherstov. The communication complexity of gap Hamming distance. (Antony)
• Mark Braverman, Ankit Garg. Small value parallel repetition for general games. (Yaqiao)
• Mark Braverman, Ankit Garg, Denis Pankratov, Omri Weinstein, Information lower bounds via self-reducibility. (Lena)
• Mark Braverman, Jon Schneider. Information complexity is computable. (Liana)

### Lectures

• Lecture 1: Deterministic model of two-party communication. Protocols for the equality and parity problems. Formal definition of protocols and their tree representation. Combinatorial rectangles, and their relation to deterministic protocols. Tight lower bounds for Equality. (Reading: 1.1, 1.2, 1.3 of Kushilevitz-Nisan).

• Lecture 2: Lower bounds on communication via matrix rank. Log-rank Conjecture. Rectangle covers, and upper-bounding deterministic communication complexity in terms of the sizes of the smallest covers (Theorem 2.11 of Kushilevitz-Nisan). Introduction to private coin randomized communication complexity. A private coin randomized protocol for Equality. (Reading: 1.4, 2.1, 2.3, 3.1 of Kushilevitz-Nisan).
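
The rank bound from Lecture 2 can be checked numerically (a sketch using numpy, not course code): the communication matrix of Equality on n-bit inputs is the 2^n x 2^n identity, so D(EQ) >= log2(rank) = n.

```python
import numpy as np

n = 4
N = 2 ** n
# Communication matrix of Equality on n-bit inputs: the N x N identity.
M = np.array([[int(x == y) for y in range(N)] for x in range(N)])

rank = np.linalg.matrix_rank(M)
lower_bound = int(np.ceil(np.log2(rank)))   # D(f) >= log2(rank(M_f))
print(rank, lower_bound)  # -> 16 4
```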

• Lecture 3: Public coin vs. private coin. A randomized protocol for Equality using public coin randomness. Newman's theorem (Theorem 3.14 of Kushilevitz-Nisan) for the equivalence of the public coin and the private coin models. (Reading: 3.3 of Kushilevitz-Nisan).
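
The standard public-coin protocol for Equality can be sketched as follows (an illustration in Python; the parameter choices are assumptions, not from the lecture notes). Both players evaluate the inner product of their input with a shared random vector over GF(2); when x != y, the two bits disagree with probability 1/2, so k independent rounds drive the error below 2^(-k) at a cost of k bits.

```python
import random

def public_coin_equality(x: str, y: str, k: int = 20, seed: int = 0) -> bool:
    """Public-coin protocol for Equality; errs with probability <= 2**-k.

    The shared seed models the public random string that both players see.
    Alice sends one bit per round, so the total cost is at most k bits.
    """
    rng = random.Random(seed)     # public coins, visible to both players
    for _ in range(k):
        r = [rng.randrange(2) for _ in x]                      # shared random vector
        alice_bit = sum(int(a) & b for a, b in zip(x, r)) % 2  # <x, r> mod 2
        bob_bit = sum(int(a) & b for a, b in zip(y, r)) % 2    # <y, r> mod 2
        if alice_bit != bob_bit:
            return False    # disagreeing inner products certify x != y
    return True             # equal inputs agree in every round

print(public_coin_equality("10110", "10110"))  # -> True (no error on equal inputs)
print(public_coin_equality("10110", "01001"))  # False, except with probability 2**-20
```

Note the one-sided error: the protocol never rejects equal inputs, and it accepts unequal inputs only if every round misses.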

• Lecture 4: Distributional communication complexity, and Yao's theorem that maximum distributional communication complexity equals the public coin randomized communication complexity. (Reading: 3.4 of Kushilevitz-Nisan).

• Lecture 5: Discrepancy, lower-bounding the randomized CC in terms of discrepancy, randomized CC of the inner product function. (Reading: Section 2 of Toni's note, and/or 3.5 of Kushilevitz-Nisan).
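
One standard route to the inner product lower bound is spectral: for any rectangle A x B, |1_A^T M 1_B| <= ||M|| sqrt(|A||B|), so the discrepancy under the uniform distribution is at most ||M|| / 2^n. A numpy sketch (an illustration, not course code) of this computation for the inner product function:

```python
import numpy as np

n = 4
N = 2 ** n
# Sign matrix of inner product: H[x][y] = (-1)^(<x, y> mod 2).
H = np.array([[(-1) ** (bin(x & y).count("1") % 2) for y in range(N)]
              for x in range(N)])

spectral_norm = np.linalg.norm(H, 2)   # ||H|| = 2^(n/2), since H H^T = N * I
disc_bound = spectral_norm / N         # disc_uniform(IP) <= ||H|| / 2^n
print(round(spectral_norm, 6), round(disc_bound, 6))  # -> 4.0 0.25
```

The resulting 2^(-n/2) bound on discrepancy yields an Omega(n) lower bound on the randomized CC of inner product via the discrepancy method.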

• Lecture 6: Limitations of the discrepancy method, limitations of the fooling set technique, approximate rank and sign rank, Krause's theorem (randomized CC is at least log of the approximate rank), unbounded error model.

• Lecture 7: Paturi-Simon theorem (unbounded randomized CC equals log of the sign rank). Nondeterminism; communication complexity classes: P, NP, coNP, PSPACE, the polynomial hierarchy, BPP, UPP, PP.

• Lecture 8: Reductions and completeness, DISJ is coNP-complete, matrix norms, Forster's theorem.

• Lecture 9: Fourier analysis of Boolean functions, degree, approximate degree, sign-degree, threshold weight.
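
The Fourier quantities from Lecture 9 are easy to compute by brute force for small n. Here is a minimal sketch (an illustration, not course code) that computes all Fourier coefficients of a Boolean function and reads off its degree, using majority on 3 bits as the example:

```python
import itertools

n = 3
points = list(itertools.product([0, 1], repeat=n))
subsets = [S for k in range(n + 1) for S in itertools.combinations(range(n), k)]

def fourier_coefficients(f):
    """f_hat(S) = E_x[ f(x) * (-1)^(sum_{i in S} x_i) ] over uniform x in {0,1}^n."""
    return {S: sum(f(x) * (-1) ** sum(x[i] for i in S) for x in points)
               / len(points)
            for S in subsets}

# Majority of 3 bits, written with +-1 outputs.
maj = lambda x: 1 if sum(x) >= 2 else -1
coeffs = fourier_coefficients(maj)

# Degree = size of the largest subset with a nonzero coefficient.
degree = max(len(S) for S, v in coeffs.items() if abs(v) > 1e-9)
print(degree)  # -> 3 : MAJ_3 has full Fourier degree
```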

• Lecture 10: Pattern Matrices and their spectral properties, Generalized discrepancy method. (Reading: 3.3, 4.1, 4.2 of Sherstov's thesis).

• Lecture 11: Randomized CC and the discrepancy of pattern matrices, and their relation to approximate and sign degree. (Reading: 4.3, 4.4, 4.5, 4.6 of Sherstov's thesis).

• Lecture 12: Introduction to information theory and Entropy.

• Lecture 13: Mutual information and divergence.
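
Since Lectures 12-13 work with entropy and mutual information throughout, here is a minimal sketch (in Python, with a made-up toy distribution) of the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(dist):
    """Shannon entropy H(p) = -sum_z p(z) log2 p(z) of a probability dict."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Toy joint distribution: X is a uniform bit, and Y = X with probability 3/4.
joint = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}
pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
pY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).
I = entropy(pX) + entropy(pY) - entropy(joint)
print(round(I, 4))  # -> 0.1887 : Y reveals about 0.19 bits about X
```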

• Lecture 14: Introduction to information complexity.

• Lecture 15: Information complexity of Equality.

• Lecture 16: Direct product/sum theorems, additivity of information complexity, information equals amortized communication.

• Lecture 17: A direct sum theorem using compression (from Lecture 16), Braverman-Rao's correlated sampling.

• Lecture 18: Compression using Braverman-Rao's correlated sampling (from Lecture 17). The [BBCR] compression.

• Lecture 19: Braverman's exponential compression. The external information complexity of the AND function.

• Lecture 20: The internal information complexity of the AND function.

• Lecture 21: Exact asymptotics of the randomized communication complexity of Disjointness.

• Lecture 22: Karchmer-Wigderson theorem (see Section 1.1 in Toni's note), super-logarithmic depth lower bounds via the direct sum in communication complexity (see Theorem 7 in Karchmer-Raz-Wigderson). Multiparty number-on-the-forehead model (9.1, 9.2, 9.3 of Sherstov's thesis).

• Lecture 23 (April 8th): Costin's presentation.

• Lecture 24 (April 13th): Antony's presentation.

• Lecture 25 (April 15th): Yaqiao's presentation.

• Lecture 26 (April 20th): Lena's presentation.

• Lecture 27 (April 22nd): Liana's presentation.