Explore an advanced seminar lecture on communication complexity that extends Yao's classical two-party model to distributed estimation: Alice and Bob hold probability distributions rather than single inputs, and must estimate the expected value of a function over those distributions, with communication that scales in the error parameter.

Learn about a novel debiasing protocol whose communication is linear in the inverse error 1/ε, improving on naive sampling approaches, which scale quadratically in 1/ε. Examine spectral techniques for proving lower bounds in distributed estimation, and understand why the Equality function is the easiest full-rank Boolean function for this problem class.

Finally, investigate connections between this theoretical framework and practical applications in sketching algorithms, database systems, and machine learning, along with tight lower bounds for various functions, with set-disjointness as the notable exception.
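To make the quadratic cost of the naive baseline concrete, here is a minimal simulation sketch. It is not the lecture's protocol: it only models the straightforward approach in which Alice transmits k i.i.d. samples from her distribution and Bob averages f against his own samples, so that by standard concentration bounds the additive error shrinks like 1/√k, forcing k ≈ 1/ε² samples (and hence communication quadratic in 1/ε). All names (`naive_estimate`, `mu`, `nu`) are illustrative assumptions, and Equality on 4-bit strings is used as the example function.

```python
import random

def naive_estimate(mu, nu, f, k, rng):
    """Naive sampling baseline (not the debiasing protocol): Alice sends k
    i.i.d. samples from mu; Bob pairs them with his own samples from nu
    and returns the empirical mean of f."""
    xs = [rng.choice(mu) for _ in range(k)]   # Alice's transmitted samples
    ys = [rng.choice(nu) for _ in range(k)]   # Bob's local samples
    return sum(f(x, y) for x, y in zip(xs, ys)) / k

# Equality on 4-bit strings: f(x, y) = 1 iff x == y.
eq = lambda x, y: 1.0 if x == y else 0.0

rng = random.Random(0)
support = [format(i, "04b") for i in range(16)]
mu = support                    # Alice: uniform over 4-bit strings
nu = support                    # Bob: uniform as well
true_value = 1.0 / 16           # Pr[x == y] for independent uniform draws

# Mean absolute error over repeated trials: multiplying k by 16 should
# roughly quarter the error, consistent with a 1/sqrt(k) rate and thus
# an O(1/eps^2) sample (and communication) cost for naive sampling.
mean_err = {}
for k in (100, 1600):
    errs = [abs(naive_estimate(mu, nu, eq, k, rng) - true_value)
            for _ in range(200)]
    mean_err[k] = sum(errs) / len(errs)
    print(k, round(mean_err[k], 4))
```

The debiasing protocol discussed in the lecture is precisely what beats this 1/ε² barrier, bringing the dependence on the error parameter down to linear.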