Title
Information Lower Bounds via Self-Reducibility
Abstract
We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by Kerenidis et al. (2012), and answers an open problem of Chakrabarti et al. (2012). In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by Braverman and Weinstein (Electronic Colloquium on Computational Complexity (ECCC) 18, 164, 2011). Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds two-way. Whereas numerous past results (Chakrabarti et al. 2001; Bar-Yossef et al., J. Comput. Syst. Sci. 68(4), 702–732, 2004; Barak et al. 2010) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
Year
2012
DOI
10.1007/s00224-015-9655-z
Venue
Theory of Computing Systems / Mathematical Systems Theory
Keywords
Information complexity, Communication complexity, Self-reducibility, Gap-Hamming distance, Inner product
DocType
Journal
Volume
59
Issue
2
ISSN
1432-4350
Citations
2
PageRank
0.38
References
18
Authors
4
Name | Order | Citations | PageRank
Mark Braverman | 1 | 8106 | 1.60
Ankit Garg | 2 | 1251 | 6.19
Denis Pankratov | 3 | 71 | 7.81
Omri Weinstein | 4 | 49 | 6.47