
Randomized Query to Communication lifting for low discrepancy gadgets

Formal Metadata

Title
Randomized Query to Communication lifting for low discrepancy gadgets
Alternative Title
Query-to-Communication lifting using low-discrepancy gadgets
Series Title
Number of Parts
12
Author
License
CC Attribution - NonCommercial - NoDerivatives 4.0 International:
You may use, copy, distribute, and make the work or its content publicly accessible in unaltered form for any legal, non-commercial purpose, provided you credit the author/rights holder in the manner they specify.
Identifiers
Publisher
Year of Publication
Language

Content Metadata

Subject Area
Genre
Abstract
Lifting theorems are theorems that relate the query complexity of a function f : {0, 1}^n → {0, 1} to the communication complexity of the composed function f ◦ g^n, for some "gadget" g : {0, 1}^b × {0, 1}^b → {0, 1}. Such theorems allow transferring lower bounds from query complexity to communication complexity, and have seen numerous applications in recent years. In addition, such theorems can be viewed as a strong generalization of a direct-sum theorem for the gadget g. We prove a new lifting theorem that works for all gadgets g that have logarithmic length and exponentially small discrepancy, for both deterministic and randomized communication complexity. Thus, we considerably increase the range of gadgets for which such lifting theorems hold. Our result has two main motivations: First, allowing a larger variety of gadgets may support more applications. In particular, our work is the first to prove a randomized lifting theorem for logarithmic-size gadgets, thus improving some applications of the theorem. Second, our result can be seen as a strong generalization of a direct-sum theorem for functions with low discrepancy. Joint work with Arkadev Chattopadhyay, Yuval Filmus, Or Meir, Toniann Pitassi
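To make the composition f ◦ g^n concrete, here is a minimal illustrative sketch (not from the talk): each of the n inputs to the outer function f is produced by evaluating a two-party gadget g on a pair of b-bit blocks held by Alice and Bob. The gadget used below is the inner-product gadget IP_b(x, y) = ⟨x, y⟩ mod 2, a standard example of a gadget with exponentially small discrepancy; the choice of f (here, XOR of the gadget outputs) is arbitrary and purely for illustration.

```python
def ip_gadget(x, y):
    """Inner-product gadget: <x, y> mod 2 on two b-bit blocks
    (Alice holds x, Bob holds y)."""
    return sum(xi & yi for xi, yi in zip(x, y)) % 2

def compose(f, gadget, alice_blocks, bob_blocks):
    """Evaluate the composed function (f o g^n): apply the gadget
    block-wise to the n input pairs, then feed the n bits to f."""
    z = [gadget(x, y) for x, y in zip(alice_blocks, bob_blocks)]
    return f(z)

# Example outer function: XOR of the n gadget outputs.
xor_n = lambda bits: sum(bits) % 2

# n = 2 blocks of b = 3 bits each.
alice = [[1, 0, 1], [0, 1, 1]]
bob   = [[1, 1, 0], [0, 1, 0]]
result = compose(xor_n, ip_gadget, alice, bob)  # -> 0
```

A lifting theorem then says: any cheap communication protocol for the composed function would yield a cheap query algorithm for f alone, so query lower bounds on f transfer to communication lower bounds on f ◦ g^n.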