ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR07-085 | 2nd September 2007 00:00

Multilinear Formulas, Maximal-Partition Discrepancy and Mixed-Sources Extractors


Abstract:

We study multilinear formulas, monotone arithmetic circuits, maximal-partition discrepancy, best-partition communication complexity, and extractor constructions. We start by proving lower bounds for computing an explicit polynomial in the following three subclasses of syntactically multilinear arithmetic formulas over the field C and the set of variables {x_1,...,x_n} (the three conditions are formalized after the list):

1. Noise-resistant. A syntactically multilinear formula computing a polynomial h is \eps-noise-resistant if it approximates h even when each of its edges is multiplied by an arbitrary value that is \eps-close to 1 (we think of this value as noise). Any formula is 0-noise-resistant, and, more generally, the smaller \eps is, the less restricted an \eps-noise-resistant formula is. We prove an \Omega(n/k) lower bound on the depth of 2^{-k}-noise-resistant syntactically multilinear formulas, for every integer k.

2. Non-cancelling. A syntactically multilinear formula is \tau-non-cancelling if, for every sum gate v in it, the norm of the polynomial computed by v is at least \tau times the norm of each of the two polynomials computed by the children of v. Any formula is 0-non-cancelling, and, more generally, the smaller \tau is, the less restricted a \tau-non-cancelling formula is. We prove an \Omega(n/k) lower bound on the depth of 2^{-k}-non-cancelling syntactically multilinear formulas, for every integer k.

3. Orthogonal. A syntactically multilinear arithmetic formula is orthogonal if, for every sum gate v in it, the two polynomials computed by the children of v are orthogonal (as vectors). Orthogonal syntactically multilinear formulas were first defined by Aaronson in connection with a certain type of quantum computation. We prove a tight 2^{\Omega(n)} lower bound on the size of orthogonal syntactically multilinear formulas.
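
In symbols, the three conditions can be read roughly as follows. This is our own hedged paraphrase of the informal definitions above: the edge weights c_e, the notation \hat{v} for the polynomial computed at a gate v, and the norm \|\cdot\| on coefficient vectors are shorthand introduced here, and "approximates" is left informal, as in the text.

    Noise-resistant:  for every assignment of edge weights c_e with |c_e - 1| \le \eps,
                      the formula in which each edge e is multiplied by c_e still computes
                      a polynomial that approximates h.

    Non-cancelling:   for every sum gate v with children v_1 and v_2,
                      \|\hat{v}\| \ge \tau \cdot \max(\|\hat{v}_1\|, \|\hat{v}_2\|).

    Orthogonal:       for every sum gate v with children v_1 and v_2,
                      \langle \hat{v}_1, \hat{v}_2 \rangle = 0  (as coefficient vectors).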

We also prove a tight 2^{\Omega(n)} lower bound on the size of monotone arithmetic circuits (which are not necessarily multilinear). To the best of our knowledge, the best lower bound previously known for the monotone model is 2^{\Omega(\sqrt{n})}.

One ingredient of our proof is an explicit map f from {0,1}^n to {0,1} that has exponentially small discrepancy for every partition of {1,...,n} into two sets of roughly the same size. More precisely, for every partition of {1,...,n} into two sets of size at least n/3 each, the matrix of f that corresponds to that partition has exponentially small discrepancy (the discrepancy of a matrix is the maximal difference, over all of its sub-matrices, between the number of 1's and the number of 0's in the sub-matrix, divided by the size of the matrix). We give two additional applications of this property, which is stated more explicitly after the list below:

1. Communication Complexity. The best-partition communication complexity of a map h from {0,1}^n to {0,1} is defined as the minimal communication complexity of h, where the minimum is taken over all partitions of {1,...,n} into two sets A and B of equal size (Alice gets the bits in A and Bob gets the bits in B). We prove a tight \Omega(n) lower bound on the probabilistic best-partition communication complexity of f. To the best of our knowledge, the best lower bound previously known for this model is \Omega(\sqrt{n}).

2. Mixed-2-source Extractors. A mixed-2-source is a source of randomness whose bits come from two independent sources (of size n/2 each), but arrive in a fixed yet unknown order. Using the small maximal-partition discrepancy of f, we are able to extract one almost-perfect random bit from a mixed-2-source of min-entropy (1 - \delta)n (for some constant \delta > 0). We then show how to use the same methods to extract a linear number of almost-perfect random bits from such sources.
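
For reference, the discrepancy property underlying both applications can be stated more explicitly. This is again a hedged paraphrase: the notation M_{A,B} for the partition matrix of f is introduced here, "exponentially small" is read as 2^{-\Omega(n)}, and the link to communication complexity is the standard discrepancy method rather than a statement taken from the abstract.

    For a partition {1,...,n} = A \cup B with |A|, |B| \ge n/3, let M_{A,B} be the
    2^{|A|} \times 2^{|B|} matrix whose (x,y) entry is f applied to the input that places
    the bits of x on the coordinates in A and the bits of y on the coordinates in B.  Then

        disc(M_{A,B}) = \max_R ( |#1's in R - #0's in R| / (2^{|A|} \cdot 2^{|B|}) ),

    where R ranges over all sub-matrices (combinatorial rectangles) of M_{A,B}, and the
    property is that disc(M_{A,B}) \le 2^{-\Omega(n)} for every such partition.  Via the
    discrepancy method, a bound of this form for every balanced partition yields the
    \Omega(n) lower bound on the probabilistic best-partition communication complexity of f;
    the min-entropy condition on a mixed-2-source X over {0,1}^n is the usual one,
    H_\infty(X) = \min_x \log_2(1/\Pr[X = x]) \ge (1 - \delta)n.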



ISSN 1433-8092