Given a distribution p on {-1,1}^d, we want to test whether p is uniform. If p is assumed to be a product distribution, this can be done with Theta(sqrt{d}) samples; without this assumption, things get exponentially worse: Theta(2^{d/2}) samples are necessary and sufficient. Suppose, however, that we can condition on arbitrary subsets of the bits: does the problem become easier? If so, by how much?
This is the "subcube conditional sampling model," first considered by Bhattacharyya and Chakraborty (2018). We provide a nearly optimal ~O(sqrt{d})-query algorithm for testing uniformity in this model. The key
component of our proof is a natural notion of random restriction for distributions on {-1,1}^d, and a quantitative analysis of how such a restriction affects the mean vector of the distribution. Along the way, we provide an optimal upper bound on the (unconditional) sample complexity of a natural problem, "mean testing."
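For intuition, a subcube conditional query can be simulated on a small, explicitly represented distribution. The sketch below is our own illustration of the access model only (not the paper's algorithm): a query fixes some coordinates of x in {-1,1}^d, and the oracle returns a sample from p conditioned on agreeing with that partial assignment.

```python
import itertools
import random

def make_uniform(d):
    """The uniform distribution on {-1,1}^d as an explicit probability table."""
    points = list(itertools.product([-1, 1], repeat=d))
    return {x: 1.0 / len(points) for x in points}

def subcube_sample(p, fixed, rng=random):
    """Sample x ~ p conditioned on x[i] == v for every (i, v) in `fixed`.

    `fixed` is a dict mapping coordinate indices to values in {-1, 1};
    the points consistent with it form a subcube of {-1,1}^d.
    """
    support = [x for x in p if all(x[i] == v for i, v in fixed.items())]
    weights = [p[x] for x in support]
    return rng.choices(support, weights=weights, k=1)[0]

# Example: condition the uniform distribution on {-1,1}^3 on x[0] = 1;
# every returned sample lies in that subcube.
p = make_uniform(3)
x = subcube_sample(p, {0: 1})
assert x[0] == 1 and len(x) == 3
```

Note that an explicit table takes 2^d space, so this simulation is only feasible for tiny d; the point of the model is that a tester gets such conditional samples as an oracle, without ever writing p down.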
Joint work with Xi Chen, Gautam Kamath, Amit Levi, and Erik Waingarten.