Ari | Last updated: Tue 5 Mar 2024 15:28:59 GMT


The Completeness Of Statistical Distance For HVSZK

In this note, we prove the seminal result that all statements that have statistical zero-knowledge proofs reduce to the Statistical Distance problem. To do so, we begin by defining what the Statistical Distance (SD) problem is, and in order to describe SD, we first need to describe what we mean by distributions encoded by circuits. The major technical results are then developed in the sections that follow.

Open Problem

This reduction only works for distributions encoded by circuits. If we could not inspect a circuit but instead only drew samples from a black-box oracle, then it is an open problem whether this reduction still holds.

Distributions Encoded By Circuits

Let X be a Boolean circuit (with possibly unbounded fan-in) with m input gates and n output gates. The distribution encoded by X is the probability distribution on the set \{ 0, 1\}^n induced by feeding X inputs sampled uniformly at random from \{ 0, 1\}^m. For the remainder of this document, we will often abuse notation by referring to X directly as the distribution rather than the circuit.

When we sample from distributions this way, we will often say that we have white-box access to the distribution.
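To make the white-box picture concrete, here is a minimal Python sketch (my own illustration, with made-up names such as sample and toy_and_circuit) in which a "circuit" is simply a function from m input bits to n output bits:

```python
import random

def sample(circuit, m):
    """Draw one sample from the distribution encoded by `circuit`:
    feed it m uniformly random bits and return its output bits."""
    input_bits = tuple(random.randint(0, 1) for _ in range(m))
    return circuit(input_bits)

# A toy "circuit" with m = 2 input gates and n = 1 output gate: it
# outputs the AND of its inputs, so it encodes the distribution on
# {0, 1} with Pr[output = (1,)] = 1/4.
def toy_and_circuit(bits):
    a, b = bits
    return (a & b,)

print(sample(toy_and_circuit, 2))
```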

Statistical Difference Promise Problem

For constants 0 \leq \beta < \alpha \leq 1, Statistical Difference (\mathsf{SD}^{\alpha, \beta}) is the promise problem (\mathsf{SD}^{\alpha, \beta}_\mathsf{Yes}, \mathsf{SD}^{\alpha, \beta}_\mathsf{No}), where

\begin{equation*} \mathsf{SD}^{\alpha, \beta}_\mathsf{Yes}= \{(X,Y) : d_\mathtt{TV} (X, Y) > \alpha \} \end{equation*}

\begin{equation*} \mathsf{SD}^{\alpha, \beta}_\mathsf{No}= \{(X,Y) : d_\mathtt{TV} (X, Y) < \beta \} \end{equation*}

where X and Y are distributions encoded by circuits.
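Because the circuits are given explicitly, d_\mathtt{TV} can in principle be computed by brute force when the number of input gates m is small. The following Python sketch (the helper names encoded_distribution and tv_distance are mine) enumerates all 2^m inputs to recover the exact output distribution and then applies d_\mathtt{TV}(X, Y) = \frac{1}{2} \sum_z |\Pr[X = z] - \Pr[Y = z]|:

```python
from collections import Counter
from itertools import product

def encoded_distribution(circuit, m):
    """Exact output distribution of a circuit with m input gates,
    obtained by enumerating all 2^m equally likely inputs."""
    counts = Counter(circuit(bits) for bits in product((0, 1), repeat=m))
    return {output: count / 2**m for output, count in counts.items()}

def tv_distance(p, q):
    """d_TV(P, Q) = (1/2) * sum over z of |P(z) - Q(z)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(z, 0.0) - q.get(z, 0.0)) for z in support)
```

Of course this enumeration is exponential in m, which is exactly why \mathsf{SD} is posed as a promise problem rather than something we can decide directly.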

Act 1: Statistical Distance Is In HVSZK

The first thing to show is that there is an honest-verifier statistical zero-knowledge proof system for the Statistical Distance promise problem. The polarisation lemma is the key trick that we need to understand here.
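As a rough sketch of that protocol (assuming the pair has already been polarised so that d_\mathtt{TV}(X, Y) is either very close to 1 or very close to 0; the helper names below, and the exact-distribution helper from the earlier sketch, are my own assumptions): the verifier secretly flips a bit b, sends one sample from X if b = 0 and from Y if b = 1, and accepts iff the prover guesses b correctly.

```python
import random

def verifier_challenge(X, Y, m):
    """Verifier: flip a hidden bit b and send one sample drawn from
    X if b = 0, or from Y if b = 1 (each circuit takes m input bits)."""
    b = random.randint(0, 1)
    bits = tuple(random.randint(0, 1) for _ in range(m))
    z = X(bits) if b == 0 else Y(bits)
    return b, z

def prover_guess(z, pX, pY):
    """(Unbounded) prover: maximum-likelihood guess of which
    distribution the sample z came from, given exact distributions
    pX and pY (e.g. computed via encoded_distribution above)."""
    return 0 if pX.get(z, 0.0) >= pY.get(z, 0.0) else 1

# The verifier accepts iff prover_guess(...) == b.  When d_TV(X, Y) is
# close to 1 the guess is correct with high probability (completeness);
# when it is close to 0 no prover beats roughly 1/2 (soundness).  The
# honest verifier learns nothing beyond its own coin b and sample z,
# which is where the zero-knowledge simulator comes from.
```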


Act 2a: Public Coin HVSZK Karp Reduces To Statistical Closeness

We have some promise problem \Pi=(\Pi^\mathsf{Yes}, \Pi^\mathsf{No}) and a public coin HVSZK proof system (\mathsf{P}, \mathsf{V}) for \Pi with constant completeness and soundness error. Let \mathsf{Sim} denote the simulator for the proof system described above. Given x \in \Pi^\mathsf{Yes}\cup \Pi^\mathsf{No}, we want to use the simulator \mathsf{Sim} to come up with two distributions X and Y such that

\begin{equation*} x \in \Pi^\mathsf{Yes}\implies d_\mathtt{TV} (X, Y) \leq \frac{1}{3} \end{equation*}

\begin{equation*} x \in \Pi^\mathsf{No}\implies d_\mathtt{TV} (X, Y) > \frac{2}{3} \end{equation*}

NOTE: Statistical Closeness is the complement of the Statistical Distance promise problem.
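As a very rough illustration of where X and Y might come from (this is only the intuition; the actual reduction stitches together all rounds of the protocol and the simulator's accept bit, and the sim interface below is an assumption of mine, not the construction itself): in each round one compares the simulated transcript prefix ending with the verifier's coins against the same prefix with those coins replaced by genuinely uniform coins, which is what an honest public-coin verifier would actually send.

```python
import random

# Assumed interface: sim(x) returns (transcript, accept), where transcript
# is a list of (verifier_coins, prover_message) pairs of a public-coin
# protocol and verifier_coins is a tuple of coin_len bits.

def sample_X(sim, x, i):
    """Simulated prefix up to round i, ending with the simulator's
    version of the verifier's round-i coins."""
    transcript, _accept = sim(x)
    coins_i, _msg = transcript[i]
    return tuple(transcript[:i]), coins_i

def sample_Y(sim, x, i, coin_len):
    """The same simulated prefix, but with the round-i coins replaced
    by fresh uniform coins, as an honest public-coin verifier sends."""
    transcript, _accept = sim(x)
    fresh = tuple(random.randint(0, 1) for _ in range(coin_len))
    return tuple(transcript[:i]), fresh

# Intuition: on Yes instances the simulator's output is statistically
# close to the real interaction, where the coins really are uniform, so
# these samplers are close; on No instances they cannot all be close,
# since otherwise one could build a cheating prover and break soundness.
```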

Act 2b: Private Coin HVSZK Karp Reduces To Entropy Difference

We have some promise problem \Pi=(\Pi^\mathsf{Yes}, \Pi^\mathsf{No}) and a private coin HVSZK proof system (\mathsf{P}, \mathsf{V}) for \Pi with completeness and soundness error of 2^{-40}. Let \mathsf{Sim} denote the simulator for the proof system described above. Given x \in \Pi^\mathsf{Yes}\cup \Pi^\mathsf{No}, define the security parameter as \kappa= |x|. We want to use the output of the simulator \mathsf{Sim}(1^\kappa, x) to come up with two distributions X and Y such that

\begin{equation*} x \in \Pi^\mathsf{Yes}\implies \mathsf{H}(X) \geq \mathsf{H}(Y) + 1 \end{equation*}

\begin{equation*} x \in \Pi^\mathsf{No}\implies \mathsf{H}(Y) \geq \mathsf{H}(X) + 1 \end{equation*}

where \mathsf{H}(X) and \mathsf{H}(Y) refer to the entropy of the distributions X and Y.
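The entropy here is the Shannon entropy of the output distribution. A minimal sketch of the quantity being compared (reusing the hypothetical encoded_distribution helper from earlier):

```python
from math import log2

def shannon_entropy(dist):
    """H(X) = -sum over z of Pr[X = z] * log2(Pr[X = z]), for a
    distribution given as a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# e.g. the toy AND circuit's output distribution {(0,): 3/4, (1,): 1/4}
# has entropy about 0.811 bits.
```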


Act 3: Entropy Difference Karp Reduces To Statistical Difference

This is the final piece of the puzzle: showing that the Entropy Difference problem (ED) Karp reduces to Statistical Difference.