- it.information-theory boolean-functions
- Updated Sun, 22 May 2022 09:00:29 GMT

Say we have a function $f:\mathbb{Z}_2^n \to \mathbb{R}$ such that $$\forall x\in \mathbb{Z}_2^n \quad f(x) \in \left\{\frac{1}{2^n}, \frac{2}{2^n}, \ldots, \frac{2^n}{2^n} \right\},$$ and $f$ is a distribution, i.e., $\sum_{x\in \mathbb{Z}_2^n} f(x) = 1$.

The Shannon entropy of $f$ is defined as follows: $$H(f) = -\sum _{x \in \mathbb{Z}_2^n} f(x) \log \left( f(x) \right) .$$
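For concreteness, this definition can be sketched in a few lines of Python (my own illustration, not part of the original post; I fix the logarithm base to 2, which matches the entropy values used later):

```python
import math

def shannon_entropy(f):
    """Shannon entropy H(f) = -sum f(x) * log2(f(x)), with 0 * log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in f if p > 0)

# Uniform distribution on Z_2^n for n = 3: entropy is n = 3 bits.
n = 3
uniform = [1 / 2**n] * 2**n
print(shannon_entropy(uniform))  # 3.0
```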

Let $\epsilon$ be some constant. Say we get an $\epsilon$-noisy version of $f(x)$, i.e., we get a function $\tilde{f}:\mathbb{Z}_2^n \to \mathbb{R}$ such that $|\tilde{f}(x)- f(x) | < \epsilon$ for every $x\in \mathbb{Z}_2^n$. What is the effect of the noise on the entropy? That is, can we bound $H(\tilde{f})$ by a "reasonable" function of $\epsilon$ and $H(f)$, such as: $$(1-\epsilon)H(f) < H(\tilde{f}) < (1+\epsilon)H(f),$$ or even, $$(1-\epsilon^c n)^d H(f) < H(\tilde{f}) < (1+\epsilon^c n)^d H(f),$$ for some constants $c,d$.

*Edit: Since I am trying to get a feeling for the effect of noise on Shannon entropy, any
"reasonable" additive bound on $H(\tilde{f})$ would also be very interesting.*
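One way to get a first feel for the question is to probe it numerically. The sketch below (my own illustration, not from the original post) builds one particular $\epsilon$-noisy version of the uniform distribution by shifting $\epsilon/2$ of mass between paired points, and compares the two entropies:

```python
import math

def H(f):
    # Shannon entropy in bits (base-2 log), with 0 * log(0) taken as 0
    return -sum(p * math.log2(p) for p in f if p > 0)

n, eps = 4, 0.01
N = 2 ** n
f = [1 / N] * N  # uniform distribution on Z_2^n, so H(f) = n

# One eps-noisy version: shift eps/2 of mass between paired points,
# so the result is still a distribution and |f~(x) - f(x)| < eps.
ft = list(f)
for i in range(0, N, 2):
    ft[i] += eps / 2
    ft[i + 1] -= eps / 2

print(H(f))   # 4.0 (the uniform distribution maximizes entropy)
print(H(ft))  # slightly below 4.0
```

Of course, one perturbation says nothing about the worst case; the answer below shows that the worst case is much more dramatic.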

Such a bound is not possible. Consider the case where $f$ is the distribution that is uniform over some set $S$ of size $2^{\delta \cdot n}$, and let $\tilde{f}$ be the distribution that, with probability $\delta$, outputs a uniformly random element of $S$, and otherwise outputs a uniformly random string in $\mathbb{Z}_2^n$.

It is not hard to see that to get from $f$ to $\tilde{f}$ you only need noise of at most $(1-\delta) \cdot 2^{-\delta \cdot n}$. However, $H(f)= \delta \cdot n$, while $H(\tilde{f}) \approx (1 - \delta + \delta^2) \cdot n$. Thus, you get an entropy difference of $\approx (1 - \delta)^2 \cdot n$ for arbitrarily small $\delta$, at an extremely low noise level.

In particular, you can set $\delta = \frac{\log(1/\varepsilon)}{n}$, and obtain noise $\varepsilon$ and entropy difference $\approx n - 2\log(1/\varepsilon)$.
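As a sanity check on this counterexample, here is a small Python sketch (my own, with $\delta$ chosen so that $\delta n$ is an integer) that computes the noise level and both entropies for $n = 20$, $\delta = 1/4$:

```python
import math

def H(f):
    # Shannon entropy in bits, with 0 * log(0) taken as 0
    return -sum(p * math.log2(p) for p in f if p > 0)

n, delta = 20, 0.25          # chosen so that delta * n is an integer
N = 2 ** n                   # size of the domain Z_2^n
S = 2 ** int(delta * n)      # size of the support set S

# f: uniform over a set S of size 2^{delta n}
f = [1 / S] * S + [0.0] * (N - S)

# f~: with probability delta, a uniform element of S; otherwise a
# uniform string from the whole domain
ft = [delta / S + (1 - delta) / N] * S + [(1 - delta) / N] * (N - S)

noise = max(abs(a - b) for a, b in zip(f, ft))
print(noise)  # at most (1 - delta) * 2^(-delta * n), about 0.023 here
print(H(f))   # delta * n = 5.0
print(H(ft))  # about (1 - delta + delta^2) * n, up to lower-order terms
```

The entropy gap is indeed on the order of $(1-\delta)^2 \cdot n$ even though the pointwise noise is tiny.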

- +1 – You mean *approximately* $\epsilon n$, right? In any case, I think it might help the asker to answer about an additive bound as well, and not just about a multiplicative bound (I know he asked specifically about multiplicative). — Aug 24, 2012 at 22:34
- +0 – Thanks for the correction. I don't know what the answer is for an additive bound. — Aug 25, 2012 at 00:14
- +0 – Thanks for the answer, Or! It is hopeless to get a multiplicative bound for a function with entropy $0$. However, what about functions with entropy greater than $0$? Is it possible to get such a bound then? — Aug 25, 2012 at 04:44
- +0 – @DanaMoshkovitz - The case of an additive bound is indeed very relevant. I'll add it to the question. Thanks for pointing it out! — Aug 25, 2012 at 04:50
- +0 – @OrMeir - Slightly tweaking your example yields a function that contradicts the first type of multiplicative bound I asked for (even for functions with $H(f) \neq 0$); however, I couldn't find such an example for the second type of multiplicative bound I asked about (assuming $H(f) \neq 0$). Any ideas? — Aug 25, 2012 at 05:13