Say we have a function $f:\mathbb{Z}_2^n \to \mathbb{R}$ such that $$\forall x\in \mathbb{Z}_2^n \quad f(x) \in \left\{\frac{1}{2^n}, \frac{2}{2^n}, \ldots, \frac{2^n}{2^n} \right\},$$ and $f$ is a distribution, i.e., $\sum_{x\in \mathbb{Z}_2^n} f(x) = 1$.
The Shannon entropy of $f$ is defined as follows: $$H(f) = -\sum _{x \in \mathbb{Z}_2^n} f(x) \log \left( f(x) \right) .$$
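A quick numerical sanity check of this definition, assuming logarithms are taken base 2 so that entropy is measured in bits (the function name `shannon_entropy` is mine):

```python
import math

def shannon_entropy(f):
    """H(f) = -sum_x f(x) log2 f(x), skipping zero-mass points."""
    return -sum(p * math.log2(p) for p in f if p > 0)

# Uniform distribution on Z_2^n for n = 3: each of the 2^3 points has mass 1/8.
uniform = [1 / 8] * 8
print(shannon_entropy(uniform))  # 3.0, i.e. n bits, the maximum over Z_2^n
```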
Let $\epsilon$ be some constant. Say we get an $\epsilon$-noisy version of $f(x)$, i.e., we get a function $\tilde{f}:\mathbb{Z}_2^n \to \mathbb{R}$ such that $|\tilde{f}(x)- f(x) | < \epsilon$ for every $x\in \mathbb{Z}_2^n$. What is the effect of the noise on the entropy? That is, can we bound $H(\tilde{f})$ by a "reasonable" function of $\epsilon$ and $H(f)$, such as: $$(1-\epsilon)H(f) < H(\tilde{f}) < (1+\epsilon)H(f),$$ or even, $$(1-\epsilon^c n)^d H(f) < H(\tilde{f}) < (1+\epsilon^c n)^d H(f),$$ for some constants $c,d$.
Edit: Since I am trying to get a feeling for the effect of noise on Shannon entropy, any "reasonable" additive bound on $H(\tilde{f})$ would also be very interesting.
Such a bound is not possible. Consider the case where $f$ is the distribution that is uniform over some set $S$ of size $2^{\delta \cdot n}$, and let $\tilde{f}$ be the distribution that with probability $\delta$ outputs a uniformly distributed element of $S$, and otherwise outputs a uniformly distributed string in $\mathbb{Z}_2^n$.
It is not hard to see that to get from $f$ to $\tilde{f}$ one only needs noise of at most $(1-\delta) \cdot 2^{-\delta \cdot n}$. However, $H(f) = \delta \cdot n$, while $H(\tilde{f}) \approx (1 - \delta + \delta^2) \cdot n$. Thus, for arbitrarily small $\delta$, an extremely low level of noise yields an entropy gap of $(1-\delta)^2 \cdot n$.
In particular, you can set $\delta = \frac{\log(1/\varepsilon)}{n}$, and obtain noise $\varepsilon$ and entropy difference $\approx n - 2\log(1/\varepsilon)$.
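The counterexample can be checked numerically; here is a sketch with parameters of my own choosing ($n = 20$, $\delta = 1/4$, logs base 2):

```python
import math

def H(f):
    """Shannon entropy in bits, skipping zero-mass points."""
    return -sum(p * math.log2(p) for p in f if p > 0)

n = 20
delta = 0.25                 # chosen so that delta * n is an integer
N = 2 ** n                   # size of Z_2^n
S = 2 ** int(delta * n)      # |S| = 2^{delta n}, the support of f

# f: uniform on S.  g: the noisy mixture -- with probability delta a uniform
# element of S, otherwise a uniform string in Z_2^n.
f = [1 / S] * S + [0.0] * (N - S)
g = [delta / S + (1 - delta) / N] * S + [(1 - delta) / N] * (N - S)

noise = max(abs(a - b) for a, b in zip(f, g))
print(noise, (1 - delta) / S)               # noise <= (1-delta) * 2^{-delta n}
print(H(f), delta * n)                      # H(f) = delta * n = 5 bits
print(H(g), (1 - delta + delta ** 2) * n)   # H(g) close to (1-delta+delta^2) * n
```

With these parameters the pointwise noise is about $0.023$, yet the entropy gap $H(\tilde{f}) - H(f)$ exceeds $(1-\delta)^2 \cdot n = 11.25$ bits (the slight excess over $(1-\delta)^2 n$ is the mixture's binary-entropy term $H(\delta) \le 1$).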