In "standard" error reduction with an expander, if a randomize algorithm uses $n^d$ random bits, we need $n^d+O(n)$ random bits to achieve $2^{-O(n)}$ error probability. Now, if the algorithm has a good error probability to begin with, say 1/n, without expanders we need to run it only O(n/log n) times to achieve the same error and use $n^d\times O(n/\log n)$ random bits. If I'm correct with expanders it doesn't help (beside a little in the first step), meaning we still need $n^d+O(n)$ random bits. There exist some method, such that a good algorithm does have an affect on it (uses say $n^d+O(n/\log n)$ random bits)? There exist some barrier for achieving such a result?
*The question treats the randomized algorithm as a black box.
You can't expect this to be true, since you can reduce the error of a constant-error algorithm to $1/n$ using an additional $O(\log n)$ random bits via expanders (and, by the way, you don't even need these extra bits to do that). So if an error of $1/n$ could be brought down to $\exp(-n)$ using only $O(n/\log n)$ additional bits, then you could always go from constant error to $\exp(-n)$ error using $O(\log n)+O(n/\log n)=O(n/\log n)$ additional bits, which is far less than the $\Omega(n)$ extra random bits that black-box amplification to error $\exp(-n)$ is known to require.
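To make the accounting explicit, here is a sketch using the standard expander-walk bound (a length-$t$ walk on a constant-degree expander over the seed space $\{0,1\}^{n^d}$ costs $n^d+O(t)$ random bits and, with a majority vote, drives constant error down to $2^{-\Omega(t)}$, so $t=O(\log n)$ already gives error $1/n$):

$$\underbrace{O(\log n)}_{\text{const.\ error}\;\to\;1/n}\;+\;\underbrace{O\!\left(\frac{n}{\log n}\right)}_{\text{hypothetical: }1/n\;\to\;\exp(-n)}\;=\;O\!\left(\frac{n}{\log n}\right)\;=\;o(n)\ \text{extra bits.}$$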