[IEEE Trans. on Information Theory, November 1994, pp. 1926-1938]
Asymptotic Bounds on Optimal Noisy Channel Quantization
Via Random Coding
Kenneth Zeger and Vic Manzella
Abstract
Asymptotically optimal zero-delay vector quantization in the presence of
channel noise is studied using random coding techniques. First, an upper bound
is derived for the average rth-power distortion of channel-optimized
k-dimensional vector quantization at transmission rate R on a binary
symmetric channel with bit error probability $\epsilon$. The upper bound
asymptotically equals
$2^{-rR g(\epsilon,k,r)}$, where $\frac{k}{k+r} \left[
{1 - \log_2 \left( 1 + 2\sqrt {\epsilon ( 1 - \epsilon )} \right) }
\right] \leq g(\epsilon,k,r) \leq 1$ for all $\epsilon \geq 0$,
$\lim_{\epsilon \rightarrow 0} g(\epsilon,k,r) = 1$, and $\lim_{k \rightarrow
\infty} g(\epsilon,k,r) = 1$. Numerical computations of $g(\epsilon,k,r)$ are
also given. This result is analogous to Zador's asymptotic distortion rate of
$2^{-rR}$ for quantization on noiseless channels. Next, using a random coding
argument on nonredundant index assignments, a useful upper bound, expressed in
terms of point density functions, is derived on the minimum mean squared error
of high-resolution, regular vector quantizers in the presence of channel noise.
The formula provides an accurate approximation to the distortion of a
noisy channel quantizer whose codebook is arbitrarily ordered. Finally, it is
shown that the minimum mean squared distortion of a regular, noisy channel VQ
with a randomized nonredundant index assignment is, in probability,
asymptotically bounded away from zero.
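
As a numerical illustration of the first result, the short Python sketch below
tabulates the quoted lower bound
$\frac{k}{k+r}\left[1 - \log_2\left(1 + 2\sqrt{\epsilon(1-\epsilon)}\right)\right]$
on the exponent $g(\epsilon,k,r)$ and the corresponding asymptotic distortion
bound $2^{-rR g(\epsilon,k,r)}$. It only evaluates the formula stated above; it
is not the paper's computation of $g(\epsilon,k,r)$ itself, and the parameter
choices ($r = 2$ for mean squared error, $R = 8$, and the grids of $k$ and
$\epsilon$) as well as the function names are illustrative assumptions.

import math

def g_lower_bound(eps, k, r):
    # Lower bound (k/(k+r)) * [1 - log2(1 + 2*sqrt(eps*(1-eps)))] on g(eps, k, r),
    # as stated in the abstract; eps is the bit error probability of the BSC.
    return (k / (k + r)) * (1.0 - math.log2(1.0 + 2.0 * math.sqrt(eps * (1.0 - eps))))

def distortion_bound(eps, k, r, R):
    # Asymptotic distortion upper bound 2^{-r*R*g}, evaluated here with the
    # lower bound on g, which yields a conservative (larger) value.
    return 2.0 ** (-r * R * g_lower_bound(eps, k, r))

if __name__ == "__main__":
    r, R = 2, 8.0  # assumed: mean squared error (r = 2) and rate R = 8 bits
    for k in (1, 2, 4, 8, 16):
        for eps in (0.0, 0.001, 0.01, 0.1):
            g = g_lower_bound(eps, k, r)
            print(f"k={k:2d}  eps={eps:<6}  g_lower={g:.4f}  "
                  f"2^(-r*R*g_lower)={distortion_bound(eps, k, r, R):.3e}")

Note that at $\epsilon = 0$ the lower bound reduces to $k/(k+r)$ even though
$g(\epsilon,k,r) \rightarrow 1$ as $\epsilon \rightarrow 0$, so the printed
values are conservative at small $\epsilon$.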