4488 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 10, OCTOBER 2008
List Decoding of Biorthogonal Codes and the
Hadamard Transform With Linear Complexity
Ilya Dumer, Fellow, IEEE, Grigory Kabatiansky, and Cédric Tavernier
Abstract—Let a biorthogonal Reed–Muller code RM(1, m) of length n = 2^m be used on a memoryless channel with an input alphabet ±1 and a real-valued output ℝ. Given any nonzero received vector y in the Euclidean space ℝⁿ and some parameter ε ∈ (0, 1), our goal is to perform list decoding of the code RM(1, m) and retrieve all codewords located within the angle arccos ε from y. For an arbitrarily small ε, we design an algorithm that outputs this list of codewords with a complexity of linear order in n bit operations. Without loss of generality, let vector y also be scaled to the Euclidean length √n of the transmitted vectors. Then an equivalent task is to retrieve all coefficients of the Hadamard transform of vector y whose absolute values exceed εn. Thus, this decoding algorithm retrieves all ε-significant coefficients of the Hadamard transform with linear complexity, instead of the complexity of order n log₂ n of the full Hadamard transform.
Index Terms—Biorthogonal codes, Hadamard transform, soft-decision list decoding.
I. INTRODUCTION
BIORTHOGONAL (first-order) Reed–Muller codes RM(1, m) have been extensively used in communications and addressed in many papers since the 1960s. These codes have optimal parameters and achieve the maximum possible distance d = n/2 for the given length n = 2^m and dimension m + 1. One renowned decoding algorithm designed by Green [1] performs maximum-likelihood decoding of RM(1, m) codes and finds the distances from the received vector to all 2^{m+1} codewords of RM(1, m) with a complexity of order n log₂ n bit operations. Another algorithm, designed by Litsyn and Shekhovtsov [2], performs bounded-distance decoding and corrects any error pattern of weight less than n/4 with linear complexity O(n).
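The fast Hadamard transform underlying the Green machine can be sketched as follows. This is a minimal Python illustration by the editors, not code from the paper; the function name fht and the Sylvester (natural) ordering of the coefficients are our choices.

```python
import numpy as np

def fht(y):
    """Fast Hadamard (Walsh) transform via butterfly operations.

    Returns the inner products of y with every row of the +/-1 Sylvester
    Hadamard matrix H_n, i.e., all n coefficients at once.  Each of the
    log2(n) stages performs n additions/subtractions.
    """
    y = np.asarray(y, dtype=float).copy()
    n = y.size
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = y[j], y[j + h]
                y[j], y[j + h] = a + b, a - b
        h *= 2
    return y

# sanity check on a codeword: row 3 of H_8 transforms to 8 at index 3, 0 elsewhere
H = [[(-1) ** bin(a & x).count("1") for x in range(8)] for a in range(8)]
coeffs = fht(H[3])
```

Since the log₂ n stages each cost n additions or subtractions, this gives the n log₂ n operation count quoted above for maximum-likelihood decoding.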
Manuscript received July 2, 2007. Current version published September 17, 2008. The work of I. Dumer was supported in part by the National Science Foundation under Grants CCF0622242 and CCF0635339. The work of G. Kabatiansky was supported in part by the Russian Foundation for Fundamental Research under Grants 06-01-00226 and 08-07-92495. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Nice, France, June 2007.
I. Dumer is with the Department of Electrical Engineering, University of California, Riverside, CA 92521 USA (e-mail: dumer@ee.ucr.edu).
G. Kabatiansky is with the Institute for Information Transmission Problems, Moscow 101447, Russia, and with INRIA, Rocquencourt, France (e-mail: kaba@iitp.ru).
C. Tavernier is with Communications and Systems (CS), Le Plessis Robinson, France (e-mail: tavernier.cedric@gmail.com).
Communicated by T. Etzion, Associate Editor for Coding Theory.
Digital Object Identifier 10.1109/TIT.2008.929014

In the area of probabilistic decoding, a major breakthrough has been achieved by Goldreich and Levin [3]. Their algorithm takes any received vector and outputs the list of codewords of RM(1, m) within a decoding radius n(1 − ε)/2, performing this task with a high probability 1 − δ and a low poly-logarithmic complexity for any ε > 0 and δ > 0. Recently, list decoding of RM(1, m) codes has been extended to deterministic algorithms. In particular, the algorithm of [4] performs error-free list decoding within the radius n(1 − ε)/2 with linear complexity for any received vector.
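To make the list-decoding task concrete, here is a brute-force soft-decision list decoder built on the full Hadamard transform. This Python sketch is the editors' own illustration (with an O(n²) matrix product standing in for the fast transform purely for brevity); it is the benchmark that a linear-complexity algorithm improves on, not the algorithm of this paper.

```python
import numpy as np

def hadamard(n):
    # Sylvester +/-1 Hadamard matrix: H[a, x] = (-1)^popcount(a & x), n a power of 2
    return np.array([[(-1) ** bin(a & x).count("1") for x in range(n)]
                     for a in range(n)])

def list_decode(y, eps):
    """All +/-1 codewords c of the biorthogonal code with <y, c> >= eps * n.

    The codewords are the rows of H_n and their negations, so one full
    Hadamard transform of y lists every inner product <y, H[a]> at once;
    the sign of an eps-significant coefficient selects +H[a] or -H[a].
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    H = hadamard(n)
    coeffs = H @ y                      # the n Hadamard coefficients of y
    return [int(np.sign(t)) * H[a]
            for a, t in enumerate(coeffs) if abs(t) >= eps * n]

# toy run: transmit a codeword, flip two of its +/-1 symbols
n = 16
sent = -hadamard(n)[5]
received = sent.copy()
received[[0, 7]] *= -1
decoded = list_decode(received, eps=0.5)
```

In the toy run, the transmitted codeword keeps inner product 12 with the corrupted vector, while every other codeword scores at most 4 in absolute value, so the output list contains exactly the sent word.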
This paper advances the results of [4] in two different directions. First, we extend list decoding of RM(1, m) codes to an arbitrary memoryless semi-continuous channel. Second, the former complexity of [4] will be reduced further, while remaining linear in the block length n. In doing so, we use the following setup.
Let a binary vector c = (c₁, ..., cₙ) be mapped onto the Euclidean vector ĉ with symbols ĉᵢ = (−1)^{cᵢ}. Given two binary vectors b and c, consider the Hamming distance d(b, c), the Euclidean distance ‖b̂ − ĉ‖, and the inner product ⟨b̂, ĉ⟩ of their maps b̂ and ĉ. Then

    ‖b̂ − ĉ‖² = 2n − 2⟨b̂, ĉ⟩ = 4 d(b, c).                  (1)

Now any binary code is mapped into the cube {−1, +1}ⁿ, which in turn belongs to the Euclidean sphere of radius √n in the Euclidean space ℝⁿ. Thus, any binary code of Hamming distance d becomes a spherical code, where two different codewords have the inner product at most n − 2d and the angle at least arccos(1 − 2d/n).
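Identity (1) is easy to verify numerically. The following sketch (our own, with arbitrary toy parameters) checks both the inner-product form ⟨b̂, ĉ⟩ = n − 2 d(b, c) and the squared-distance form on random binary vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
b = rng.integers(0, 2, n)               # random binary vectors
c = rng.integers(0, 2, n)
bh = (-1.0) ** b                        # Euclidean images with symbols +/-1
ch = (-1.0) ** c
d = int(np.sum(b != c))                 # Hamming distance d(b, c)

# each disagreeing position contributes (1 - (-1))^2 = 4 to the squared distance
assert np.sum((bh - ch) ** 2) == 4 * d
# equivalently, the inner product of the maps is n - 2d
assert np.dot(bh, ch) == n - 2 * d
```

Since both sides of (1) are integers here, the comparisons are exact.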
Below, we consider a memoryless channel with an input alphabet ±1 and some larger output alphabet Y (usually, Y = ℝ). We use a code on this channel and replace an output yᵢ in any position i with its log-likelihood ratio

    f(yᵢ) = ln ( Pr{yᵢ | 1} / Pr{yᵢ | −1} ).

We then call y = (f(y₁), ..., f(yₙ)) a received vector. Note that any codeword ĉ has a higher posterior probability Pr{ĉ | y} than another codeword ĉ′ if it also has a larger inner product ⟨ĉ, y⟩.

Note that all codewords become equiprobable for y = 0; therefore, we will assume that y ≠ 0. Without loss of generality, we can multiply y by the scalar √(n/⟨y, y⟩), where ⟨y, y⟩ is the squared Euclidean length of vector y. Then all vectors ĉ and y belong to the same sphere of radius √n.
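As a concrete instance of this setup, consider a binary symmetric channel with crossover probability p (our own toy choice; the paper's setting is a general memoryless channel). The log-likelihood ratios and the scaling to the sphere of radius √n then look as follows.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n, p = 16, 0.1                          # toy block length and crossover probability
sent = rng.choice([-1.0, 1.0], size=n)  # transmitted +/-1 symbols
flips = rng.random(n) < p
received = np.where(flips, -sent, sent) # BSC output

# log-likelihood ratio of each output: ln Pr{y_i | 1} / Pr{y_i | -1}.
# For a BSC this is +/- ln((1 - p) / p), taking the sign of the received symbol.
llr = received * math.log((1 - p) / p)

# scale by sqrt(n / <y, y>) so that the received vector lies on the
# same sphere of radius sqrt(n) as the codewords
y = llr * math.sqrt(n / np.dot(llr, llr))
```

After scaling, ⟨y, y⟩ = n exactly, so y and every ĉ lie on the common sphere; the scaling does not change the ranking of codewords by ⟨ĉ, y⟩.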
We now proceed with the biorthogonal codes. Let f(x) = a₀ + a₁x₁ + ... + a_m x_m be any affine Boolean function defined on all points x = (x₁, ..., x_m) of GF(2)^m.
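The ±1 images of these affine functions are the 2^{m+1} codewords of the biorthogonal code. A small check (our own sketch; the helper name affine_image is ours) confirms the biorthogonality: distinct codewords are either orthogonal or antipodal.

```python
import itertools
import numpy as np

m = 3
n = 2 ** m
points = list(itertools.product([0, 1], repeat=m))   # all x in GF(2)^m

def affine_image(a0, a):
    # +/-1 image of the affine Boolean function f(x) = a0 + a.x (mod 2)
    return np.array([(-1) ** ((a0 + sum(ai * xi for ai, xi in zip(a, x))) % 2)
                     for x in points])

# one codeword per affine function: 2 choices of a0 times 2^m choices of a
code = [affine_image(a0, a)
        for a0 in (0, 1)
        for a in itertools.product([0, 1], repeat=m)]
assert len(code) == 2 ** (m + 1)

# biorthogonality: distinct codewords have inner product 0 (orthogonal)
# or -n (antipodal pair, differing only in the constant term a0)
for u, v in itertools.combinations(code, 2):
    assert np.dot(u, v) in (0, -n)
```

Equivalently, these images are exactly the rows of the n × n Hadamard matrix and their negations, which is why one Hadamard transform evaluates all codeword correlations at once.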