Transforms
(This content is available online at <http://cnx.org/content/m16324/1.10/>.)
This book focuses on the discrete Fourier transform (DFT), discrete convolution, and, particularly, the fast algorithms to calculate them. These topics have been at the center of digital signal processing since its beginning, and new results in hardware, theory, and applications continue to keep them important and exciting.
As far as we can tell, Gauss was the first to propose the techniques we now call the fast Fourier transform (FFT), using them in 1805 to calculate the coefficients of a trigonometric expansion of an asteroid’s orbit [174]. However, it was the seminal paper by Cooley and
Tukey [88] in 1965 that caught the attention of the science and
engineering community and, in a way, founded the discipline of
digital signal processing (DSP).
The impact of the Cooley-Tukey FFT was enormous. Problems that had not even been considered a few years earlier could now be solved quickly. A flurry of research expanded the theory, developed excellent practical programs, and opened new applications [94]. In 1976, Winograd published a short paper [403] that set a
second flurry of research in motion [86]. This was another type of algorithm, one that expanded the data lengths that could be transformed efficiently and reduced the number of multiplications required. The groundwork for this algorithm had been laid earlier by Good [148] and by Rader [308]. In 1997, Frigo and Johnson developed a program they called the FFTW (fastest Fourier transform in the west) [130], [135], which combines many of the ideas in other algorithms with new results to give a robust, very fast system for general data lengths on a variety of computer and DSP architectures. This work won the 1999 Wilkinson Prize for Numerical Software.
It is hard to overemphasize the importance of the DFT, convolution, and fast algorithms. With a history that goes back to Gauss [174] and a compilation of references on these topics that by 1995 had grown to over 2400 entries [362], the FFT may be the most important numerical algorithm in science, engineering, and applied mathematics. New theoretical results are still appearing, advances in computers and hardware continually restate the basic questions, and new applications open new areas for research. It is hoped that this book will provide the background, references, programs, and incentive to encourage further research and results in this area, as well as tools for practical applications.
Studying the FFT is valuable not only for understanding a powerful tool; it is also a prototype or example of how algorithms can be made efficient and how a theory can be developed to define optimality. The history of this development also gives insight into the process of research, where timing and serendipity play interesting roles.
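As a small illustration of that point (this sketch is not taken from the book, and the function names dft_direct and fft_radix2 are made up for the example), the short Python fragment below computes the same transform two ways: directly from the DFT definition, which takes on the order of N^2 operations, and with a recursive radix-2 Cooley-Tukey split, which takes on the order of N log N operations and assumes the length is a power of two.

    # A minimal sketch contrasting a direct O(N^2) DFT with a recursive
    # radix-2 Cooley-Tukey FFT; len(x) must be a power of two for fft_radix2.
    # Illustrative only, not one of the book's programs.
    import cmath

    def dft_direct(x):
        """Compute the DFT directly from its definition (about N^2 operations)."""
        N = len(x)
        return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
                for k in range(N)]

    def fft_radix2(x):
        """Recursive radix-2 Cooley-Tukey FFT (about N log N operations)."""
        N = len(x)
        if N == 1:
            return list(x)
        even = fft_radix2(x[0::2])   # DFT of even-indexed samples
        odd = fft_radix2(x[1::2])    # DFT of odd-indexed samples
        twiddle = [cmath.exp(-2j * cmath.pi * k / N) * odd[k] for k in range(N // 2)]
        return ([even[k] + twiddle[k] for k in range(N // 2)] +
                [even[k] - twiddle[k] for k in range(N // 2)])

    if __name__ == "__main__":
        x = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
        error = max(abs(a - b) for a, b in zip(dft_direct(x), fft_radix2(x)))
        print(error)   # the two results agree to rounding error

Both functions return the same coefficients to rounding error; only the arrangement of the arithmetic, and hence the operation count, differs, which is exactly the kind of restructuring this book studies.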
Much of the material contained in this book has been collected over 40 years of teaching and research in DSP; it is therefore difficult to say just where it all came from. Some comes from my earlier FFT book [59], which was sponsored by Texas Instruments, and
some from the FFT chapter in [217]. Certainly the interaction with people like Jim Cooley and Charlie Rader was central, but the work with graduate students and undergraduates was probably the most formative. I would particularly like to acknowledge Ramesh Agarwal, Howard Johnson, Mike Heideman, Henrik Sorensen, Doug Jones, Ivan Selesnick, Haitao Guo, and Gary Sitton. Interaction with my colleagues Tom Parks, Hans Schuessler, Al Oppenheim, and Sanjit Mitra has been essential over many years. Support has
come from the NSF, Texas Instruments, and the wonderful teach-
ing and research environment at Rice University and in the IEEE
Signal Processing Society.
Several chapters or sections are written by authors who have extensive experience and depth in the particular topics. Ivan Selesnick has written several papers on the design of short FFTs to be used in the prime factor algorithm (PFA) FFT and on the automatic design of these short FFTs. Markus Püschel has developed a theoretical framework for “Algebraic Signal Processing” that allows a structured generation of FFT programs, and a system called “Spiral” for automatically generating algorithms for a specific architecture. Steven Johnson, along with his colleague Matteo Frigo, created, developed, and now maintains the powerful FFTW system: the Fastest Fourier Transform in the West. I sincerely thank these authors for their significant contributions.
I would also like to thank Prentice Hall, Inc., which returned the copyright on The DFT as Convolution or Filtering (Chapter 5) of Advanced Topics in Signal Processing [49], around which some of this book is built. The content of this book is in the Connexions (http://cnx.org/content/col10550/) repository and, therefore, is available for online use, PDF downloading, or purchase as a printed, bound physical book. I certainly want to thank Daniel Williamson, Amy Kavalewitz, and the staff of Connexions for their invaluable help. Additional FFT material can be found in Connexions, particularly content by Doug Jones [205], Ivan Selesnick
[205], and Howard Johnson [205]. Note that this book and all the content in Connexions are copyrighted under the Creative Commons Attribution license (http://creativecommons.org/).
If readers find errors in any of the modules of this collection or
have suggestions for improvements or additions, please email the
author of the collection or module.
C. Sidney Burrus
Houston, Texas
October 20, 2008