This book focuses on the discrete Fourier transform (DFT), discrete convolution, and, particularly, the fast algorithms to calculate them. These topics have been at the center of digital signal processing since its beginning, and new results in hardware, theory and applications continue to keep them important and exciting.
As far as we can tell, Gauss was the first to propose the techniques that we now call the fast Fourier transform (FFT) for calculating the coefficients in a trigonometric expansion of an asteroid's orbit in 1805 [9]. However, it was the seminal paper by Cooley and Tukey [5] in 1965 that caught the attention of the science and engineering community and, in a way, founded the discipline of digital signal processing (DSP).
The impact of the Cooley-Tukey FFT was enormous. Problems could be solved quickly that were not even considered a few years earlier. A flurry of research expanded the theory, developed excellent practical programs, and opened new applications [3]. In 1976, Winograd published a short paper [14] that set a second flurry of research in motion [4]. This was another type of algorithm that expanded the data lengths that could be transformed efficiently and reduced the number of multiplications required. The groundwork for this algorithm had been laid earlier by Good [8] and by Rader [12]. In 1997, Frigo and Johnson developed a program they called FFTW (the fastest Fourier transform in the west) [7], [6], which combines many of the ideas in other algorithms with new results to give a robust, very fast system for general data lengths on a variety of computer and DSP architectures. This work won the 1999 Wilkinson Prize for Numerical Software.
It is hard to overemphasize the importance of the DFT, convolution, and fast algorithms. With a history that goes back to Gauss [9] and a compilation of references on these topics that in 1995 resulted in over 2400 entries [13], the FFT may be the most important numerical algorithm in science, engineering, and applied mathematics. New theoretical results are still appearing, advances in computers and hardware continually restate the basic questions, and new applications open new areas for research. It is hoped that this book will provide the background, references, programs, and incentive to encourage further research and results in this area, as well as provide tools for practical applications.
Studying the FFT is valuable not only for understanding a powerful tool; the FFT is also a prototype of how algorithms can be made efficient and how a theory can be developed to define optimality. The history of this development also gives insight into the process of research, where timing and serendipity play interesting roles.
Much of the material contained in this book has been collected over 40 years of teaching and research in DSP; it is therefore difficult to attribute just where it all came from. Some comes from my earlier FFT book [1], which was sponsored by Texas Instruments, and some from the FFT chapter in [11]. Certainly the interaction with people like Jim Cooley and Charlie Rader was central, but the work with graduate students and undergraduates was probably the most formative. I would particularly like to acknowledge Ramesh Agarwal, Howard Johnson, Mike Heideman, Henrik Sorensen, Doug Jones, Ivan Selesnick, Haitao Guo, and Gary Sitton. Interaction with my colleagues Tom Parks, Hans Schuessler, Al Oppenheim, and Sanjit Mitra has been essential over many years. Support has come from the NSF, Texas Instruments, and the wonderful teaching and research environment at Rice University and in the IEEE Signal Processing Society.
Several chapters or sections are written by authors who have extensive experience and depth working on the particular topics. Ivan Selesnick has written several papers on the design of short FFTs to be used in the prime factor algorithm (PFA) FFT and on the automatic design of these short FFTs. Markus Püschel has developed a theoretical framework for "Algebraic Signal Processing," which allows a structured generation of FFT programs, and a system called "Spiral" for automatically generating algorithms specifically for an architecture. Steven Johnson, along with his colleague Matteo Frigo, created, developed, and now maintains the powerful FFTW system: the Fastest Fourier Transform in the West. I sincerely thank these authors for their significant contributions.
I would also like to thank Prentice Hall, Inc., which returned the copyright on the chapter "The DFT as Convolution or Filtering" of Advanced Topics in Signal Processing [2], around which some of this book is built. The content of this book is in the Connexions (http://cnx.org/content/col10550/) repository and, therefore, is available for on-line use, PDF downloading, or purchase as a printed, bound physical book. I certainly want to thank Daniel Williamson, Amy Kavalewitz, and the staff of Connexions for their invaluable help. Additional FFT material can be found in Connexions, particularly content by Doug Jones [10], Ivan Selesnick [10], and Howard Johnson [10]. Note that this book and all the content in Connexions are copyrighted under the Creative Commons Attribution license (http://creativecommons.org/).
If readers find errors in any of the modules of this collection or have suggestions for improvements or additions, please email the author of the collection or module.
C. Sidney Burrus
Houston, Texas
October 20, 2008
[1] Burrus, C. S. and Parks, T. W. (1985). DFT/FFT and Convolution Algorithms. New York: John Wiley & Sons.
[2] Burrus, C. S. (1988). Efficient Fourier Transform and Convolution Algorithms. In Lim, J. S. and Oppenheim, A. V. (Eds.), Advanced Topics in Signal Processing (pp. 199–245). Englewood Cliffs, NJ: Prentice-Hall.
[3] Cooley, James W., Lewis, Peter A. W., and Welch, Peter D. (1967, June). Historical Notes on the Fast Fourier Transform. IEEE Transactions on Audio and Electroacoustics, 15, 260–262.
[4] Cooley, J. W. (1992, January). How the FFT Gained Acceptance. IEEE Signal Processing Magazine, 9(1), 10–13. [Also presented at the ACM Conference on the History of Scientific and Numerical Computation, Princeton, NJ, May 1987, and published in: A History of Scientific Computing, edited by S. G. Nash, Addison-Wesley, 1990, pp. 133–140.]
[5] Cooley, J. W. and Tukey, J. W. (1965). An Algorithm for the Machine Calculation of Complex Fourier Series. Math. Computat., 19, 297–301.
[6] Frigo, Matteo and Johnson, Steven G. (2005, February). The Design and Implementation of FFTW. Proceedings of the IEEE, 93(2), 216–231.
[7] Frigo, Matteo and Johnson, Steven G. (1997, September). The Fastest Fourier Transform in the West. Technical Report MIT-LCS-TR-728. Cambridge, MA: MIT Laboratory for Computer Science.
[8] Good, I. J. (1958). Interaction Algorithm and Practical Fourier Analysis. J. Royal Statist. Soc., B, 20, 361–372. [Addendum: vol. 22, 1960, pp. 372–375.]
[9] Heideman, M. T., Johnson, D. H., and Burrus, C. S. (1984, October). Gauss and the History of the FFT. IEEE Acoustics, Speech, and Signal Processing Magazine, 1(4), 14–21. [Also in Archive for History of Exact Sciences, 1985.]
[10] Jones, Douglas L. (2007, February). The DFT, FFT, and Practical Spectral Analysis. Connexions. http://cnx.org/content/col10281/1.2/
[11] Lim, J. S. and Oppenheim, A. V. (Eds.). (1988). Advanced Topics in Signal Processing. Englewood Cliffs, NJ: Prentice-Hall.
[12] Rader, C. M. (1968, June). Discrete Fourier Transforms When the Number of Data Samples is Prime. Proceedings of the IEEE, 56, 1107–1108.
[13] Sorensen, Henrik V., Burrus, C. Sidney, and Heideman, Michael T. (1995). Fast Fourier Transform Database. Boston: PWS Publishing.
[14] Winograd, S. (1976, April). On Computing the Discrete Fourier Transform. Proc. National Academy of Sciences, USA, 73, 1005–1006.