Abstract
The Fourier transform of Boolean functions has received considerable attention in recent years in the computational learning theory community, and has come to play an important role in proving many learnability results. The aim of this work is to demonstrate that Fourier transform techniques also yield a useful and practical algorithm, in addition to having many interesting theoretical properties. In fact, this work was prompted by a genuine problem brought to our attention: researchers at a company were trying to find a method to reverse-engineer a state-free controller. They had the ability to query the controller on any input, which placed them in the membership query model, the model in which the Fourier transform algorithm operates.
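As a concrete illustration of the setting, here is a minimal sketch, not the paper's actual implementation, of estimating a single Fourier coefficient of a Boolean function f: {0,1}^n → {-1,+1} via membership queries on uniformly random inputs. The function names and parameters below are our own for illustration:

```python
import random

def estimate_fourier_coefficient(f, n, S, num_samples=10_000):
    """Estimate the Fourier coefficient f_hat(S) = E_x[f(x) * chi_S(x)]
    of a Boolean function f: {0,1}^n -> {-1,+1}, using membership
    queries at uniformly random inputs."""
    total = 0
    for _ in range(num_samples):
        x = [random.randint(0, 1) for _ in range(n)]
        # Character chi_S(x) = (-1)^(sum of x[i] over i in S)
        chi = (-1) ** sum(x[i] for i in S)
        total += f(x) * chi  # each evaluation of f is one membership query
    return total / num_samples

# Example: parity on bits {0, 2}; its coefficient on S = {0, 2} is exactly 1.
parity = lambda x: (-1) ** (x[0] ^ x[2])
est = estimate_fourier_coefficient(parity, n=4, S=[0, 2])
```

Standard concentration bounds give the number of samples needed to estimate a coefficient to a desired accuracy with high probability.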
To keep the algorithm's running time reasonable while still producing accurate hypotheses, we had to perform many optimizations. In the paper we discuss the more prominent ones, those that were crucial and without which the algorithm's performance would severely deteriorate. One of the benefits we present is the confidence level the algorithm attaches to each prediction, which measures the likelihood that the prediction is correct.
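The paper's precise confidence measure is defined in the body of the article; one natural sketch, under the assumption that the hypothesis is a sparse set of Fourier coefficients, is to predict the sign of the reconstructed value and report its magnitude as the confidence. The helper below is hypothetical, not the authors' code:

```python
def predict_with_confidence(coefficients, x):
    """Given a sparse Fourier hypothesis {S: f_hat(S)} (S a tuple of
    indices) and an input x in {0,1}^n, predict the sign of the
    reconstructed value and report its magnitude, clipped to [0, 1],
    as a rough confidence score (larger magnitude -> more reliable)."""
    value = sum(
        coef * (-1) ** sum(x[i] for i in S)
        for S, coef in coefficients.items()
    )
    prediction = 1 if value >= 0 else -1
    confidence = min(abs(value), 1.0)  # clip to [0, 1]
    return prediction, confidence

# Hypothesis with a single heavy coefficient on S = {0, 2}:
pred, conf = predict_with_confidence({(0, 2): 0.9}, [1, 0, 1, 0])
```

Predictions reconstructed from coefficients that nearly exhaust the function's spectral weight yield values close to ±1, so a small magnitude signals that the hypothesis is missing weight on the relevant characters.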
Received July 2, 1996
Cite this article
Mansour, Y., Sahar, S. Implementation Issues in the Fourier Transform Algorithm. Machine Learning 40, 5–33 (2000). https://doi.org/10.1023/A:1011034100370