
Feasible Iteration of Feasible Learning Functionals

  • Conference paper
Algorithmic Learning Theory (ALT 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4754)


Abstract

For learning functions in the limit, an algorithmic learner obtains successively more data about a function and, in a sequence of trials, outputs corresponding programs which, one hopes, eventually converge to a correct program for the function. We provide a feasible version of this learning in the limit: each trial is conducted feasibly, and there is a feasible limit on the number of trials allowed. Each trial is given by a basic feasible functional, which may query the input function for its values. An additional tally argument 0^i is supplied to the functional for the execution of the i-th trial, so that more time resource is available for each successive trial. The number of trials is feasibly limited by feasibly counting them down from a feasible notation for a constructive ordinal. Since all processes are feasible, their termination is feasibly detectable, so one can wait for the trials to terminate and suppress all the output programs but the last. Hence, although there is still an iteration of trials, this learning is a special case of what has long been known as total Fin-learning, i.e., learning in the limit where, on each function, the learner always outputs exactly one conjectured program. Our general main results provide strict learning hierarchies in which the trial count-down involves all and only notations for infinite limit ordinals. For our hierarchies featuring finitely many limit-ordinal jumps, we have upper and lower bounds on the total run time of our feasible Fin-learners in terms of finite stacks of exponentials. We provide, though, an example of how to regain feasibility by a suitable parameterized complexity analysis.
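The trial-iteration scheme described in the abstract can be sketched in a few lines. The Python fragment below is a hypothetical illustration, not the authors' construction: it simplifies the ordinal count-down to a plain natural number n (a notation for a finite ordinal), models the tally argument 0^i as a string of i zeros, and uses a toy trial that merely conjectures a finite table of the target function. The names `learn_fin` and `sample_trial` and the sampling strategy are all invented for this sketch.

```python
def learn_fin(trial, f, n):
    """Run trials i = 1, ..., n, counting down from the finite ordinal n.

    Each trial is assumed feasible; the tally argument "0"*i grows with i,
    so later trials may spend more time. All intermediate conjectures are
    suppressed and only the final program is emitted, as in total
    Fin-learning.
    """
    program = None
    for i in range(1, n + 1):
        tally = "0" * i            # tally argument 0^i: extra resource per trial
        program = trial(f, tally)  # query access to f; feasible in |tally|
    return program                 # suppress all but the last conjecture

def sample_trial(f, tally):
    """Toy trial: conjecture f's table of values on [0, |tally|)."""
    return [f(x) for x in range(len(tally))]

guess = learn_fin(sample_trial, lambda x: x * x, 4)
# -> [0, 1, 4, 9]
```

Externally, the driver behaves as a single-conjecture learner: since every trial terminates detectably, the iteration can be hidden and only the final program surfaces, which is why the scheme collapses into a special case of total Fin-learning.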

Case and Paddock were supported in part by NSF grant CCR-0208616. We are also grateful to anonymous referees for many helpful suggestions. One such referee provided hints regarding the truth, and the truth and proof, respectively, of what became Lemmas 6 and 7; hence, these results are joint work with that referee. This same referee suggested, for the future, team learning as an approach to studying some probabilistic variants of our learning criteria.




Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Case, J., Kötzing, T., Paddock, T. (2007). Feasible Iteration of Feasible Learning Functionals. In: Hutter, M., Servedio, R.A., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2007. Lecture Notes in Computer Science, vol 4754. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75225-7_7


  • DOI: https://doi.org/10.1007/978-3-540-75225-7_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-75224-0

  • Online ISBN: 978-3-540-75225-7

  • eBook Packages: Computer Science, Computer Science (R0)
