Reliability, Number of Stations, and Examination Length in an Objective Structured Clinical Examination

Chapter in Advances in Medical Education

Summary

Objective Structured Clinical Examinations (OSCEs) are widely used to document medical competence, but reliability concerns raise questions about the number of stations and the testing time needed for reproducible performance estimates. For example, when Standardized Patients (SPs) are used, Swanson and Norcini have estimated that 8 hours of testing, or approximately 24 twenty-minute stations, are needed to reliably assess information gathering and communication skills.1 The University of Michigan's (UM) third-year OSCE consists of 12 stations using a variety of formats to assess a range of clinical skills. Generalizability theory was used to assess the reliability of the existing examination and to estimate the changes needed for improvement. The results indicated that the 12-station exam has a reliability of approximately .60. If the exam were expanded to 20 stations, reliability would increase to the low .70s. To obtain a score reliability of 0.80, approximately 30 stations, or 10 hours of testing, would be necessary. Because the UM OSCE was designed to assess a range of clinical skills, it may introduce variance from content area and skill domain as well as from student and test-related sources, which may account for the relatively lower reliability estimates compared with the Swanson and Norcini study.
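The relationship between the number of stations and score reliability reported above can be approximated with the classical Spearman–Brown projection, a simpler analogue of the D-study the chapter actually performed with generalizability theory. The sketch below is illustrative only (the function name and the use of Spearman–Brown rather than a full G-theory D-study are assumptions, not the authors' method), but starting from the reported .60 reliability at 12 stations it reproduces the chapter's figures for 20 and 30 stations closely.

```python
def projected_reliability(base_rel: float, base_stations: int,
                          new_stations: int) -> float:
    """Spearman-Brown projection of score reliability when a test is
    lengthened (or shortened) by a factor k = new_stations / base_stations.

    Hypothetical helper for illustration; the chapter itself used a
    generalizability-theory D-study rather than this classical formula.
    """
    k = new_stations / base_stations
    return k * base_rel / (1 + (k - 1) * base_rel)


# Reported baseline: 12 stations with reliability ~.60
for n in (12, 20, 30):
    print(f"{n} stations -> projected reliability {projected_reliability(0.60, 12, n):.2f}")
```

Running this yields roughly .60 at 12 stations, .71 at 20 stations (the "low .70s"), and .79 at 30 stations, consistent with the summary's estimate that about 30 twenty-minute stations (10 hours) are needed to approach 0.80.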


References

  1. Swanson DB, Norcini JJ. Factors influencing the reproducibility of tests using standardized patients. Teaching and Learning in Medicine 1989;1:158–166.

  2. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education 1979;13:41–54.

  3. Robb KV, Rothman AI. The assessment of clinical skills in general internal medicine residents: Comparison of an objective structured clinical examination to a conventional oral examination. Annals of the Royal College of Physicians and Surgeons of Canada 1985;88:235–238.

  4. Brennan RL. Elements of Generalizability Theory. American College Testing Program, 1983.


Copyright information

© 1997 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Gruppen, L.D., Davis, W.K., Fitzgerald, J.T., McQuillan, M.A. (1997). Reliability, Number of Stations, and Examination Length in an Objective Structured Clinical Examination. In: Scherpbier, A.J.J.A., van der Vleuten, C.P.M., Rethans, J.J., van der Steeg, A.F.W. (eds) Advances in Medical Education. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-4886-3_133

  • DOI: https://doi.org/10.1007/978-94-011-4886-3_133

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-6048-6

  • Online ISBN: 978-94-011-4886-3
