Abstract
The ‘Simulated Surgery’ is an alternative consulting skills component of the Membership examination of the Royal College of General Practitioners (MRCGP), the professional certifying examination for GP registrars (family medicine residents) in the UK. It consists of a 20-station OSCE and is taken by a small cohort of candidates (10–30) who are unable to provide videotapes of their patient interviews for assessment. The passmark for this examination has been set by a modified contrasting groups method, in which every examiner makes a pass/fail judgement on every candidate's performance by reviewing that candidate's whole-test grades. A consistent passmark was obtained for two different cohorts, and this method should allow a constant passing standard to be maintained under changing circumstances in the future.
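The core of a contrasting groups approach is to compare the score distributions of candidates whom the judges classified as passing and as failing, and to place the cut score where the two distributions are best separated. As a minimal illustrative sketch only (the function name, inputs, and the simple misclassification criterion are assumptions, not the authors' exact procedure, which modifies the classical method), the idea can be expressed as:

```python
def contrasting_groups_passmark(pass_scores, fail_scores):
    """Choose the cut score that minimises misclassification between
    judge-classified 'pass' and 'fail' candidates (a common way to
    locate the intersection of the two score distributions)."""
    # Consider every observed score as a candidate cut point.
    cut_points = sorted(set(pass_scores) | set(fail_scores))
    best_cut, best_err = None, float("inf")
    for cut in cut_points:
        # Errors: pass-judged candidates scoring below the cut,
        # plus fail-judged candidates scoring at or above it.
        err = sum(s < cut for s in pass_scores) + sum(s >= cut for s in fail_scores)
        if err < best_err:
            best_cut, best_err = cut, err
    return best_cut

# Hypothetical whole-test scores for judge-classified groups:
cut = contrasting_groups_passmark([70, 75, 80], [50, 55, 60])
```

With fully separated groups, as in the hypothetical data above, the cut falls at the lowest pass-judged score; with overlapping distributions, the minimum-error criterion places it inside the region of overlap.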
Burrows, P.J., Bingham, L. & Brailovsky, C.A. A Modified Contrasting Groups Method Used for Setting the Passmark in a Small Scale Standardised Patient Examination. Adv Health Sci Educ Theory Pract 4, 145–154 (1999). https://doi.org/10.1023/A:1009826701445