Predicting medical graduates’ clinical performance using national competency examination results in Indonesia

Background Indonesia has administered a national competency exit examination for medical graduates since 2014, the Indonesia Medical Doctor National Competency Examination (IMDNCE). The examination is intended to ensure the competence of medical graduates from the 83 medical schools currently operating in Indonesia. Although many studies have evaluated medical licensing examinations, few have examined how a national licensing examination relates to graduates' subsequent clinical practice. Aims This study aimed to evaluate the performance of new medical doctors in Indonesia during their internship after completing the IMDNCE, and to determine whether the IMDNCE might serve as a predictive indicator of new doctors' clinical performance. Methods An observational cross-sectional study was performed in November–December 2017 on 209 newly graduated doctors. Thirty-one senior doctors from a range of regions in Indonesia, recruited and trained beforehand, conducted the observations. The Clinical Performance Instrument (CPI) was developed to evaluate the new doctors' clinical competence over a three-week observation period. The data were analysed using descriptive statistics and correlated with the IMDNCE scores. Results The mean (95% CI) CPI score for all participants was 83.0 (80.8–85.2), with no correlation between CPI and IMDNCE results in the domains of communication, professionalism and patient safety (p > 0.05). However, the mean total CPI score of doctors who graduated from public medical schools was higher than that of graduates from private medical schools, and scores also differed by the institution's accreditation grade (p < 0.05). Conclusion There was no statistical correlation between the clinical performance of new medical doctors during their internship and their CBT and OSCE scores in the national competency examination. New doctors' performance during internship appears to be affected by more complex factors, not only their level of competence.


Background
National medical competency examinations are conducted in many countries to guarantee that graduating medical doctors are competent according to the required standard. These tests of doctors' abilities are administered to safeguard the quality of health care, with patient safety as one of the main considerations for holding a national competency examination [1]. A study in the US indicated that higher achievement on the national medical examination was correlated with a decrease in patient mortality [2]. Hence, developing a standardized national competency examination is particularly important in countries such as Indonesia that have high mortality rates.
National medical competency examinations have been widely utilised as an evaluation tool in medical education [3-5]. Such examinations can assess communication, professionalism, patient safety, clinical management, and many other medical skills. National examinations can be used to measure knowledge, skills and attitudes of clinical professionalism comprehensively [6], to assess professional development [5], to predict medical doctors' future clinical performance [7], and to predict their performance in subsequent medical training [3,8]. Norcini et al. [2] also reported a performance difference between doctors who participated in a national licensing examination and those who did not.

National medical competency examination in Indonesia
The Indonesia Medical Doctor National Competency Examination (IMDNCE) is a national medical competency exit examination established in 2014 under Indonesian Medical Education Act No. 20/2013. It consists of two components: multiple-choice questions delivered by computer-based testing (MCQs-CBT) to assess candidates' knowledge, and an Objective Structured Clinical Examination (OSCE) to assess candidates' clinical skills performance. The IMDNCE is an important tool, not only to evaluate students' achievement against the national standard, but also to evaluate medical graduates' knowledge/skills, accountability and development [9].

Medical internship programme in Indonesia
Newly graduated doctors in Indonesia are required to undergo a one-year internship programme. This compulsory programme serves as the new doctors' first medical practice. During the programme, each new doctor is assigned to a district hospital or a primary health-care facility under the supervision of a senior doctor. New doctors are allowed to conduct independent medical practice only after completing the one-year internship programme.

Rationale
Many studies have reported evaluations of national licensing examinations [3,4,8,10]. However, studies on the effect of a national licensing examination on subsequent medical practice are scarce. This study aimed to evaluate the performance of new medical doctors in Indonesia during their internship after completing the IMDNCE. In addition, this study investigated whether IMDNCE performance might serve as a predictive indicator of new doctors' clinical performance.

Study design and setting
This observational, cross-sectional study was conducted from November to December 2017 in a range of regions in Indonesia. The details of the study protocol are presented in Fig. 1.

Participants
A total of 209 newly graduated doctors who joined the internship programme between February 2017 and February 2018 participated in this study. Medical schools in Indonesia are grouped into regions; hence, purposive sampling was applied to ensure representativeness of each region, of the schools' nature (i.e., public or private), and of the accreditation level of the medical schools (i.e., from the highest, A, through B to the lowest, C) from which the new doctors graduated (see Fig. 1).
In addition, 31 senior doctors from a range of regions in Indonesia were recruited as observers. The observers were trained to observe the new doctors' clinical performance using a validated instrument. The observation was conducted over a three-week period, and the observers were requested to evaluate each new doctor's clinical performance at least once a week, yielding three observation reports per doctor over the observation period. Observations took place in a range of clinical settings, such as the outpatient clinic, emergency room, in-patient ward and field visit service. The observers were informed of the general purpose of the study but were blinded to the characteristics of the interns to avoid possible bias.

Instrument
This study applied the Clinical Performance Instrument (CPI), which was based on the IMDNCE OSCE examination rubric. The examination comprises seven domains: anamnesis, physical examination, supportive examination, clinical diagnosis and differential diagnosis, patient management (pharmacological and non-pharmacological), health education, and professionalism. An expert panel was consulted during CPI development to settle on the three domains of the CPI. The CPI then underwent a validity study with 58 medical doctors practising in primary care settings. Using a Likert scale for each item, the doctors completed the instrument based on their perceptions and experience during clinical practice. The instrument showed good reliability (Cronbach's alpha = 0.93) [11].
The CPI consists of three domains of competence (i.e., doctor-patient communication/K1, professionalism/K2 and patient safety/K3), which together incorporate 25 observation items rated on a 4-point scale (see Table 1). Technical skills (i.e., physical examination, procedural skills, clinical reasoning and written communication) fall under the patient safety (K3) domain.
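To make the scoring structure concrete, the following is a minimal illustrative sketch, assuming the total CPI score is the plain sum of the 25 item ratings (each 1-4, maximum 100); the item-to-domain split below is purely hypothetical, not taken from Table 1.

```python
# Hypothetical CPI scoring sketch. The assignment of item indices to the
# K1/K2/K3 domains is illustrative only; the real allocation is in Table 1.
DOMAIN_ITEMS = {
    "K1_communication": range(0, 8),     # assumed 8 items
    "K2_professionalism": range(8, 15),  # assumed 7 items
    "K3_patient_safety": range(15, 25),  # assumed 10 items
}

def cpi_scores(ratings):
    """Return per-domain and total CPI scores for one observation."""
    if len(ratings) != 25 or not all(1 <= r <= 4 for r in ratings):
        raise ValueError("expected 25 item ratings on a 1-4 scale")
    scores = {name: sum(ratings[i] for i in idx)
              for name, idx in DOMAIN_ITEMS.items()}
    scores["total"] = sum(ratings)  # maximum possible total is 25 * 4 = 100
    return scores

# One illustrative observation: ten items rated 4, fifteen rated 3.
scores = cpi_scores([4] * 10 + [3] * 15)
print(scores["total"])  # 85
```

Under this assumed scheme, a mean total of 83.0 corresponds directly to 83% of the maximum attainable score.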

Data analysis
The obtained data were analysed using descriptive statistics to compute the mean, median and standard deviation of the three weeks' observation scores. The CPI scores were compared by the medical schools' status and accreditation level using the Mann-Whitney and Kruskal-Wallis tests.
The CPI observation scores were subsequently tested against the IMDNCE scores (computer-based test/CBT and OSCE) using correlation and multiple linear regression analyses to measure predictive value. Multiple linear regression measures the relationship between several independent variables and an outcome; in this instance, the IMDNCE scores served as predictors of the CPI score (significant if p < 0.05).
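The analysis pipeline described above can be sketched as follows. This is not the authors' code: it runs the same family of tests (Mann-Whitney, Kruskal-Wallis, rank correlation, and an ordinary-least-squares regression of CPI on CBT and OSCE) on synthetic data with assumed group sizes and score distributions.

```python
# Illustrative analysis sketch on synthetic data; group sizes and score
# distributions are assumptions, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Mann-Whitney: CPI by school status (public vs private; sizes assumed).
cpi_public = rng.normal(84, 6, 120)
cpi_private = rng.normal(81, 6, 89)
u, p_status = stats.mannwhitneyu(cpi_public, cpi_private,
                                 alternative="two-sided")

# Kruskal-Wallis: CPI by accreditation level (A/B/C; sizes assumed).
cpi_a, cpi_b, cpi_c = (rng.normal(m, 6, n)
                       for m, n in [(85, 70), (82, 70), (80, 69)])
h, p_accred = stats.kruskal(cpi_a, cpi_b, cpi_c)

# Rank correlation of CPI against the IMDNCE CBT score.
cbt = rng.normal(70, 8, 209)
cpi = rng.normal(83, 6, 209)
rho, p_corr = stats.spearmanr(cbt, cpi)

# Multiple linear regression: CPI ~ intercept + CBT + OSCE (plain OLS).
osce = rng.normal(75, 7, 209)
X = np.column_stack([np.ones(209), cbt, osce])
beta, *_ = np.linalg.lstsq(X, cpi, rcond=None)
print(f"p_status={p_status:.3f} p_accred={p_accred:.3f} rho={rho:.3f}")
```

In the study itself, the nonparametric group comparisons were significant (p < 0.05) while the correlations with IMDNCE scores were not (p = 0.368-0.928).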

Ethical considerations
This study was ethically approved by the Committee of Research Ethics, Faculty of Medicine, Universitas Sebelas Maret (No: 1066/XII/HREC/2016). All study participants (i.e., new doctors and observers) gave their consent before the observer training and the three-week observation.

Demographics
The 209 new doctors graduated from medical schools with different accreditation levels (A, B and C), institutional status (private and public) and regions of origin. They were also conducting their internships in a range of regions (Table 2).

CPI observation scores
The mean (95% CI) of the CPI for all participants was 83.0 (80.8–85.2), as presented in Table 3. Furthermore, the CPI observation scores were tested against the IMDNCE CBT and OSCE scores of the corresponding domains (i.e., communication, professionalism and patient safety). There was no correlation between the CPI observation scores and either the IMDNCE CBT or OSCE scores (p = 0.368–0.928), as outlined in Table 4.

Discussion
IMDNCE serves as a means of standardization as well as quality assurance of medical school graduates in Indonesia. As a standardized exit examination, IMDNCE is expected to be able to predict the performance of fresh graduate doctors in their early phase of clinical practice, especially during the internship [1,12,13]. Standardization of graduates through IMDNCE has also been expected to assure the quality of Indonesian doctors.
This expectation is reflected in the study results, which showed no difference between CBT and OSCE scores and the clinical observation results during internship (p > 0.05); there were likewise no differences in performance in the communication, professionalism and patient safety competences. These results may suggest that, through the IMDNCE, doctors can demonstrate standardized-quality medical care, and that the test items (CBT and OSCE) can serve as tools to standardize the quality of clinical performance. National examinations are necessary to assure the quality of doctor candidates given the diversity of medical education institutions [14]. This diversity is influenced by several factors spanning the input, process and output of the curriculum. The Indonesian Doctors Standard of Competencies (SKDI) 2012 acts as a national standard for the medical education process, with seven areas of competence. However, implementation of the standard is still influenced by other factors that may affect the quality of medical education in Indonesia, e.g. human resources, learning resources, research development and organization, curriculum development and innovation, and a comprehensive internal quality assurance process. These factors emphasize the need for output standardization through the IMDNCE.
Institutional factors are not the only contributors to the quality of output and the clinical performance of medical school graduates. Internship, as the initial phase of clinical practice in real settings for Indonesian medical school graduates, may be affected by several factors that are not taught during the educational period [15,16]. The national health-care system, with its universal health coverage principle, is a major influence on clinical performance [1]. The Indonesian health-care system prioritizes the strengthening of primary services, whether promotive, preventive, curative or rehabilitative, in which the intern doctors serve [17]; this, in turn, leads to a shift in the quality of care affected by the number of patients and the length of service time. The situation might differ if both the preclinical and clinical phases of education were closer to the ideal in theory and in clinical practice. Although the professional performance of the physicians in this study was not influenced by the CBT and OSCE output scores, it is necessary to review the efficiency of physicians' work in the implementation of the national health-care system, especially in primary care. This study identified that the IMDNCE results did not predict the new doctors' clinical performance scores. This result might reflect that the interns' practice does not yet represent the competences in the Indonesian Doctors Standard of Competences (SKDI), on which the IMDNCE items are based. The new doctors could not exercise the full medical authority stated in the SKDI because they were still under the supervision of senior doctors. Moreover, the real clinical setting might differ from what the new doctors had learnt during their education.
For instance, some of the new doctors were assigned to a district hospital while others were in primary health-care centres, which carried possible limitations such as funding, facilities and a lower doctor-to-patient ratio. The new doctors might use distinct approaches to cope with these diverse environments [18], which could differ from the approach examined during the IMDNCE. Moreover, judgements of clinical competence may yield different results when performed under dissimilar administration conditions [19]. The disparity between the standards used in the IMDNCE and the actual clinical setting makes it complicated to predict clinical performance from the IMDNCE results. The level of knowledge and skills is traditionally pertinent to doctors' preparedness for practice. However, several other factors are argued to influence the performance of doctors in the clinical setting. Doctors shape their conceptions of professionalism and clinical empathy through the complex cognitive process of role modelling [20]. Meanwhile, the quality of mentorship in the workplace is argued to influence doctors' confidence and perception of their own competencies [21]. Individual factors of novice practitioners during the transitional period, such as the ability to adjust, adapt and manage stress, predispose their clinical performance [22].
Nevertheless, the observed clinical performance of the new doctors differed significantly by the medical schools' accreditation level. New doctors who graduated from 'A'-level accredited medical schools achieved better CPI scores than those from other levels. This finding corresponds with the evaluation results from 2014-2016, in which IMDNCE participants from higher-accredited medical schools achieved better scores than those from lower-accredited medical schools [23,24]. It also resonates with a study in the USMLE context, in which candidates trained in accredited educational institutions performed better on the examination, and accreditation of educational programmes was associated with the production of more highly skilled physicians [25]. Therefore, this study is able to provide an evaluation of educational outcomes at the higher, Level 4a, of Kirkpatrick's hierarchy [26], based on observation of graduates' actual practice.

Limitations
This study showed different results compared with other studies, which suggested that national licensing examinations predicted clinical performance [27,28] or evaluated performance after many years of practice [7,29]. Hence, this study provides new information about short-term performance evaluation following a national competency examination. The instrument used could be further validated, to ensure construct and concurrent validity, before use in other settings. Additionally, this study was observational, making it hard to control possible confounding factors such as the variety of geographical areas in Indonesia. Nevertheless, this study recruited participants from a range of provinces in Indonesia, representing both urban and rural areas, which might mitigate the possibility of geographical confounding. Since cause-effect relationships are difficult to establish even in experimental research, a more rigorous methodology, such as quasi-experiments with control groups, would provide wider perspectives.

Conclusion
There was no difference between the CPI and national competency examination results, but there was a significant difference in clinical performance scores based on the graduates' medical schools' nature and accreditation level. There was no statistical correlation between the clinical performance of new medical doctors during internship and their CBT and OSCE scores in the national competency examination. This may suggest that new doctors' performance during internship is affected by more complex factors, not only their level of competence. Further cohort studies with longer observation periods are recommended to investigate the development of medical graduates' clinical practice as a follow-up to this research.