
Specialist training: workplace-based assessments impact on teaching, learning and feedback to support competency-based postgraduate programs

Abstract

Background

Workplace-based assessments (WBAs) are part of a competency-based curriculum where training progression is dependent on the achievement of defined competencies in a real-world clinical environment. There is a significant literature gap on the impact of WBAs implemented in resource constrained countries and their contextual challenges. This study aimed to examine the use, impact, and educational context of WBAs in South African medical specialist training programs drawing on perspectives from both trainees and trainers to identify educational challenges and propose effective solutions.

Methods

A mixed methods national electronic survey was conducted with specialist medical trainees and supervising trainers from all eight specialist training institutions in South Africa involving 16 specialities. The survey responses were voluntary and anonymous. The survey was closed after seven months when data saturation was achieved. Descriptive statistical analysis was performed using SPSS Version 27 (SPSS Inc, 2012, Chicago, IL) for the quantitative analysis. The thematic coding framework for the qualitative analysis was facilitated by NVivo Version 12 software.

Results

There were 108 ethnically diverse supervising trainers and 248 specialist trainee survey respondents. Across the 16 medical specialities, 45% of the respondents were using WBAs. Despite contextual resource and staffing challenges, this study found that WBAs had a positive impact to Kirkpatrick level 2 in providing actionable feedback to improve competency. WBA users gave significantly higher ratings for trainee supervision (p < 0.01), the general quality of feedback on trainee competence (p < 0.01) and the specialist training program (p = 0.03) compared with WBA non-users. They also rated the assessment of the trainee as a professional (p < 0.01), scholar (p < 0.01), communicator (p < 0.01), collaborator (p = 0.001) and leader/manager (p < 0.001) more highly, based on the AfriMEDS competency framework. Racism, sexism and favouritism were challenges that negatively affected the training programs.

Conclusion

Overall, this study reports that the use of WBAs had a substantially favourable impact on teaching, learning and feedback, and supports a competency-based approach to specialist training programs. Addressing the contextual concerns that negatively impact training; training the trainees and trainers on their relationship, roles and responsibilities; and focusing on a trainee-centred, inclusive and empowering teaching approach will help further enhance their effectiveness.


Introduction

Assessment has been an essential part of postgraduate medical education for decades but has historically focused on various types of examinations as assessments of learning, which do not necessarily reflect the practice of trainees in the real world [1]. Persistent and growing concerns for patient safety and healthcare quality have highlighted the need for better WBA [2]. When WBAs are done effectively and formatively [3], they can more authentically capture the abilities of trainees to provide safe, effective patient-centred care. WBAs are also essential in cultivating effective interprofessional teamwork [2].

WBAs are part of a competency-based system intended to identify areas for improvement in the individual trainee, based on documented evidence. The implementation of competency-based medical education began in 2014 when the Health Professions Council of South Africa, the national healthcare regulatory body, published a list of competencies for medical professionals [4]. In 2017, national conversations about mandatory WBAs were initiated, but progress has been slow and implementation has been limited [5]. However, there is renewed interest in achieving this goal from the Colleges of Medicine of South Africa, the specialist examination board, and the South African Committee of Medical Deans [6]. Specialist training programmes in South Africa are conducted in hospitals affiliated to educational institutions, with the summative examinations regulated by the different specialties within the Colleges of Medicine of South Africa. Conducting national surveys to engage with stakeholders helps inform future WBA design and implementation strategies with consideration of local contextual factors.

WBA complements the more traditional examination-based assessment of knowledge and thus affords a more holistic and comprehensive assessment of trainees’ progress [7]. It also enhances feedback-based reflection and multidisciplinary integration [8]. A systematic review of the literature investigated feedback’s role in the implementation of WBAs and reported that trainees perceived feedback as the most useful aspect of WBA and believed that a greater focus on the feedback component of WBA would help improve its effectiveness as a formative assessment tool, thus improving trainees’ performance [9].

There are about 2600 medical schools globally, with the largest numbers in resource-constrained countries such as India and Brazil, yet the medical education literature lacks representation of perspectives from these environments [10]. There is a significant gap in the literature reporting on the effectiveness and impact of workplace-based assessments implemented as part of competency-based medical education in resource-constrained countries in the Global South. A systematic review of the role of feedback in improving the effectiveness of WBA included 15 studies from the United Kingdom, Canada, the United States and New Zealand, with no studies from resource-constrained countries [9]. Another hermeneutic review on WBAs in postgraduate medical education included 30 studies but none from resource-constrained countries in the Global South [11].

This study aims to address this gap. The educational challenges and proposed solutions will also be investigated, as culture and context frame individual experiences and powerfully influence the impact of WBAs [12, 13]. The educational challenges and solutions reported by the trainees and trainers also give voice to the survey respondents’ prioritised needs, which will be used to inform future programmes for WBA implementation considering local contextual factors.

In Africa, there is a critically inequitable distribution of health workers. The continent accounts for 24% of the global disease burden but only has 3% of the world’s health workers [14, 15]. One of the key solutions identified to address this is training and capacity building [16]. South Africa is an ethnically diverse, resource constrained country. The medical specialist training occurs in public hospitals with pressurized and limited human and infrastructure resources. These public hospitals serve 84% of the population, a total of 49 million people [17]. The public sector employs only 41% of medical specialists, with the majority working in the private sector [18].

This study aimed to examine the use, impact and educational context of WBAs in South African medical specialist training programs drawing on perspectives from both trainees and trainers to identify educational challenges and propose effective solutions.

Methods

Setting

Medical specialist trainees and trainers from eight universities involved in medical specialist training across seven provinces in South Africa were included. The Humanities and Social Sciences Research Ethics Committee of the University of KwaZulu-Natal granted ethical approval (Protocol Reference HSS/0532/019D). The survey responses were voluntary and anonymous, with no incentives offered. No interviews or focus groups were conducted, to ensure the anonymity of the responses. This made it safer for honest views to be shared without fear of negative consequences. Currently in South Africa, WBAs are recommended by the Colleges of Medicine of South Africa (CMSA), the specialist examination body, but they are not a requirement for progression. However, the CMSA plans to formalise WBAs for programmatic assessment for specialist training progression in 2025. None of the South African educational institutions have mandatory WBAs across all speciality training programs.

Instrument

Online surveys were adopted for this study as they offer a practical and efficient means of engaging with healthcare professionals across the busy clinical training environments in South Africa, ensuring wide accessibility, especially in remote regions. They are also cost-effective, save time, support various question types, from open-ended to closed-ended, and ensure standardised data collection [19]. Anonymity can encourage more honest responses, and the digital nature of the survey simplifies and speeds up data analysis. In a review of WBA user perceptions, which included 31 studies globally, 21 included surveys as a research method but none included surveys conducted in a country in the Global South [20]. Most of the included studies [20] were from the UK, where WBAs are mandatory in specialist training programmes. The questionnaires included Likert-scale questions, tick-box questions, demographic data and one free-text question about WBA. Some questions covered knowledge of its purpose and specific WBA tools used within the educational context of their curriculum pathway [21, 22].

Hence, a novel 58-item survey was created for the South African context by the authors, who have backgrounds in specialist training and medical education. Topic relevance, response process validity and content clarity were assessed and validated by several subject experts in survey and healthcare research known to the authors, and expert reviews were used for improvements. The survey was then piloted by 9 trainees and 3 trainers. The results were analysed, and the individual respondents were cognitively interviewed before their feedback was used to further improve the survey. An internal validity review of the main themes was also conducted. A second pilot was conducted on the SurveyMonkey platform with 3 trainees and a trainer. The feedback from these pilots resulted in final content and wording changes, with 8 questions being removed. The final survey is a 50-item questionnaire with 10 open-text responses within it. Open-ended questions were included to gather in-depth responses and clarify the quantitative data collected. The instrument was also used to investigate surgical trainees’ and supervisors’ perspectives of WBA in South Africa, reported in another study [23]. Questions covered trainee and trainer demographics, the use of workplace-based assessments and their impact on the medical specialist program, educational gaps, contextual challenges, suggested solutions and the assessment of the African Medical Education Directions for Specialists (AfriMEDS). The survey was sent out during the height of COVID-19 and included several questions relating to this challenge for local planning purposes, but these do not form part of the focus of this paper. The electronic survey design applied logic, so trainer- or trainee-specific questions were only accessible to the respective groups; this was relevant to two questions about the experience of supervising trainers and the trainee’s specialist training year. The questions analysed in this paper are included in the appendix (Appendix 1).
The AfriMEDS document outlines a competency-based framework designed to develop a holistic healthcare practitioner [23]. It was adapted from the CanMEDS framework to ensure a healthcare professional can fulfil their many roles in practice [4]. These roles include the healthcare practitioner as a professional, communicator, collaborator, health advocate and scholar [24]. Using a 5-point Likert scale, the respondents rated the quality of the different components of the medical specialist training. This article focuses on the medical trainees’ and trainers’ responses in relation to workplace-based assessments and the educational context.

Data collection and sampling

Specialist medical trainees and supervising trainers responded to a national electronic survey between March 2020 and September 2020. For 80% power and a 95% confidence interval with a 5% margin of error, the sample needed to include 292 respondents in total. To encourage honesty without fear of repercussion, survey responses were voluntary and anonymous, and no incentives were offered. Several strategies were used to optimise the response rate. University stakeholders of medical specialist training programs contacted potential respondents electronically. Snowball sampling was also used to disseminate the survey via faculty and trainees known to the authors. The Colleges of Medicine of South Africa, the national examination authority for specialists, sent emails with the survey link to trainees and trainers. The South African Medical Association for medical practitioners also disseminated the survey to the relevant members of their organization. Follow-up emails and reminders were sent to encourage participation. Respondents also used social media platforms such as WhatsApp hospital work groups to disseminate the survey. The survey was closed after seven months, when data saturation had been achieved.
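For readers interested in how a margin-of-error sample-size target such as the one above is typically derived, the Python sketch below applies the standard Cochran formula with a finite-population correction. Note that this is an illustration only: the population figure of 1200 is purely hypothetical (the study does not report the size of the national trainee and trainer pool), and the power calculation is omitted; the sketch simply shows how a target near 292 can arise from a 95% confidence interval and a 5% margin of error.

```python
import math

def cochran_n(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> float:
    """Cochran's sample-size formula for a large population:
    z for the confidence level, maximum-variance proportion p,
    margin of error e."""
    return (z ** 2) * p * (1 - p) / (e ** 2)

def finite_correction(n0: float, population: int) -> int:
    """Adjust the large-population estimate for a finite population."""
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n0 = cochran_n()
print(round(n0, 2))                    # 384.16 before any correction
# With a hypothetical eligible population of 1200, the corrected
# target comes out close to the study's stated 292:
print(finite_correction(n0, 1200))     # 292
```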

Data analysis

Descriptive statistical analysis was performed using SPSS Version 27 (SPSS Inc, 2012, Chicago, IL) for the quantitative analysis. The reliability of the instrument was measured using the quantitative Likert items to calculate Cronbach’s alpha, the corrected item-total correlation and the average inter-item correlation. Using the Chi-square test, the training program ratings were compared between the trainer-trainee WBA user group and the trainer-trainee group not using WBA (Table 2). The trainer and trainee responses were also compared using the Chi-square test, irrespective of WBA use. The Mann-Whitney U test was used to compare the ratings of the training programme components between WBA users and non-users (Table 3), and these rankings were combined (Table 4) for an overall comparison. Differential ratings of the assessment of the trainee using the AfriMEDS competency framework were compared between WBA users and non-users, and between trainees and trainers irrespective of WBA use (Table 5). The open-text responses from all trainers and trainees were thematically analysed [25]. The two authors conducted the initial coding independently. Inductive coding was used for the qualitative analysis: codes and themes were identified directly from the data itself, without any pre-established framework, theory or set of categories. Data analysis followed a six-phase process of thematic analysis [16]. This included: (1) immersing in the data via multiple readings; (2) creating preliminary codes rooted in the data; (3) categorising the codes into emerging themes; (4) assessing and fine-tuning these themes to ensure clarity and uniqueness; (5) clearly defining and labelling each theme; and (6) compiling a comprehensive report that weaves the analysis with pertinent data snippets and related literature. This cyclical process emphasises consistent reflection and the possibility of revisiting prior stages to guarantee a thorough and accurate interpretation.
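As a rough illustration of the two quantitative comparisons described above, the sketch below applies a Chi-square test and a Mann-Whitney U test to simulated 5-point Likert ratings. The data, group sizes and effect direction are invented for demonstration, and the scipy-based code is only a sketch of the analytic approach, not the study’s SPSS workflow.

```python
# Hypothetical illustration of the two tests described above; the
# ratings below are simulated, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# 1-5 Likert ratings of a training component for two invented groups,
# deliberately skewed so the groups differ
wba_users = rng.integers(3, 6, size=120)      # ratings in {3, 4, 5}
wba_nonusers = rng.integers(1, 5, size=140)   # ratings in {1, 2, 3, 4}

# Chi-square test on the contingency table of rating frequencies
table = np.array([[np.sum(wba_users == k) for k in range(1, 6)],
                  [np.sum(wba_nonusers == k) for k in range(1, 6)]])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)

# Mann-Whitney U test comparing the rank distributions of the ratings
u, p_mwu = stats.mannwhitneyu(wba_users, wba_nonusers,
                              alternative="two-sided")
print(f"chi2 p={p_chi:.4g}, MWU p={p_mwu:.4g}")
```

With groups this different and this large, both tests return very small p-values; the point is only to show the shape of the comparison (ratings cross-tabulated for Chi-square, raw ratings ranked for Mann-Whitney).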
There was good agreement for the development of an initial coding framework using NVivo Version 12 software. Data coding was complete after codes were iteratively compared and refined. All respondents’ comments were included within the framework with a high degree of commonality, implying that data saturation had been reached. Examples of the themes and sub-themes of the qualitative analysis are included in the results section. Thematic analysis is an effective method to analyse qualitative data, but it is influenced by those undertaking the analysis. SB is a consultant ophthalmologist and actively participates in specialist training initiatives across various programs in South Africa and the United Kingdom as an experienced clinician educator and supervisor with significant expertise in this area. She is a South African of Indian ethnicity and brings her own lived experiences, resonating with many themes that had unfolded during her personal training journey. On the other hand, VS is an established medical educationalist with a research background in medical education, assessment, and feedback, specifically within the South African context. The research team was enriched by a multitude of viewpoints. To enhance the credibility of our data interpretation, we took the additional step of presenting our findings at various national academic gatherings for validation and feedback.

Results

There were 108 supervising trainers and 248 specialist trainees from 8 South African universities across 16 medical specialities in 7 provinces (n = 356). Specialist trainees from all years were represented in the sample, including those in 1st year (13%), 2nd year (13%), 3rd year (12%), 4th year (16%) and beyond 4th year (14%). Most respondents were South African (90%). Other nationalities included Zimbabwe (2%), Nigeria (1.5%), Botswana (1.5%), Mauritius (0.5%), Tanzania (0.5%), Chad (0.5%), Sudan (0.5%), United Arab Emirates (0.5%), Mozambique (0.5%), Zambia (0.5%), Croatia (0.5%), Namibia (0.5%) and the United Kingdom (0.5%). Most respondents were female (58%), with 40% male and 2% non-binary. There were 32% Black respondents, 4% Coloured respondents, 25% Indian respondents and 40% White respondents. The respondents from each of the 16 medical specialities are described in Table 1; the largest groupings were from Paediatrics (16%), Anaesthetics (15%), Internal Medicine (14%), Pathology (12%) and Psychiatry (10%).

Table 1 Medical specialities

Workplace-based assessments (WBAs)

WBA use

Across the medical specialities, 45% of the respondents were using WBAs to enhance their training program. Seven of the eight educational institutions had both WBA users and non-users in different specialist training programs.

As illustrated in Table 1, most respondents using WBAs were from Paediatrics (14%), Psychiatry (14%), Family Medicine (13%) and Anaesthetics (12%). Of the 45% of respondents using WBAs, only 14% were involved in the instrument development and most (70%) did not receive training or briefing on the use and purpose of the tools before implementation. The frequency of tool use ranged from daily (6%), weekly (13%), monthly (25%), 2-monthly (7%), 3-monthly (26%) and 6-monthly (22%) to end of rotation (1%).

A variety of WBA tools were used, but the most frequently reported were the Mini-CEX and DOPS. Other tools included critical reflection tools, portfolios, a video consultation tool, OSCE assessments, competency assessments, the Royal College Workplace-Based Assessment Implementation Guide, the Royal Australian and New Zealand College of Psychiatrists 360-degree evaluation, a 6-weekly self-designed WBA tool based on the Canadian model, a Communication Skills Observation Tool and case-based discussions.

Rated effectiveness of WBA

Rating the effectiveness of WBA in enhancing feedback, 58% reported that it partially enhanced feedback and 35% reported that it sufficiently or effectively enhanced feedback. This contrasted with 7% who reported that it did not enhance feedback at all.

Of the respondents, 59% reported that the tool partially provided actionable steps to improve competency; 31% reported it sufficiently or effectively provided actionable steps to improve competency while 10% reported it did not at all provide actionable steps to improve competency.

Willingness to use WBA and preferences

Of the total respondents, the majority were willing to use WBA in the training program (96%); attend training on the use of WBA (94%) and how to give and receive effective feedback (92%), which is a critical component of WBAs. The majority also preferred computer-based WBA as the medium of delivery (70%). The preferred frequency of WBA use was monthly (52%) followed by weekly (20%).

Formative assessments

Knowledge of formative assessments was poor, with the majority associating the term with a more summative definition (69%), while 23% were not familiar with it. Most (64%) agreed that WBAs should be an essential part of the medical speciality training program, while 32% were not familiar enough with them to decide. Among WBA users, 80% associated the intention of WBAs with a more summative definition, while only 8% were not familiar with formative assessments.

Reliability of the instrument

Cronbach’s alpha (α) for the quantitative Likert items was 0.905, all corrected item-total correlations were positive, and the average inter-item correlation was 0.463. Overall, these Likert-scale items showed high internal consistency and can be considered reliable measures. Tables 2, 3, 4 and 5 are based on the analysis of these Likert-scale item responses.
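For readers who wish to reproduce such reliability statistics on their own survey data, the sketch below computes Cronbach’s alpha, the corrected item-total correlations and the average inter-item correlation for a simulated item matrix. The data are invented (a shared factor plus noise), and the 10-item, 200-respondent shape is an arbitrary assumption, not the study’s instrument.

```python
# Illustrative reliability statistics on simulated (not study) data.
import numpy as np

rng = np.random.default_rng(42)
base = rng.normal(0.0, 1.0, size=(200, 1))            # shared factor
items = base + rng.normal(0.0, 1.0, size=(200, 10))   # 10 correlated items

def cronbach_alpha(x: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def corrected_item_total(x: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1]
                     for j in range(x.shape[1])])

alpha = cronbach_alpha(items)
item_totals = corrected_item_total(items)
# Average inter-item correlation: mean of the upper triangle of the
# item correlation matrix
corr = np.corrcoef(items.T)
avg_inter_item = corr[np.triu_indices_from(corr, k=1)].mean()
```

Because the simulated items share a common factor, alpha comes out high and all corrected item-total correlations are positive, mirroring the pattern of results reported above without reproducing the exact figures.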

Table 2 Rating of the different components of the medical specialist training
Table 3 Comparison of specialist training program ratings between the trainers and trainees irrespective of WBA use
Table 4 Combined ranking of different training program components between WBA users and WBA non-users p < 0.001
Table 5 Differential rating of the assessment of the trainee using the AfriMEDs competency framework

Table 2 indicates that the WBA users had a higher rating for the medical specialist training program (p = 0.03), trainee supervision (p < 0.01) and the general quality of feedback given to trainees on their competence (p < 0.01) compared to WBA non-users.

The rating differences between the consultant trainers and trainees on the AfriMEDS competency framework were not significant (Table 5). In contrast, the WBA users rated the specialist program’s assessment of each of the roles significantly higher than WBA non-users across specialities (Table 5).

Reported gaps in the practical training of specialist trainees in the workplace

Most of the supervising trainers (74%) and trainees (80%) reported educational gaps in the medical specialist training programs. In the clinical category, 83% of trainers and 71% of trainees reported deficits. In the lab-based category, 21% of supervisors and 41% of trainees reported gaps.

There were several clinical and lab-based gaps reported, with a recurrent theme of lack of supervision within these environments, as illustrated in Table 6. This was linked to a high workload for trainers and trainees, with service delivery pressures and staff shortages negatively impacting teaching and learning. Some of these impacts included decreased time for training, supervising, reflecting, assessments and feedback (Table 6).

Table 6 Training gaps reported in the specialist training program

Perceptions of important area/s of the training program requiring the structured monitoring of competence

The area of specialist training most requiring structured monitoring of registrar competence was reported as clinical competence for patient-facing specialities and lab-based practical skills for the relevant specialities. The second most frequently cited component of registrar competence critical to assess was interpersonal skills, followed by decision making and critical thinking. Regular feedback mechanisms reporting progress during training, and the use of WBA to facilitate this, were also reported in the text responses (Table 7). Other aspects reported included leadership, teamwork, professionalism, ethical practice, effective communication, problem-solving abilities, safe patient management and the supervision of junior trainees while clinically interacting with patients.

Table 7 Areas of the training program that respondents wanted to change or improve

The areas within the training program highlighted for improvement included a more trainee-centred approach to teaching, with increased staff to facilitate protected teaching and research time (Table 7).

The solutions described by the trainees and trainers to the various reported challenges included protected time for teaching and supervision; more trainee responsibility for their own learning; and timely, continuous formative workplace-based assessments. They also wanted more trainee-centred teaching by motivated and empowering trainers, with fair standards set and goal-directed feedback. Faculty development and assessing the competency of supervisors to teach and train were also among the solutions proposed (Table 8).

Table 8 Solutions to the challenges in the specialist training program

Discussion

Use and impact of workplace-based assessment

In this study, we explored the trainers’ and trainees’ perspectives regarding the use and impact of WBAs on teaching and learning in medical specialist training. It was concerning that 69% of the respondents in our study associated the purpose of WBA with a more summative definition, while 23% were not familiar with formative assessment. WBAs are most effective when their purpose of encouraging learning is clear to both the trainees and trainers. In a UK-based qualitative study on WBA focusing on the trainee’s perspective, the greatest impact on learning through WBA was the formative dialogue [26]. When WBAs are perceived to be used summatively, there is negative sentiment and reduced educational value [27]. Poor understanding of the purpose of WBAs has also been shown to negatively impact their educational potential [20]. Thus, training to clarify the purpose of WBA is critical.

There is good evidence that feedback from well-implemented WBAs has a positive effect on clinical practice [6]. Feedback gives trainees insight into their actions and informs them about the progress made towards their personal objectives [28]. Most respondents in our study who were using WBA reported that it did improve feedback and that it partially or sufficiently provided actionable steps to improve competency. The Kirkpatrick model of evaluation includes four levels of outcomes for a training program: reaction (level 1), learning (level 2), behaviour (level 3), and results (level 4) [29,30,31,32,33,34]. Consistent with other studies, this study found that high-quality feedback during the WBA teaching and learning encounter enhances its effectiveness in achieving its purpose of encouraging learning to develop competency to Kirkpatrick level 2 [35].

Compared to the trainee-trainer group not using WBA, the WBA users had a significantly higher rating for the medical specialist training program (p = 0.03), trainee supervision (p < 0.01) and the general quality of feedback given to trainees on their competence (p < 0.01). This is a novel, encouraging finding, as the indirect positive impact of WBA on the medical specialist training program has not previously been reported. In contrast, a literature review of trainees’ and trainers’ perceptions of WBAs, mostly in the English-speaking Global North, has shown widespread negativity towards WBAs [20].

Rating comparison of the specialist training component between the trainees and trainers

The consultant trainers rated their supervision and the quality of feedback given to trainees on their competence significantly higher than the trainees did (p < 0.001) (Table 3). This lack of insight into the trainers’ training skills could be related to a lack of supervisor training, as alluded to by an anaesthetic consultant trainer: “no formal training of supervisors, most do like the person who trained you and try to improve on your own, without any assessment.” A nuclear medicine consultant also recognised that “Giving constructive feedback” was a gap that needed improvement.

Some trainees called for a confidential means of assessing the supervisory abilities of the consultant trainers: “A method to address the competencies of the consultants as supervisors in a confidential manner. There is always going to be a power dynamic between the consultant and [trainees] and that needs to be acknowledged – with said power dynamic it is sometimes very hard to address situations where the consultants require feedback about their ability to supervise,” fourth-year Psychiatry trainee. Similarly, other studies have reported that inadequate training of trainers negatively impacts the effectiveness of WBA [20]. In addition, for trainees to engage meaningfully in WBAs, they must trust their trainer assessor, but the trainee’s dependent position complicates trust [36]. Hence, there is a need to understand how trust and power influence WBAs, and to implement strategies to address this to ensure that WBAs are effective learning opportunities [36].

WBA also had a very positive, statistically significant impact on the assessment of the AfriMEDS core competencies (Table 5) and helps to build the different roles into the training programme in the specialist trainee’s practice environment. This is a novel finding and an indirect positive result of WBA use. It implies that WBA tools could be used to assess trainees more holistically, including competencies such as communication, collaboration, leadership and management, some of which were gaps identified by both the trainees and trainers in the different specialist training programs [37]. The use of WBAs could also help cultivate the technical and relational competencies critical to the development of a holistic healthcare practitioner.

The implications of a resource constrained context for WBA implementation

The additional challenges in South Africa relating to resources, workload and supervision impact on the use and effectiveness of WBA [38]. Long working hours with heavy workloads have an adverse impact on the wellbeing of healthcare staff [39]. A sense of emotional and physical overwhelm related to this incredibly pressurised work environment is also conveyed in the qualitative data. However, despite the many challenges reported in this context, including service delivery pressures and a lack of resources [40] compromising teaching and learning encounters, our study found that the group using WBAs had a significantly higher rating for the specialist training program overall. This implies that WBAs may be even more important for developing competencies within this context [5]. WBA use positively impacting teaching and learning in medical specialist training programs in resource-constrained contexts is a novel finding.

Trainee-trainer relationship

“There is evidence that the quality of the trainee–trainer relationship is a better predictor of successful training outcomes than supervisory skills or helpfulness.” [41]. Discrimination in medicine has been associated with negative impacts in the workplace and on the person’s personal life, including decreased productivity, increased alcohol use, depression, attrition and suicidality among physicians [42, 43]. The racism, sexism and favouritism reported in this survey are alarming and need to be addressed. South Africa has a history steeped in racial discrimination and colonization, with medical educational institutions upholding policies that marginalized people of colour during the apartheid regime [44]. The legacy of these power struggles remains and historically and culturally contextualizes the relationship dysfunction. In democratic post-apartheid South Africa, there have been several studies reporting on racism in specialist training [45] and the dysfunctional trainee-trainer relationship existing within a wider institutional and cultural context [46].

The issue of racism in medical education is prevalent in many countries. In the UK, doctors from ethnic minority backgrounds are less likely than white doctors to be considered suitable for appointment to specialty training jobs [47]. In the US, residents from minority ethnicities report a daily barrage of microaggressions and bias. They also reported difficulties negotiating personal and professional identity while seen as “other” [48]. This study has revealed significant challenges reported in the trainee-trainer relationship, which negatively impacts on the teaching and learning encounter and the effectiveness of WBA. Addressing these challenges during future training and implementation phases for WBA within this context will help enhance its effectiveness and address educational barriers related to the relational dynamic and dysfunction.

There is discordance between trainee and trainer perspectives regarding the lack of trainer supervision. The negative framing of the trainer's teaching role is captured by the disgruntled expectation of having to “spoonfeed adults!”, which indirectly supports a trainee's description of the “bad attitude of certain trainers” when trainees are “not coping”. Faculty development, critical reflection on supervisors' teaching skills and training, and clarification of trainer and trainee roles and responsibilities are needed to address this discordance. The resource limitations, time constraints and often overburdened clinical responsibilities of supervisors must also be taken into consideration, as these affect their role and approach to teaching. Efficient and nurturing techniques of bidirectional verbal feedback, together with effective and efficient digital WBA tools, will help improve the educational environment within its constraints.

Greater recognition is needed of the challenges these trainees and trainers face in teaching and learning within a difficult work environment. Implementing trainee-trainer-centric solutions will help improve training and the effectiveness of WBAs within these resource-constrained environments.

Recommendations

For the successful implementation of WBAs, a shift from a trainer-centred, critical approach to a trainee-focused, empowering, relationship-centred teaching style will positively affect the learning encounter. Developing participatory training for supervisors and learners, covering their roles and responsibilities, the purpose of WBAs and feedback, and how to give and receive effective feedback, will also help harness the full learning potential of formative assessments.

The quality of the trainee-trainer relationship affects the effectiveness of the teaching and learning encounter, so a relationship-centred approach is needed [11, 23]. Trust [36] and respect are vital to strong trainee-trainer relationships. A mutually trusting, respectful and collaborative focus will help address the power dynamics and their negative impact. Addressing racism will require ongoing training, monitoring and evaluation by a diverse team to create a relational, psychologically safe [49, 50] and inclusive environment for teaching and learning, especially for marginalized trainees.

Limitations

While surveys offering anonymity can elicit more truthful answers, they also pose a challenge for providing immediate support in cases of distressing responses. However, the researchers' contact information was provided so that respondents could reach support when needed.

Surveys are also limited by the narrow scope of the response allowed for each question, which may prevent respondents from expressing their full views. Ten free-text sections allowed respondents to report their views in greater depth and detail across the range of topics covered. Although interviews may have yielded richer data, the anonymity of the survey protected respondents from the perceived negative effects of “observer bias”. The survey length had the potential to cause survey fatigue, which may have affected some responses.

Surveys investigate perceptions and experiences; future longitudinal observational studies are needed to provide further insight into these findings. Because snowball sampling was used, a response rate could not be calculated, although this method enabled a richer and wider network of responses; it may also introduce sampling bias. However, several national networks were used, and the respondents represent trainees and trainers from all educational institutions training medical specialists in South Africa. The demographics of all medical specialist trainees and trainers across South Africa are not published, so it is not known whether this respondent group is representative, although it was ethnically diverse and the study was appropriately powered.

Implications

Despite the service delivery pressures, resource challenges, reported staff shortages and relationship challenges within this resource-constrained environment in the global South, workplace-based assessments had a significantly favourable impact on specialist training programs across South Africa. Their use has added meaningful value to teaching, learning, supervision and feedback in the specialist training programs. This study suggests that WBA use may be even more important for supporting teaching and learning in resource-constrained environments. The challenges associated with the trainee-trainer relationship need to be addressed to further enhance WBA effectiveness, and the trainee and trainer recommendations to address the gaps must be duly considered and implemented.

Conclusion

Almost half of the respondents were using workplace-based assessments. The WBA user group had a significantly higher rating for trainee supervision (p < 0.01), the general quality of feedback on trainee competence (p < 0.01) and the specialist training program (p = 0.03). WBA users also had a higher rating for the assessment of the trainee as a professional (p < 0.01), scholar (p < 0.01), communicator (p < 0.01), collaborator (p = 0.001) and leader/manager (p < 0.001). The use of WBAs also had a positive impact to Kirkpatrick level 2, providing actionable feedback to improve competency. Overall, WBAs had a substantially favourable effect on teaching, learning, feedback and support of a competency-based medical education specialist training program despite significant contextual challenges. Future studies and interventions are needed to address the concerns that negatively affect training within a resource-constrained environment. Training trainees and trainers on their relationship, roles and responsibilities is also vital for effective teaching and learning encounters. A relationship-centred, trainee-focused, inclusive and empowering approach will help further enhance WBAs' effectiveness.

Availability of data and materials

Most of the survey data were analysed and included in this article. The datasets used and/or analysed during this study are available from the corresponding author on reasonable request.

References

  1. Irby DM, Cooke M, O’Brien BC. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010;85(2):220–7. https://doi.org/10.1097/ACM.0b013e3181c88449.

  2. Holmboe ES. Work-based assessment and co-production in postgraduate medical training. GMS J Med Educ. 2017;34(5):Doc58. https://doi.org/10.3205/zma001135.

  3. Tooke J. Aspiring to excellence: findings and final recommendations of the independent inquiry into Modernising Medical Careers. London: Aldridge Press; 2008.

  4. HPCSA. Core competencies for undergraduate students in clinical associate, dentistry and medical teaching and learning programmes in South Africa (AfriMEDS). 2014 [cited 2023 May]. Available from: http://www.hpcsa-blogs.co.za/wp-content/uploads/2017/04/MDB-Core-Competencies-ENGLISH-FINAL-2014.pdf

  5. Sathekge MM. Work-based assessment: a critical element of specialist medical training. S Afr Med J. 2017;107(9):12059. https://doi.org/10.7196/SAMJ.2017.v107i9.12655.

  6. Nel D, Burch V, Adam S, Ras T, Mawela D, Buch E, Green-Thompson L. The introduction of competency-based medical education for postgraduate training in South Africa. S Afr Med J. 2022;112(9):742–3.

  7. Menon S, Winston M, Sullivan G. Workplace-based assessment: attitudes and perceptions among consultant trainers and comparison with those of trainees. Psychiatrist. 2012;36(1):16–24. https://doi.org/10.1192/pb.bp.110.032110.

  8. Prakash J, Chatterjee K, Srivastava K, Chauhan VS, Sharma R. Workplace based assessment: a review of available tools and their relevance. Ind Psychiatry J. 2020;29(2):200–4. https://doi.org/10.4103/ipj.ipj_225_20.

  9. Saedon H, Salleh S, Balakrishnan A, Imray CH, Saedon M. The role of feedback in improving the effectiveness of workplace-based assessments: a systematic review. BMC Med Educ. 2012;12:25.

  10. Duvivier RJ, Boulet JR, Opalek A, van Zanten M, Norcini J. Overview of the world’s medical schools: an update. Med Educ. 2014;48(9):860–9.

  11. Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessments in postgraduate medical education: a hermeneutic review. Med Educ. 2020;54(11):981–92.

  12. Martin L, Blissett S, Johnston B, Tsang M, Gauthier S, Ahmed Z, Sibbald M. How workplace-based assessments guide learning in postgraduate education: a scoping review. Med Educ. 2022.

  13. Tan J, Tengah C, Chong VH, Liew A, Naing L. Workplace based assessment in an Asian context: trainees’ and trainers’ perception of validity, reliability, feasibility, acceptability, and educational impact. J Biomed Educ. 2015;2015:615169. https://doi.org/10.1155/2015/615169.

  14. World Bank. World development indicators 2012. Available from: worldbank.org/data-catalog/world-development-indicators/wdi-2012 (accessed 30 June 2012).

  15. WHO. The world health report 2006: working together for health. Geneva: World Health Organization; 2006.

  16. Oleribe OO, Momoh J, Uzochukwu BS, et al. Identifying key challenges facing healthcare systems in Africa and potential solutions. Int J Gen Med. 2019;12:395–403. https://doi.org/10.2147/IJGM.S223882.

  17. Maseko L, Harris B. People-centeredness in health system reform: public perceptions of private and public hospitals in South Africa. Afr J Occup Ther. 2018;48(1):22–7.

  18. ECONEX. The South African private healthcare sector: role and contribution to the economy. Pretoria, South Africa; November 2013 [cited 2016 Feb 1]. Available from: https://econex.co.za/wp-content/uploads/2015/03/econex_researchnote_32.pdf

  19. Tenforde AS, Sainani KL, Fredericson M. Electronic web-based surveys: an effective and emerging tool in research. PM&R. 2010;2(4):307–9.

  20. Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Adv Health Sci Educ Theory Pract. 2016;21(2):455–73. https://doi.org/10.1007/s10459-015-9614-0.

  21. Bindal T, Wall D, Goodyear HM. Trainee doctors’ views on workplace-based assessments: are they just a tick box exercise? Med Teach. 2011;33(11):919–27.

  22. Bindal N, Goodyear H, Bindal T, Wall D. DOPS assessment: a study to evaluate the experience and opinions of trainees and assessors. Med Teach. 2013;35(6):e1230–4.

  23. Baboolal SO, Singaram VS. The use, effectiveness, and impact of workplace-based assessments on teaching, supervision and feedback across surgical specialties. J Surg Educ. 2023;80(8):1158–71.

  24. Royal College of Physicians and Surgeons of Canada. CanMEDS framework [homepage on the Internet]. [cited 2023 May]. Available from: http://www.royalcollege.ca/rcsite/canmeds/canmeds-framework-e

  25. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

  26. Bussey M, Griffiths G. The Feedback FREND: an aid to a more formative WBA dialogue. Bull R Coll Surg Engl. 2017;99(6):231–4.

  27. Shalhoub J, Marshall DC, Ippolito K. Perspectives on procedure-based assessments: a thematic analysis of semi-structured interviews with 10 UK surgical trainees. BMJ Open. 2017;7:e013417.

  28. Hattie JA. Influences on student learning. Auckland: University of Auckland; 1999. https://cdn.auckland.ac.nz/assets/education/about/research/documents/influences-on-student-learning.pdf.

  29. Kirkpatrick DL, Kirkpatrick JD. Kirkpatrick’s four levels of training evaluation. Alexandria, VA: ATD Press; 2016.

  30. Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. Part 1: from idea to data coding. BEME Guide No. 13. Med Teach. 2010;32(1):3–15.

  31. Kirkpatrick DL. Evaluating training programs: the four levels. 1st ed. San Francisco, CA: Berrett-Koehler; 1996.

  32. Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs: the four levels. 3rd ed. San Francisco, CA: Berrett-Koehler; 2006.

  33. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. https://doi.org/10.1080/01421590500046924.

  34. Kirkpatrick DL. Techniques for evaluating training programs. Am Soc Train Direct. 1959;13:3–9.

  35. Aryal K, Hamed M, Currow C. The usefulness of work-based assessments in higher surgical training: a systematic review. Int J Surg. 2021;94:106127. https://doi.org/10.1016/j.ijsu.2021.106127.

  36. Castanelli DJ, Weller JM, Molloy E, Bearman M. Trust, power and learning in workplace-based assessment: the trainee perspective. Med Educ. 2022;56(3):280–91. https://doi.org/10.1111/medu.14631.

  37. Kurzweil AM, Lewis A, Pleninger P, et al. Education research: teaching and assessing communication and professionalism in neurology residency with simulation. Neurology. 2020;94(5):229–32. https://doi.org/10.1212/WNL.0000000000008895.

  38. Naidoo KL, Van Wyk J, Adhikari M. Comparing international and South African work-based assessment of medical interns’ practice. Afr J Health Prof Educ. 2018;10(1):44–9.

  39. Al-Momani M. Improving nurse retention in Jordanian public hospitals. Adv Pract Nurs eJournal. 2008;8(4):1–6.

  40. Manyisa ZM, Van Aswegen EJ. Factors affecting working conditions in public hospitals: a literature review. Int J Afr Nurs Sci. 2017;6:28–38. https://doi.org/10.1016/j.ijans.2017.02.002.

  41. Goodyear RK, Bernard JM. Clinical supervision: lessons from the literature. Couns Educ Superv. 1998;38(1):6–22.

  42. Fnais N, Soobiah C, Chen MH, et al. Harassment and discrimination in medical training: a systematic review and meta-analysis. Acad Med. 2014;89(5):817–27.

  43. Hu YY, Ellis RJ, Hewitt DB, et al. Discrimination, abuse, harassment, and burnout in surgical residency training. N Engl J Med. 2019;381(18):1741–52.

  44. Digby A. Black doctors and discrimination under South Africa’s apartheid regime. Med Hist. 2013;57(2):269–90. https://doi.org/10.1017/mdh.2012.106.

  45. Thackwell N, Swartz L, Dlamini S, Phahladira L, Muloiwa R, Chiliza B. Race trouble: experiences of Black medical specialist trainees in South Africa. BMC Int Health Hum Rights. 2016;16(1):31. https://doi.org/10.1186/s12914-016-0108-9.

  46. Bagwandeen CI, Singaram VS. Feedback as a means to improve clinical competencies: registrars’ perceptions of the quality of feedback provided by consultants in an academic hospital setting. Afr J Health Prof Educ. 2016;8(1):117–20.

  47. Iacobucci G. Specialty training: ethnic minority doctors’ reduced chance of being appointed is “unacceptable.” BMJ. 2020;368:m479. https://doi.org/10.1136/bmj.m479.

  48. Osseo-Asare A, Balasuriya L, Huot SJ, et al. Minority resident physicians’ views on the role of race/ethnicity in their training experiences in the workplace. JAMA Netw Open. 2018;1(5).

  49. Polanco Walters F, Anyane-Yeboa A, Landry AM. The not-so-silent killer missing in medical-training curricula: racism. Nat Med. 2020;26:1160–1. https://doi.org/10.1038/s41591-020-0984-3.

  50. Sanders M, Fiscella K. Anti-racism training using the biopsychosocial model: Frederick Douglas’ earthquake, whirlwind, storm and fire. Front Psychiatry. 2021;12:711966.


Acknowledgements

We acknowledge the valuable contributions of the medical specialist trainees and trainers who participated in the survey. We also thank the University of KwaZulu-Natal School of Public Health statistical team, and Anand Pillay for his assistance with parts of the quantitative analysis.

Funding

This work was supported by a Discovery Foundation Academic Fellowship Award. The funding organization was not involved in the study.

Author information

Contributions

Author SB was involved as the principal investigator in all phases of the research, including the research conceptualization, design, ethics application, data collection, data analysis and manuscript preparation. Author VS collaborated in the research conceptualization, design, ethics application and data collection. She also participated in the data analysis and supervised the manuscript preparation.

Corresponding author

Correspondence to Sandika O. Baboolal.

Ethics declarations

Ethics approval and consent to participate

The Humanities and Social Sciences Research Ethics Committee of the University of KwaZulu-Natal granted ethical approval (Protocol Reference HSS/0532/019D). Informed consent was obtained from all subjects in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Baboolal, S.O., Singaram, V.S. Specialist training: workplace-based assessments impact on teaching, learning and feedback to support competency-based postgraduate programs. BMC Med Educ 23, 941 (2023). https://doi.org/10.1186/s12909-023-04922-w
