Dispersed assessment: A novel approach to enhancing student engagement during and beyond Covid-19

The Covid-19 pandemic and the subsequent need to shift between face-to-face, online, and blended learning caused considerable disruption to student engagement and the workload of academics. Drawing on a Brand Management course (final year undergraduate) in a UK university, this study discusses the students' perception of, and reflection on, a novel method of assessment that we have termed dispersed assessment. This is defined as multiple credit-bearing tasks that are spread throughout the teaching period, are related to taught learning materials and sessions, assist in completing a related final assessment task, and are evidenced in the submission of the final task to limit additional burden on students and markers. Based on scholarly work on assessment and student engagement, we used Leximancer-assisted thematic analysis to examine the data from three focus groups and students’ written reflections (n = 99). Findings show that dispersed assessment significantly enhanced student engagement, without overburdening students or increasing the marking workload. This novel method of assessment helped nurture learning communities, increase motivation, and reduce procrastination. Regarding policy, we recommend the implementation of dispersed assessment to encourage active and continuous student engagement and improve student experience. Practical examples are offered for the implementation of dispersed assessment using VLE technologies.


Introduction
Across the world, the pandemic strongly affected the assessment of student learning (Wise et al., 2022). Both students and educators faced considerable disruption to their learning and teaching experience throughout the Covid-19 pandemic, lockdowns, and the need for quick and multiple transitions between face-to-face, online, and blended learning (Baber, 2021; Jung et al., 2021; Leigh & Edwards, 2021; Mali & Lim, 2021; Sangster et al., 2020). While online courses have seen an increase in student numbers (Heilporn & Lakhal, 2021), they have previously been associated with lower levels of student satisfaction (Maqableh & Alia, 2021) and higher rates of student dropout (Phirangee & Malec, 2017), and can create additional challenges when recurring academic routines are no longer present (Lund Dean & Forray, 2020). Prior to these significant disruptions, student engagement had been identified as a growing concern within higher education (Bond et al., 2020), as students often procrastinate with their work, which can lead to lower marks (Cormack et al., 2020). Compounded by the psychological impact of the pandemic, these disruptions and the growing concern regarding student engagement and procrastination reignite the need to better understand how assessment methods can enhance student engagement. However, while the shift to online and blended learning during the pandemic provided an opportunity to adopt new technologies (Mali & Lim, 2021), not least in assessment, the move to online learning also increased the workload of many academics already struggling to keep on top of their work commitments (Jung et al., 2021; Rapanta et al., 2020). This suggests that more innovative assessment methods are needed to increase student engagement without overwhelming students or increasing the associated marking workload, and that insights from their use are needed for the post-pandemic world.
To that end, amendments to a final year undergraduate Brand Management module at a UK university were implemented. Consequently, we developed the method of "dispersed assessment", defined as multiple credit-bearing tasks that are spread throughout the teaching period, are related to taught learning materials and sessions, assist in completing a related final assessment task, and are evidenced in the submission of the final task to limit additional burden on students and markers (italics are the authors' emphasis).
Assessment is a major research area within higher education and in recent years, there has been growing interest in its role in developing the learner rather than in simply judging student knowledge (Sambell et al., 2012; Vos, 2015). There is, however, limited research investigating the impact of innovative assessment on student experience in higher education (Bevitt, 2015). The dramatic shift in education due to Covid-19 and the introduction of this novel assessment method provides a rare opportunity to gain insight into student experiences, which is suggested to be critical in developing novel approaches to learning (Lomer & Palmer, 2021; Mali & Lim, 2021). This study therefore aims to fill this gap by exploring students' experiences of the novel assessment method of dispersed assessment and its role in shaping their engagement. In this paper, we first identify and critically review key literature related to student engagement and assessment. Then, we conduct Leximancer-assisted thematic analysis of the data from three focus groups and written reflections (n = 99) from students registered on the module. Finally, we make policy recommendations for how dispersed assessment can be used more broadly within education to increase student engagement.

Conceptualising assessment
Broadly speaking, there are three different approaches to understanding assessment, as explained by Yan and Boud (2021):
• Assessment-of-learning: primarily judgements of what a student has finally achieved
• Assessment-for-learning: primarily judgements to aid students on their path towards meeting learning outcomes
• Assessment-as-learning: primarily assessment that has value as a learning task in its own right

A typical example of assessment-of-learning is the use of summative assessment, while assessment-for-learning more closely aligns with formative assessment. Some studies show assessment-for-learning is a broader category that includes assessment-as-learning (Earl, 2013), while others highlight that teachers take the lead role in assessment-for-learning, but students play a more active role in assessment-as-learning (Berry, 2008). Indeed, in assessment-as-learning, students use feedback to make adjustments and adaptations, while teachers need to help students develop, practise, and become comfortable with critical analysis and reflection on their learning (Earl, 2006).
Various methods of assessment-as-learning that assess multiple tasks such as portfolio assessment, phased assessment, and continuous assessment have been used to improve student learning and engagement. Firstly, portfolio assessment is defined as a "purposeful compilation and reflection of one's work, efforts, and progress" (Milman, 2005, p. 375). This method has been shown to increase student engagement, particularly when tasks are credit bearing (Baeten et al., 2008). However, it has also been shown to encourage only a surface level of learning in some circumstances (Baeten et al., 2008) and does not necessarily require students to demonstrate engagement with work at different points in the semester. Theoretically, the student work provided in a portfolio could be created very close to the deadline and therefore have minimal impact on continuous engagement with learning. This is not to say that portfolio assessment does not have its advantages. For example, Nicholson (2018) found that online portfolios, combined with progress monitoring, peer learning, feedback practice, and intrinsic motivation, can promote student engagement.
Secondly, phased assessment enables students to submit work at various points rather than one summative piece at the end of the course (Abadi et al., 2022), while continuous assessment includes weekly assessments such as weekly tests (Holmes, 2018). Van Gaal and De Ridder (2013) found that student results in final examinations improved when students had submitted other elements of assessment at three points throughout the semester. Regarding continuous assessment, Holmes (2015) found that the use of low-stakes continuous weekly summative e-assessments (comprising multiple-choice, short answer, and data interpretation questions) improved students' perceived engagement and their understanding of module materials. Holmes' (2015) work used both conventional marking by tutors and automated marking by computer. Similarly, Holmes' (2018) later study shows that adding weekly tests (both in class and online) had a positive impact on student engagement, as measured by student interaction with the module virtual learning environment (VLE). However, adding more weekly tests and e-assessments could increase administrative and marking demands, encourage a surface level of engagement, and potentially over-burden students (Holmes, 2015). Furthermore, while weekly tests may increase perceived engagement, if these tasks are not associated with the final assessment, this may not be reflected in overall learning and student experience.
In addition, peer feedback is another means of enhancing student engagement (Arnold, 2021). Boud and Molloy (2013, p. 25) describe feedback "as a complex system that needs to permeate the curriculum, rather than an activity that appears within it from time to time" and suggest that students must be positioned as active agents in managing feedback information. Research has found that students value feedback as an indication of their progress and that it should be timely (Paterson et al., 2020; Williams & Kane, 2009). Students also learn from and with each other in both formal and informal ways (Boud, 2013). All these factors contribute to the increased popularity of peer assessment and peer feedback (van den Berg et al., 2006; Double et al., 2020). Peer assessment and peer feedback create learner-centred practices that stimulate deep learning and critical thinking, while enhancing students' autonomy, self-confidence, and reflection (Dochy et al., 1999; Panadero & Alqassab, 2019; van Popta et al., 2017).

Aims and principles of dispersed assessment
Similar to the aforementioned assessment practices, dispersed assessment aims to:
• Heighten active participation (Rust et al., 2005; Van der Kleij et al., 2019) and student engagement (Holmes, 2015)
• Improve student work (Holmes, 2018; Van Gaal & De Ridder, 2013)
• Reduce procrastination (Ackerman & Gross, 2005; Salas Vicente et al., 2021)

Given the limitations of phased and continuous assessment outlined above, dispersed assessment is distinct from these practices in that it adheres to the following five principles:
• Encourages participation by ensuring that tasks are credit bearing, i.e. count toward the students' final mark for the class (Baeten et al., 2008; Lomer & Palmer, 2021)
• Encourages deep learning (Davidson, 2002) by ensuring that dispersed tasks contribute to a final assessment task
• Encourages active and ongoing participation by ensuring that dispersed tasks are related to weekly learning materials and taught sessions
• Seeks to limit additional burden on students (Holmes, 2015)
• Ensures no additional marking burden is created for marker(s) through the use of peer feedback or other methods

These principles are visualised in the following figure. The remainder of this paper is structured as follows: a literature review of student engagement in higher education, background and context, methodology, findings and discussion, and conclusion.

Student engagement in higher education
Engagement has become a pivotal focus of attention as institutions locate themselves in an increasingly marketized and competitive higher education environment (Krause, 2005; Lomer & Palmer, 2021). It has been suggested to have "important repercussions, especially on perseverance, in-depth learning, student satisfaction, and academic success" (Heilporn & Lakhal, 2021, p. 2). The term "student engagement" refers to "how involved or interested students appear to be in their learning and how connected they are to their classes, their institutions, and each other" (Axelson & Flick, 2010, p. 38) (italics in original). Student engagement is a complex construct, which has been shown to have cognitive, affective, and behavioural dimensions (Ben-Eliyahu et al., 2018; Fredricks et al., 2004; Heilporn & Lakhal, 2021). The multidimensional nature of student engagement has been further refined by Zhoc et al. (2019) in their conceptualization of the Higher Education Student Engagement Scale (HESES), comprising (1) Academic Engagement, (2) Cognitive Engagement, (3) Social Engagement with Peers, (4) Social Engagement with Teachers, and (5) Affective Engagement.
Extensive efforts have been made within higher education to understand, evaluate, quantify, and improve student engagement (Trowler, 2010; Zepke, 2014). Some studies have focused on how students engage with their learning processes (Coates, 2007; Kuh, 2009) while others have examined how different student cohorts engage with their university life in general (Harper, 2009; Pike & Kuh, 2005). It is widely accepted that, within an increasingly challenging higher education sector (Lomer & Palmer, 2021), improving student engagement plays a critical role in producing more satisfied and productive individuals, both during their studies and beyond (Carini et al., 2006). It is in this context that creating a learner-centred environment is instrumental to student success, as it helps to sustain and promote student engagement, thereby encouraging students to become active in their own learning experience and responsible for their own learning (Doyle, 2009; Sangster et al., 2020).
For educators to understand more about the complexity of student engagement and to assist in the creation of effective learning environments, Coates (2007) recognised the following five key elements of student engagement: (1) active and collaborative learning; (2) participation in challenging academic activities; (3) formative communication with academic staff; (4) involvement in enriching educational experiences; and (5) feeling legitimated and supported by university learning communities.
Active and cooperative learning positively impacts students' personal and social development, effort, competence, and engagement (Delialioglu, 2012; Zhao & Kuh, 2004), though it is often underused (Scager et al., 2012). The process of simply forming groups and introducing activities is not enough to generate better learning and motivation. Students are often reluctant to engage in collaborative learning activities because of the risk of "free riders," logistical issues, or interpersonal conflicts (Livingstone & Lynch, 2002). It is therefore imperative that educators develop thoughtful engagement, which is one of the most important predictors of student success (Pinheiro & Simões, 2012).
Challenging academic activities have also been identified as a critical aspect of student engagement. Indeed, challenging intellectual and creative work and high expectations are central to student learning and institutional quality (Huntley-Moore & Panter, 2013, p. 148). Zepke and Leach (2010, p. 171) also state that "the evidence is compelling that enriching experiences and academic challenges are successful in engaging students." Factors that are instrumental in challenging students to achieve in higher education include 'tough' assessment tasks (Kuh et al., 2011), higher levels of complexity and learning (Hockings, 2009), student autonomy, and teacher expectations (Scager et al., 2013).
Thirdly, formative communication with academic staff plays a significant role in enhancing student engagement, as the interaction between students and staff has been found to improve module content, evaluation, and assessment techniques, leading to more engaged and higher performing students (Troisi, 2014). Academic staff have a responsibility to ensure they are communicating effectively, thereby creating more opportunities for productive and engaging interaction (Zhang et al., 2015). In a similar vein, Bryson and Hand (2007) found that the disposition of the teaching staff influences the disposition of the student. That is, a more engaged teacher leads to a more engaged student.
Fourthly, students' involvement in enriching educational experiences impacts positively on student engagement (Umbach & Wawrzynski, 2005). This is echoed in other recent studies that have investigated the link between enriched educational experiences and student engagement (Billett & Martin, 2018; Menkhoff & Bengtsson, 2012; Robinson & Hullinger, 2008). Kezar and Kinzie (2006, p. 162) state that "an enriching educational environment becomes one where students are charged to create experiences on their own to challenge each other." Lastly, establishing effective learning communities in higher education results in increased student engagement (Rocconi, 2011; Zhao & Kuh, 2004). Students' feeling legitimated and supported by university learning communities leads to mutual benefits that continue well beyond graduation (Krause, 2005). Given the significance of student engagement in shaping student success, and subsequently the competitiveness of institutions in an increasingly marketized higher education environment as outlined above, it is necessary to gain insight into students' perceptions of how engagement is affected by the novel approach of dispersed assessment (see Fig. 1).

Background and context
The context of the current study provides an example of the application of the principles of dispersed assessment outlined above. In line with these principles, multiple assessment tasks were introduced throughout the semester. The first of these submissions required students to post a comprehensive visual summary of their work, namely a mind map, to the virtual learning environment (VLE) discussion forum in week four of teaching. This mind map provided a visual summary of the work each student carried out during the first three weeks of tutorials. Fig. 2 is a screenshot of one student's submitted mind map.
Students were then required to provide feedback on a peer's mind map through replying to their post on the discussion forum by the end of week six. This gave students the opportunity to not only receive feedback on their own work but to benchmark their early work against that of other students. Fig. 3 provides an example of the feedback one student gave to another regarding their mind map.
Students were awarded marks for these activities by providing time-stamped screenshots of both their mind map and peer feedback in the final submission at the end of the semester, along with a reflection on the learning gained from the process. The screenshots and written reflection were worth a combined 25 percent of the overall mark on the module. The reflection required students to critically reflect on their experience of dispersed assessment and of participating in peer feedback. This task was designed to deepen students' learning, as previous studies have shown that reflective journaling engages students in evaluating course content in relation to their own experiences to create personally meaningful connections between topics, enhances their self-awareness, and encourages active learning (Choi et al., 2022; Dyer & Hurd, 2016; Muncy, 2014). The submission of time-stamped evidence within the reflection was intended to incentivise students to engage throughout the semester rather than only at the point of final submission, demonstrating adherence to principle one above. The dispersed tasks required students to apply their weekly learning to the brand they had chosen for their final task, demonstrating adherence to principle two by encouraging deep learning. The use of a summary of activities from the previous teaching sessions as one of the dispersed tasks also ensured adherence to principle three, whereby active and ongoing participation was encouraged by relating the dispersed tasks to weekly learning materials. The dispersed tasks also represented steps that students would have needed to take toward the final task in any case, ensuring that principle four was followed, as no additional burden was placed on students. Finally, given that the time-stamped evidence was submitted with the final assessment and that peer feedback was utilised, markers saw no increase in their marking load, ensuring that principle five was followed.
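For educators wishing to implement a similar scheme on their own VLE, the checking of time-stamped evidence against task deadlines can be sketched as follows. This is a minimal illustration only: the task names and dates below are invented for the example and do not describe the module's actual VLE configuration or marking process.

```python
from datetime import date

# Hypothetical deadlines for the dispersed tasks (illustrative dates only).
TASK_DEADLINES = {
    "mind_map_post": date(2021, 2, 26),        # end of week 4
    "peer_feedback_reply": date(2021, 3, 12),  # end of week 6
}

def evidence_on_time(task, timestamp):
    """Return True if the screenshot's timestamp meets the task deadline."""
    return timestamp <= TASK_DEADLINES[task]

# Example: one student's time-stamped evidence from the final submission.
submissions = {
    "mind_map_post": date(2021, 2, 24),
    "peer_feedback_reply": date(2021, 3, 15),  # three days after deadline
}

for task, stamp in submissions.items():
    status = "on time" if evidence_on_time(task, stamp) else "late"
    print(f"{task}: {status}")
```

Because the evidence is simply screenshotted into the final submission, a check of this kind can also be done by eye at marking time, which is what keeps the marking load unchanged.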

Methods
This study used a mixed methods approach, drawing on three online focus groups and written reflections from students participating in the above-mentioned module. Nine students took part in the three semi-structured focus groups, each lasting one hour; given the small number of participants, this enabled more detailed responses and in-depth insights into students' lived experience of dispersed assessment. Students were asked about their levels of motivation while studying during the pandemic, their experience of having earlier and dispersed submissions, and peer feedback. Open-ended questions were used, for example: 'Can you tell me about your experience studying during the pandemic?', 'What was your experience of having multiple assessment points throughout the semester?', and 'Would your level of engagement with the various tasks and learning materials have differed if they had not counted toward your final mark for the class and, if so, how?'. Written transcripts were then created for each of the focus groups.
Given that student reflections on assessment practices have gained importance in recent years, especially regarding how best to engage student learners (Christie & Morris, 2019; Trowler, 2010), the pre-existing written reflection portion (750 words) of the students' assignments was also used for this study. The student reflections were reviewed by the authors to assess their suitability for inclusion, based on whether they incorporated discussion related to dispersed assessment and peer feedback. Ninety-nine student reflections were deemed suitable based on these criteria.
A Leximancer-assisted thematic analysis was then carried out to derive the key themes. Leximancer and NVivo are two discrete software packages that can be used on similar sets of empirical materials (e.g., interview transcripts, documents, and open survey responses) (Sotiriadou et al., 2014). While NVivo requires the researcher to code the data and to develop themes or categories, Leximancer identifies concepts and their interrelationships without the need for manual intervention (Sotiriadou et al., 2014). Leximancer is thus suitable for dealing with large bodies of text, assisting thematic analysis by highlighting key themes and correlations in the text without researcher intervention (Campbell et al., 2011). The transcripts of the three focus groups and the students' written reflections were uploaded to the Leximancer software. The three focus group transcripts had word counts of 13,809, 9,106, and 7,073 respectively, totalling 29,988. The 99 reflections had a total of 23,841 words identified as being relevant to aspects of dispersed assessment. In the text processing options, file tags were switched on to allow the analysis to consider each file (total n = 102) as a case. Leximancer generated a list of 56 word-like concepts in order of declining occurrence, of which the top 12 are listed in the synopsis of concepts in Fig. 4 below.
The module leader's name was the most frequently occurring word, but it was removed from the synopsis as it is not a concept. To explore further, we analyse the implications and inter-relationships of these concepts in more detail in the following section.
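Leximancer's concept extraction is proprietary and considerably more sophisticated than simple word counting, but the general idea behind a list of word-like concepts in order of declining occurrence can be sketched as below. The stopword list, sample texts, and function names are illustrative assumptions, not a reproduction of Leximancer's algorithm.

```python
import re
from collections import Counter

# Minimal illustrative stopword list (Leximancer uses its own, much larger one).
STOPWORDS = {"the", "a", "an", "and", "to", "of", "i", "it", "was", "my"}

def rank_concepts(documents, top_n=12):
    """Count content words across all documents and return the most frequent,
    in order of declining occurrence."""
    counts = Counter()
    for text in documents:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

# Invented snippets standing in for focus group transcripts and reflections.
focus_groups = [
    "The feedback from my peer helped my understanding ...",
    "Peer feedback on the mind map helped me improve ...",
]
print(rank_concepts(focus_groups, top_n=3))
```

In the same spirit, removing a non-concept such as the module leader's name would simply mean adding it to the stopword set before counting.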

Findings and discussions
A bird's eye view of the key themes and concepts is illustrated in Fig. 5. Themes are represented by coloured circles, while lines indicate the most likely path between concepts, and the proximity of concepts indicates how often they appear in similar conceptual contexts (Sotiriadou et al., 2014).
The above visualised map (Fig. 5) shows the contributing factors to student experiences of dispersed assessment with peer feedback. Themes such as "brand" and "market" indicate the assessment content alongside concepts like "brand extension", "research" and "report". Similarly, themes such as "module", "tutorials" and "assessment" are generic elements in the context of the current research. In addition, "people" and "person" do not provide any specific findings. Thus, these are disregarded for further analysis.
A series of verbs such as "helped", "allowed" and "improve" illustrated students' feelings about their experiences in this module. In addition, concepts such as "useful", "understanding" and "able" indicate a sense of student empowerment, especially through giving and receiving peer feedback. For example, Fig. 5 clearly demonstrates that "understanding" and "feedback" are closely linked, while containing key themes such as "peer", "discussion", "gave" and "improve". This indicates that the current module design with dispersed assessment and feedback "reassuringly" "allowed" the students to engage with the module materials. We then examined the Document Summary Index (Fig. 6) to contextualise the key concepts and derive the recurring themes.
Based on the above Leximancer-assisted analysis, three recurring themes emerged: learning communities; steppingstones and puzzle pieces; and increased motivation and reduced procrastination. Leximancer provided the conceptual map, the list of most frequently used words, and the contexts in which those words were used. Based on the Leximancer results, two independent, trained coders further examined the transcripts and identified these three themes as recurring and dominant. Direct quotations from the Document Summary Index are included to support each theme in the discussion below.

Learning communities
Many students commented on their appreciation of being able to engage with other students through dispersed assessment, for example, working together in tutorials and giving/receiving feedback. This sentiment was particularly significant as the module ran at a time of national lockdown when students did not have any in-person contact with each other, either through university or socially. It is typically quite difficult to create a learning community through online classes, but ensuring that students utilised the discussion forum was one way of achieving this goal. In other words, with different tasks set at different points, many students felt that they could connect with each other and engage with their peers as part of the learning community. A participant stated: "Joining uni not knowing anyone and putting something out there when I haven't studied for over 10 years, it was nerve wracking, daunting, exciting. It was a real mix of emotions but it also did make me feel a little bit like part of the community because once you saw other people doing it, I felt more confident to just go on and put something out there. And so I think you know it kind of gave a sort of a sense of we're all in this together."
There was also a sense of realisation within the data about the usefulness of peer feedback that was not previously anticipated by participants. The following quotations exemplify this.
"The positive aspect was to be able to engage with other students through the discussion board by providing and receiving feedback, in depth discussions with the mind maps and extension ideas throughout tutorial classes."

"I think that was a really good way of communicating with each other, since we can't do group work as much, this was a way for students to connect with each other."

"I was also glad I got feedback which I thought was going to be pointless because it was done by a student, but instead gave me clarity."

These comments indicate the extent to which students appreciated their experiences of active and cooperative learning, as well as their participation in challenging academic activities. As discussed above, challenging intellectual and creative work with a higher level of complexity and learning are critical factors influencing student engagement. Furthermore, active and cooperative learning plays a positive role in enabling students' personal and social development, as well as their competence and engagement (Delialioglu, 2012; Zhao & Kuh, 2004). The way that dispersed assessment was designed in this module facilitated students' active involvement in enriching educational experiences, without imposing additional workload on students or staff. This was all the more appreciated by students in challenging times such as the Covid-19 era. In addition, as all students were required to submit a piece of work and feedback for another student, this collaboration removed concerns about "free-riding", typically associated with peer assessment and group work processes (Livingstone & Lynch, 2002). According to our findings, this was another positive aspect that helped solidify the sense of the learning community.
Students also appreciated that they were able to present their work for formative feedback through online tutorial sessions as well as through VLE discussion forum. Thus, formative communication with academic staff was also an integral part of current module design. In addition, the positive response to peer feedback suggests that it was highly valued by participants and that it is not only formative feedback from academic staff that can encourage engagement as suggested by Troisi (2014), but that this can be supplemented with feedback from peers. This positive response to peer feedback also suggests that these online interactions can counter the perception among students that online learning activities weaken relationships (Lomer & Palmer, 2021).

Steppingstones and puzzle pieces
Across the three focus groups and written reflections, students also frequently shared their evaluation of dispersed assessment in relation to the connectedness of the various submissions. There was a strong sense of appreciation for the fact that students continuously engaged with these pieces of work and that they were all connected, leading in the end to the final assessment task. These measures provided students with a continuous learning experience and a sense of reassurance, as the following quotations reveal. Students also acknowledged that each point of dispersed assessment was beneficial for building their confidence and sense of achievement.
"I found that the little steps, like the steppingstones, help. Keeps like engagement and like you know, reassurance that this is what you're supposed to be doing." "I was looking at how those kind of pieces fitted together. The module was very good at giving you lots of jigsaw pieces to think about it. But it was kind of up to us to put all these pieces in to make the puzzle complete."

"Doing it this way is kind of like jumping along on blocks so to speak and yeah it gave … it made you use the theory, not just take it on board and use it in 10 weeks' time. It made you use it then and there and I think that's really valuable."
This appreciation from students of the challenge a multiplicity of assessment points creates supports the suggestion that students value challenging assessments, particularly when they are able to receive timely feedback (Carini et al., 2006;Zepke & Leach, 2010).

Increased motivation and reduced procrastination
Lastly, we found that most students regarded their experience of dispersed assessment in this module as engaging, motivating, and enabling. Students reflected on their tendency to procrastinate with their work, and this self-awareness enabled them to appreciate the different points of engagement via dispersed assessment even more. Students commented that the module design helped reduce procrastination and encouraged an active and collaborative environment through engagement with regular activities in the form of submissions and the provision and receipt of peer feedback. The following quotes support Ackerman and Gross' (2005) suggestion that students are less likely to procrastinate when social norms and rewards such as early feedback are present.
"Actively engaging and achieving goals within the first weeks of the module was extremely motivating. It helped to concrete the learnings from the lecture and apply them."

"The structure of the brand assignments allowed me to clearly understand the objective of the extension report and be able to organise my work well over that period of time causing less confusion towards the report."

"It ensured I was aware of the task at hand and prevented the assignment from being left until the last minute as an extension idea was developed from the onset."

"I found it is highly motivating to engage in the mind map task at an early stage, as it provided the opportunity to explore and understand the essence and equity of [the brand name -Body Shop] to ensure my brand extension was consistent with the brand, before engaging in the task ahead."

Cues in our data and the Leximancer concept map (for example, "freedom", "choose", "allowed", "responsibility", "control" and "decide") also highlighted the extent to which students positively reflected upon their autonomy. This finding supports previous research showing that learning through innovative assessment practices is more effective and engaging for students than conventional written tests (Pereira et al., 2016). More importantly, consistent with extant literature, we found that dispersed assessment created a learner-centred environment in which students took the lead role in their learning while actively engaging in peer feedback and critical reflection. Not only did this environment increase student engagement, but it also led to a heightened sense of autonomy and self-confidence.

Conclusion and recommendations
This study introduces a novel method of assessment, namely dispersed assessment, to enhance student engagement. In addition, it explores the ways in which students themselves critically reflect on the use of dispersed assessment and how it affects their engagement with learning. Our findings suggest that dispersed assessment helps create a learning community through reassuring step-by-step guidance, and they support previous studies showing that student experience can be enriched through student-to-student interactions (Kezar & Kinzie, 2006). We also found that dispersed assessment prevented most students from procrastinating with their work. As discussed above, we found evidence to support Umbach and Wawrzynski's (2005) argument that module-related interactions between students are correlated with student engagement. In line with previous research such as Zhao and Kuh (2004) and Rocconi (2011), we found that the peer feedback element of dispersed assessment encouraged students to feel legitimated and empowered by their peers on the module, which increased overall student engagement.
Our findings indicated that the use of dispersed assessment enhanced student engagement, autonomy, and self-confidence. Specifically, the first assessment, set at an early stage of the module and followed by peer feedback embedded in the assessment structure, played a significant role in motivating students to be more critical, reflective, and proactive learners. Dispersed assessment differs from phased assessment in that each task is credit-bearing yet does not overburden students or staff. This is consistent with previous work suggesting that students place a higher value on tasks that are relevant to their assessment (Lomer & Palmer, 2021). Thus, our findings clearly demonstrate that dispersed assessment enhanced student engagement in all five facets outlined by Coates (2007). Setting tasks at multiple points, infused with a learner-centred approach, created active and collaborative learning communities in which students engaged in peer feedback and critical reflection. Their participation in such academic activities, while keeping communication channels open through the VLE (e.g., the discussion forum), also enriched their educational experiences. In addition, dispersed assessment benefited students by increasing autonomy and self-confidence and reducing procrastination. While dispersed assessment was developed to address concerns about student engagement during the Covid-19 pandemic and the associated shift to online learning, this novel method of assessment has been continued as classes return to face-to-face and blended delivery.
Based on these findings, we suggest that colleagues seeking to enhance student engagement on their modules consider utilising dispersed assessment across face-to-face, online, and blended modes of delivery. For many modules, this would require minimal effort to implement and would give students increased motivation to participate in weekly activities as well as earlier and continuous engagement with the assessment. Colleagues choosing to deploy dispersed assessment should first structure their lecture and tutorial or seminar materials so that students can engage each week in a task that supports their learning and acts as a steppingstone toward the final assessment. Next, a method of amalgamating and submitting tasks from multiple weeks should be identified, such as the mind map submission to the VLE forum utilised in this research. If peer feedback is to be used, a method for providing and evidencing it should be established, such as replying to another student's forum post with feedback on their work, as in this research. Finally, a method of providing time-stamped evidence of these multiple submissions should be established, such as the time-stamped screenshots included in the final submission in this research. We recognise that a lack of resources has the potential to create greater disparity among institutions (Mali & Lim, 2021; Sangster et al., 2020). However, while proprietary VLE software was used to implement dispersed assessment in this research, resources that are free for certain applications, such as Google Workspace for Education Fundamentals, Google Drive, or Moodle, could be used for this type of work.
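To make the final step above concrete, the check that each dispersed task has on-time, time-stamped evidence could be scripted rather than done by eye. The sketch below is a minimal, hypothetical illustration in Python: the task names, deadlines, and the idea of recovering timestamps from forum screenshots are our own illustrative assumptions, not part of any VLE's API or the procedure reported in the study.

```python
from datetime import datetime

# Hypothetical dispersed-assessment schedule: task name -> deadline.
# Dates and task names are illustrative only.
WEEKLY_DEADLINES = {
    "mind_map": datetime(2024, 2, 9),        # e.g. brand mind map posted to the VLE forum
    "peer_feedback": datetime(2024, 2, 23),  # e.g. reply to a peer's forum post
    "extension_idea": datetime(2024, 3, 8),  # e.g. draft brand-extension idea
}


def check_evidence(submissions: dict) -> list:
    """Return descriptions of tasks whose time-stamped evidence is missing or late.

    `submissions` maps task name -> timestamp, e.g. as transcribed by a
    marker from the time-stamped screenshots in the final submission.
    """
    problems = []
    for task, deadline in WEEKLY_DEADLINES.items():
        ts = submissions.get(task)
        if ts is None:
            problems.append(f"{task}: no time-stamped evidence")
        elif ts > deadline:
            problems.append(f"{task}: submitted late ({ts:%d %b})")
    return problems
```

A marker could run `check_evidence` once per final submission to flag any missing steppingstone before marking; the same logic could equally be expressed as a spreadsheet formula if no scripting environment is available.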
This study is not without its limitations. Focus groups were carried out weeks after the module finished to ensure ethical compliance and so that students could be certain they would not be negatively affected in any way if they chose not to participate or responded negatively during the focus group. It could be suggested that this extended interval between the module and the focus groups affected the accuracy of students' perceptions and their ability to recall their level of engagement and experiences. Equally, a higher number of focus group participants might have given a broader understanding of the student experience of dispersed assessment; however, the smaller groups provided the opportunity to gather more detailed insight from those who participated. Efforts were made to counter these two limitations using the written reflections, which gathered insights from all students on the module and were collected at the end of the semester. As these reflections were written and submitted as part of the final assessment, they may have been skewed toward more positive responses. To counter this, students were assured throughout the semester that honest and critical reflections were welcomed and that reflections critical of the dispersed assessment process would not be marked unfavourably.
While our study has shown that students in this case valued dispersed assessment and the perceived increase in motivation, engagement, and interactions with other students it enables, further research would be beneficial. This case focused on one module at one university in the UK; further study would therefore benefit from a wider investigation of student perceptions at multiple institutions globally and across multiple subject areas and levels. Further insight into the optimal number of assessments spread throughout the semester would also be beneficial. For instance, comparing the perceived engagement of students undertaking continuous assessment with that of students undertaking dispersed assessment could help establish the optimal number of assessments for increasing engagement without overburdening students or colleagues engaged in marking. Finally, further research could explore the individual contribution of the five principles of dispersed assessment to enhancing student engagement.

Funding acknowledgements
This study received no external funds or grants; internal funding was obtained from the host university to incentivise student participation in the focus groups with gift vouchers.

Declaration of competing interest
No potential conflict of interest was reported by the author(s).

Data availability
Data will be made available on request.