Evaluation of improvements to the student experience in chemical engineering practical classes: from prelaboratories to postlaboratories

Practical classes are an important and essential part of undergraduate programs in Chemical Engineering, as each experiment provides an opportunity to reinforce the theory of discrete unit operations taught elsewhere in the course. While an expensive pedagogical method, practical sessions, when delivered well, can be one of the best learning experiences for students. As with all pedagogical methods, for students to gain maximum benefit from practical classes, a high level of engagement is required. Consequently, lab assignments need to be designed in a way that guides and instructs students on the theory, procedure, and risks associated with any practical and its associated assessments. This paper describes the outcomes of a qualitative investigation that evaluated student perceptions of updated prelab content combined with a new variation in postlab assessments and a renewed focus on practical skills during practical classes. The overall aim was to improve the student experience in practical classes. Paradoxically, periods of remote teaching enforced by the COVID-19 pandemic created further opportunities to make innovative changes to practical class resources. Subsequent student evaluations captured perceptions of each newly introduced component (instructional videos, online multiple-choice prelab quiz, variation in postlab assessment, introduction of grading rubrics, and a practical skills assessment), and more than 75% of respondents wanted these resources retained.


■ INTRODUCTION
Chemical Engineering graduates play a vital role in addressing global challenges such as air quality, pharmaceutical production, sustainable energy, and clinical diagnostics. It is therefore vital that we continue to ensure that graduates are equipped with the necessary knowledge and skills to be able to deliver on these challenges. Practical classes are essential to the training of Chemical Engineering students and are designed to provide opportunities to develop technical skills and apply theoretical principles. In terms of practical learning skills, the main aims are:
• Students learn how to work safely in a laboratory environment.
• Students become familiar with common instrumentation.
• Students understand how such instrumentation/equipment works.
• Students can assess, process, and interpret the data obtained.
• Students can communicate their findings via reporting.
The hands-on learning opportunities that practical classes provide are also vital in preparing students for their Design projects,1 which are conducted to varying degrees in each year of study, as practical classes provide an opportunity to observe and manipulate in real life the unit operations for which students learn the associated theory through lectures, seminars, tutorials, and workshops. The development of communication skills and the contextualization of unit operations in real-world applications also prepare students for their final-year research projects.1 Practical class sessions are longer in duration on average than traditional classroom teaching, while also providing the opportunity for lower instructor−student ratios, offering a great opportunity for students to practice their skills while also being able to ask questions in a less formal setting.2 Although laboratory classes are sometimes seen as an expensive pedagogical method,3 they provide a unique opportunity for learning4 in the practice of science and engineering, which can justify the time and effort. The success of students in laboratory classes is not solely dependent on how they prepare. It is also reliant on the design of the practical and its assessments,5 which should be constructed in a way that guides preparation and participation. Despite much research on improving the delivery of practical classes, the results and recommendations are not being widely adopted.6 While some improvements are undoubtedly required, it is perhaps the efficiency and focus of the delivery that need to be addressed.
The continued delivery of practical classes was considered according to Bloom's Taxonomy of Educational Objectives.7 The provision of the practical classes, whether in person or via remote online resources, must still build up the cognitive levels described by Bloom.7 The lowest level (i.e., Knowledge) can be achieved via some of the questions of a prelab assignment, as these can focus on recalling facts of the procedure and risk assessment, which includes Control of Substances Hazardous to Health (COSHH) considerations. The highest level (i.e., Evaluation) is achieved through the postlab assignments, which encourage students to consider the procedure, the instrumentation, and the data and to critically evaluate them in terms of expected trends, anomalous results, and possible causes of error, combined with remedial improvements that could be made. It is therefore necessary that any content positioned between the pre- and postlab, whether the traditional in-person practical or an alternative remote version, brings students through the other cognitive levels.
When the content is designed and delivered in this way, each experiment allows student learning to follow Kolb's Experiential Learning Cycle.1,8 The prelab resources and activities allow for Abstract Conceptualization, as students comprehend the provided information and consider the challenges and expectations of an experiment. Attendance at each lab class then provides the opportunity for Active Experimentation, which includes best scientific/engineering practice in assessing risk, hands-on application, data collection and recording, and subsequent analysis. Undertaking this process multiple times (in this case 10 experiments) provides Concrete Experience, whereby each of those skills of scientific/engineering practice is refined and honed. The postlab assessments, where students summarize their learning and understanding, then provide the opportunity for Reflective Observation.

Formative Assessment and Evaluation Goals
The goal of the current qualitative work was to evaluate student perceptions of how a change in approach has impacted the student experience in practical classes. While several interventions were made (see Methodology), the two main themes of the evaluation were:
• What was the student perception of the impact of the new targeted prelab activities on their knowledge?
• What was the student perception of the impact of the introduction of the new practical skills assessments on their learning experience?
It should be noted that further iterations of the change in approach to practical class content were accelerated because of the COVID-19 pandemic, and these are included as part of this qualitative evaluation.

■ METHODOLOGY
This change in approach (and the evaluation thereof) was carried out across two different year groups: first in the academic year 2020/2021 (Cohort 1) and then in 2021/2022 (Cohort 2). The participants were all second-year undergraduate Chemical Engineering students at Queen's University Belfast (QUB) enrolled in either the Bachelor of Engineering (BEng) or Master of Engineering (MEng) program. There were 52 students in Cohort 1 and 53 students in Cohort 2.
This was an iterative process, one that was codesigned with the students: feedback was initially gathered from Cohort 1 preintervention in Semester 1 of AY2020/2021 (which had in-person delivery), initial changes were implemented with Cohort 1 for Semester 2 of AY2020/2021 (which reverted to remote delivery), and further changes were made with Cohort 2 for AY2021/2022 (which was in-person). Hence, Cohort 1 initially served as the baseline group using the existing/preintervention content in Semester 1 before receiving the updated prelab content in Semester 2.
In AY2021/2022, most of the new features introduced for Cohort 1 in AY2020/2021 were retained, while a further update was introduced for Cohort 2: a practical skills assessment. Cohort 2 could therefore compare experiments with and without a practical skills assessment. While student grades were used by way of evaluation, the students were also surveyed anonymously on their own perceptions, as recommended by Juwah et al.9 This initially included informal feedback from Cohort 1 via minute papers, so that students could report what they found problematic and thereby guide the instructor on improvements.10 As the project progressed, more formal evaluation was utilized in anonymous surveys, including five-point semantic differential scales (allowing students to rate the usefulness of a resource from Not at All Useful to Extremely Useful) and open questions (allowing students to respond freely without word limits or restricted predefined options).
As the involvement of teaching assistants in the planning and delivery of lab classes is also recommended,11 informal conversations were conducted with the teaching assistants to gauge their observations of interactions with students. The digital gradebook, which contained the grades of the students' assignments, was used for comparison of performance.
Each cohort had the same 10 experiments during their course. Prior to the interventions, students were provided with a printed lab manual which contained the procedures for each experiment along with the associated risk assessments, including COSHH. The assessments consisted of a prelab and a report-style postlab for all 10 experiments, i.e., there was no variation in assessment style, which is contrary to the principles of Universal Design for Learning (known as UDL).12,13 No grading rubrics were provided for any assignments. A transition to an updated virtual learning environment, Canvas, provided the opportunity to improve the information provided to students in advance of the practical classes, including the digitization of the experimental procedures and risk assessments. Printed and laminated versions of each of these documents were also available in the lab at the relevant workstation for ease of reference during lab classes. Changes were also made to assessment, which included:
• The format of prelabs.
• The style of postlabs, to provide variety of assessment.
• The introduction of a practical skills assessment for one of the experiments.
• The introduction of grading rubrics that were accessible by students in advance.
Each of these modifications is discussed individually in subsequent sections. Institutional policy mandates that all assignments are graded out of 100 and that the overall mark for an assignment must reflect the grading system of the UK Degree Classifications used at QUB. This policy was maintained in the introduced modifications. It is acknowledged that other instructors wishing to follow a similar process will have their own preferences and/or institutional policies on grading boundaries and rubrics.

Ethics
Ethical standards were upheld by adhering to the guidance of the British Educational Research Association Ethical Guidelines for Educational Research14 in relation to informed consent. Approval for the evaluation was sought and granted through QUB's Engineering and Physical Science Research Ethics Committee. Participants were provided with an information sheet (see the Supporting Information) which outlined the aims and contained a clear explanation that participation was voluntary. Participants were also informed of their right to withdraw their consent at any time.

Prelab Content
The transition to Canvas provided an opportunity to pilot new online prelab activities, as recommended by Clemons et al.15 Good preparation via engagement with prelab resources (which include theory and procedures) is important for students to achieve the learning outcomes of practical classes.16 Although the experiments were diverse in nature, the approach undertaken previously had been a generic prelab assignment, as reported in Table S1. It was felt that this encouraged students to take what Biggs defines as a surface approach17 to their preparation (i.e., they rote learn only some of the provided content). This method of assessment was maintained for Cohort 1 in Semester 1.
Cohort 1 students were asked to share their opinions of the generic prelab assessment reported in Table S1 at the end of Semester 1 via anonymous minute papers. Overall, students stated a preference for an improved prelab assessment style. Consequently, updated experiment-specific prelab assessments were developed for Semester 2 of Cohort 1, an example of which is provided in Table S2. It was envisaged that this would motivate students to take what Biggs defines as a deep approach17 to their preparation (i.e., students want and are encouraged to understand, and so focus on the themes and main ideas provided). All of the updated prelab assignments are multiple-choice quizzes based on the theory, procedure, and risk assessment of the experiment. The prelab assignments were automatically graded, and students could view the correct answers once grades were released.
The new prelab assignments were introduced for Cohort 1 in Semester 2, and depending on the experiment, the students were asked to name, explain, interpret, apply, and solve across various aspects of the theory, procedure, and risk assessment of the respective practical. This adheres to the guidance of Bloom's Taxonomy Question & Task Design in assessing Knowledge, Comprehension, and Application.18 The aim was to improve student understanding of the practical in advance of the practical class; it was anticipated that this would improve the students' ability to interpret and report their findings. Such targeted prelab activities have been demonstrated to improve student confidence and decrease trepidation while also improving how prepared students are for the practical experiments.15 The improved design of the prelab assignments, with specific questions on the theory, procedure, and safety considerations (as opposed to the previous version's generic questions), encouraged the students to understand more and thus better supported and guided them in their preparation for and understanding of the topics covered by practical classes. These also helped students become more accustomed to the style of higher education assessment.19

Postlab Assignments
Assessment methodology is known to influence the transition of students to university life and can affect dropout rates.19 Having 10 report-style postlab assignments was considered repetitive and restrictive, given that throughout the undergraduate program students would undertake other types of assignments; the repetition also risked student fatigue.
Report writing is of course very important for developing scientific communication skills, and therefore five postlab assignments were retained as reports. Four of the postlabs were redesigned as quiz-type assignments, and an excerpt from one is reported in Table S3. The final postlab was the preparation/design of a poster (without the need to present it). The inclusion of the poster was particularly significant, as a poster is typically a major component of the assessment of the final-year dissertation research project on the undergraduate MEng program, and in the past students had often stated concerns about never having had the opportunity to practice making a poster prior to that high-stakes final-year assessment.
The provision of grading rubrics is an important aspect of transparency of assessment,20 since rubrics help students understand in advance how grades are assigned and hence be better informed about what is expected. The use of rubrics also ensures that the quality of assessment is maintained student to student and year to year. Consequently, grading rubrics were created for all postlab assignments and were shared from the outset; an excerpt from a rubric for a quiz-type postlab is reported in Table S4, while a rubric for a report-style postlab is reported in Table S5. As reported in Table S5, the report-style postlab rubric has criteria for each section that should be included by students. Within those rubric criteria there are references to the standard of scientific writing; it is important to note that at the start of the academic year the students are given a resource on how to approach the postlab assessments. That resource includes sections that directly correlate to the criteria of the rubrics (for example, each section of a report) but also contains a section on technical/scientific writing, which is designed so that students are aware of what is expected. This guide to scientific writing is based on that previously reported by Lombardo,21 who drew inspiration from the works of Williams.22
Additionally, the rubric for the poster design (see Table S6) was based on the design criterion from the rubric for the fourth-year poster presentation assessment, and thus the individual feedback associated with the second-year poster design is intended to help students prepare and improve for that fourth-year assessment. Note that this rubric has only one criterion, which covers multiple aspects of the assessment, including the presentation of information, organization, and technical quality. Individual educators will have their own approach to designing such a rubric, but on reflection the author believes that separating these different aspects into their own criteria would provide greater transparency to students as well as improve feedback on the individual components of the poster design.
Submissions for all assignments were graded and returned within 1 week of submission. This allowed students to read feedback and act upon it within the same semester/academic year.

Adapting for Remote Delivery during COVID-19
The impact of the COVID-19 pandemic subsequently meant further, accelerated changes to the approach, including a temporary replacement of the in-person practical classes for Semester 2 of Cohort 1 with remote and/or blended alternatives, as was also the case elsewhere.23 Consulting Fink's taxonomy,24 alternative provision was designed that was cognizant of learning how to learn, foundation knowledge, application, and integration. Alongside the digitized lab manual containing the operating procedure and risk assessment, a short instructional/familiarization video which demonstrated the procedure and the apparatus was provided for students through each experiment's Canvas page. Videos were between 3 and 6 minutes in duration (depending on the experiment) and identified the key components of each piece of apparatus as well as providing a step-by-step guide to conducting the experiment. For conciseness, the videos were not a complete recording of an experiment, since all procedures required various changes to parameters that necessitated equilibration time before observing/recording results. Students were directed to read the lab manual components and then watch the instructional video prior to completing their prelab assessment. For remote sessions, students were also provided with sample data sets to complete postlab assignments. As mentioned previously, students had also been provided with detailed guide specifications on how to complete lab assignments, and grading rubrics were provided in advance. Students were expected to engage with these resources before joining a small-group tutorial-style session led by teaching assistants. These sessions offered students an opportunity to seek any clarifications while also giving the teaching assistants the opportunity to interrogate student understanding.

Practical Skills Assessments
For teaching lab delivery to be further modernized with a renewed focus on the practical skills themselves, a detailed practical assessment rubric was introduced for one experiment on a trial basis for Cohort 2 (see Table S7). The rubric focused on the students' handling/utilization of equipment and proficiency in lab-based skills, and hence gave students advance notice of what would be expected in the practical session. The practical skills themselves were assessed during the in-person sessions, while calculations and scientific communication skills were still assessed with a postlab assignment. While the use of a detailed rubric itself reduces subjectivity, to reduce it further a single teaching assistant conducted the practical skills assessment for every student (students worked in groups of three).

Summary of Interventions
A comparison of the lab resources provided to students pre- and postintervention is summarized in Table 1. Each lab experiment had equal weight toward a student's overall grade for the course.

Journal of Chemical Education
The weight of each prelab within an experiment was 20%. All postlabs had a weight of 80%, apart from that of the Liquid/Liquid Extraction experiment once the practical skills assessment was introduced; the postlab in that case (a short quiz similar in format to that in Table S3, which still required an upload of calculations) contributed 40% to the overall experiment mark, as the practical skills assessment also contributed 40%.
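These weightings amount to a simple weighted sum of component marks. As an illustration only (the marks below are hypothetical and this is not the authors' grading software), the two schemes can be sketched as:

```python
# Illustrative sketch of the assessment weightings described above.
# All component marks are out of 100, per the institutional policy;
# the marks used below are hypothetical examples.

def experiment_mark(prelab, postlab, skills=None):
    """Combine component marks into an overall experiment mark.

    Standard experiments:           20% prelab + 80% postlab.
    With practical skills assessed: 20% prelab + 40% postlab + 40% skills
    (the Liquid/Liquid Extraction experiment for Cohort 2).
    """
    if skills is None:
        return 0.20 * prelab + 0.80 * postlab
    return 0.20 * prelab + 0.40 * postlab + 0.40 * skills

standard = experiment_mark(prelab=90, postlab=65)                # 0.20*90 + 0.80*65 ≈ 70
with_skills = experiment_mark(prelab=90, postlab=65, skills=80)  # 18 + 26 + 32 ≈ 76
```

The same pattern extends naturally to the later 10%/20%/70% scheme mentioned for subsequent cohorts.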

■ RESULTS AND DISCUSSION

Prelab Grades
The grades for the 52 students across three prelab assignments (156 grades in total) in Semester 1 are reported in Figure 1. The mean score for Semester 1 was 61.39%, with a median score of 60.00% and a standard deviation of 12.72%. The grades for Cohort 1 students across three prelab assignments in Semester 2 are reported in Figure 2 (the same 52 students, with a total of 156 assignments). The mean score for Semester 2 was 89.81%, with a median score of 95.00% and a standard deviation of 12.62%. It is interesting that, while the mean and median scores increased from Semester 1, the standard deviation decreased only slightly (by ∼0.1%). This indicates that there is very little difference between the two data sets in the variation of each grade with respect to the mean and median. This provides some confidence in the robustness of the new assignments: there is still a similar spread of grades and still an opportunity to differentiate student performance.
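Summary statistics of this kind can be reproduced with standard library tools. The sketch below uses Python's `statistics` module on a small hypothetical grade list (illustrative only, not the cohort data shown in Figures 1 and 2):

```python
import statistics

# Hypothetical prelab grades (%) for illustration -- NOT the cohort data.
grades = [45, 55, 60, 60, 65, 70, 75, 80]

mean = statistics.mean(grades)      # arithmetic mean of the grades
median = statistics.median(grades)  # middle value of the sorted grades
stdev = statistics.stdev(grades)    # sample standard deviation (spread about the mean)

# A similar stdev across two data sets with different means, as observed
# between Semesters 1 and 2, indicates a comparable spread of grades and
# hence a comparable ability to differentiate student performance.
```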
As such, this provided the opportunity for students and academics/teaching assistants to ask more applied and probing questions during the laboratories and therefore deepen their understanding of the subject matter. The impact of the new approach to prelabs was further evaluated through informal discussions with the team of teaching assistants who worked with Cohort 1 in both Semester 1 and Semester 2. Anecdotally, the teaching assistants (some of whom had been in the role for several years) also found that the advance knowledge of most students had improved. For example, in Semester 1 many queries in the session were of the genre of "what do I do next?". Conversely, in Semester 2 a reduction was observed in the frequency of procedure-based questions posed by the students. Instead, student queries focused much more on "why?" and "what if?" in relation to the practical and its theory. As such, the new assessments resulted in a marked improvement in understanding of both the science and engineering aspects of the practical. They also provide a better foundation for teaching assistant−student interactions, as recommended by Velasco et al.28

Responses to Semantic Differential Scale Questions
Cohort 1 students were asked to evaluate the new online provision by means of an anonymous survey, with a question directly related to each of the components: prelab, instructional video, and live Q&A session with a teaching assistant (known to the students as demonstrators). Figures 3 and 4 report the responses to the questions on the contribution of the changes in delivery to student knowledge for Cohorts 1 and 2, respectively. From the Cohort 1 responses, the percentage of students finding the prelab Extremely Useful was equal to that finding it Moderately Useful (38%), while the percentages finding it Very Useful, Slightly Useful, or Not at All Useful were also equal (8%). For Cohort 2, the highest response rate for the prelab was for Moderately Useful (50%), with Extremely Useful next (25%) and Very Useful and Slightly Useful receiving the same response (12.5%). No students in Cohort 2 responded that they found the prelab to be Not at All Useful. Most of the students found the online prelab to be at least a Moderately Useful contribution to their knowledge. Applying a scale analysis (using 5 = Extremely Useful, 4 = Very Useful, 3 = Moderately Useful, 2 = Slightly Useful, and 1 = Not at All Useful), the data were analyzed using a single-factor ANOVA via MS Excel. The ANOVA revealed no statistically significant difference between Cohort 1 and Cohort 2 in the perceived impact of the prelab on knowledge (F = 0.04, p = 0.84). The student evaluations appear to indicate that the new format of prelabs had a positive impact on the student experience.
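The scale analysis described above (Likert coding followed by a single-factor ANOVA) can be reproduced outside of spreadsheet software. The sketch below computes the one-way ANOVA F statistic directly, on hypothetical coded responses rather than the actual survey data; a small F value, as reported here, indicates no significant difference between cohorts:

```python
import statistics

# HYPOTHETICAL Likert-coded responses for illustration (5 = Extremely Useful
# ... 1 = Not at All Useful) -- not the survey data reported in the paper.
cohort1 = [5, 5, 4, 3, 3, 3, 2, 1]
cohort2 = [5, 5, 3, 3, 3, 3, 4, 2]

def one_way_anova_F(*groups):
    """F statistic for a single-factor (one-way) ANOVA:
    between-group mean square divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_vals)
    # Between-group sum of squares (df = number of groups - 1)
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = total observations - number of groups)
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

F = one_way_anova_F(cohort1, cohort2)  # ≈ 0.16 for these hypothetical responses
```

The p-value follows from comparing F against the F distribution with the stated degrees of freedom (e.g., via `scipy.stats.f_oneway`, which returns both F and p in one call).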
From the Cohort 1 responses, the percentage of students finding the instructional video Extremely Useful was equal to that finding it Moderately Useful (30.7%), while the percentages finding it Very Useful and Slightly Useful were also equal (15.4%); 7.7% of Cohort 1 students responded that they found the instructional video to be Not at All Useful. For Cohort 2, the highest response rate for the instructional video was for Very Useful (50%), with Moderately Useful next (25%) and Extremely Useful and Slightly Useful receiving the same response (12.5%). No students in Cohort 2 responded that they found the instructional video to be Not at All Useful. Most of the students found the instructional video to be at least a Moderately Useful contribution to their knowledge, and overall the contribution is similar to that of the prelab activities. Applying the same scale analysis as above, the single-factor ANOVA revealed no statistically significant difference between Cohort 1 and Cohort 2 in the perceived impact of the instructional video on knowledge (F = 0.09, p = 0.76). This would indicate that the students felt that the instructional videos had a positive impact on their experience.
Figure 3 also reports the responses to the question on the contribution of the online synchronous Q&A sessions with a teaching assistant to student knowledge: 39% of the students found the Q&A sessions to be Extremely Useful. The percentage of students finding the Q&A sessions Very Useful was equal to that finding them Slightly Useful (23%), and 15% of Cohort 1 students responded that they found the Q&A sessions to be Moderately Useful; no students reported that they found the Q&A sessions to be Not at All Useful. Overall, this provision was evaluated by Cohort 1 students more highly than the other two components (online prelab and instructional video). These findings are of interest to those who wish to design a distance learning model, though the author does not advocate removing in-person practical classes entirely, for the reasons outlined in the Introduction and discussed further below. It should be noted that this question was only put to Cohort 1 students, as with the return of in-person teaching there was no online Q&A session available for Cohort 2.
Figure 4 also reports the responses of Cohort 2 students to the question on the contribution of the practical skills assessment rubric to student knowledge. The percentages of students finding the skills assessment rubric Extremely Useful, Very Useful, Moderately Useful, and Slightly Useful were equal (25%), and no students reported that they found the skills assessment rubric to be Not at All Useful. Cohort 1 students were also asked to evaluate which of these components (if any) they would like to see retained even once face-to-face undergraduate practical classes returned, and the results are reported in Figure 5. The students were largely in favor of the retention of each of these components, albeit to varying degrees. The outcomes were the retention of the new online prelab activities and of the online instructional videos, to enable familiarization with apparatus and techniques prior to use. With the return of face-to-face teaching, there was no need to provide the online Q&A sessions; in any case, this was the component which scored lowest in terms of desire for retention (∼50%). However, given that a substantial number of respondents still wanted this retained, there was a renewed focus on engagement during the in-person practical classes, with the introduction of a laboratory performance mark associated with the nine experiments that did not have the practical skills assessment for subsequent cohorts (i.e., from AY2022/2023 onward). The rubric for this is included in Table S8, where one of the criteria specifically focuses on questions answered and asked by students. For these nine experiments, the laboratory performance component became worth 10% while the prelab remained worth 20%, with the corresponding postlab being worth 70%.
Figure 6 reports the responses of Cohort 2 students to the question on the comparison of the practical skills assessment experiment to the other experiments that they undertook. Equal percentages of students found the practical skills assessment experiment to be Much Better and Slightly Better (50%), with no students considering it to be Much Worse or Slightly Worse or to have No Difference. It is an encouraging result that the students are strongly in favor of this type of assessment, as is also evidenced by the answers to the open questions (vide infra). Figure 7 reports the Cohort 2 student responses to the question of whether they would like more of the experiments to be assessed in this manner, with 87.5% agreeing that this would be preferable.

Responses to Open Questions
The students of Cohort 1 also had an opportunity to respond to two open-ended questions on the virtual sessions: one question concerned what was most useful, and the other asked what further improvements could be made. The responses to these can be found in Tables 2 and 3, respectively.
It is of note that a range of views, positive and negative, was expressed, but this is understandable, since the process is always subjective and students' personal experiences and interpretations of content will always vary. Many of the comments in Table 2 were favorable, and there were certainly some useful comments; it is, however, important to consider the negative comments in more detail below.
"I found them lacking compared to physical practical classes in both understanding the topic the practical wants to reinforce, and the ability to talk to tutors face to face and query them is a much superior option."
This is an understandable response to a period of remote teaching, in that students would have preferred to experience the sessions in person. For the record, it is also the author's preference to conduct such sessions in person; it was not possible at the time due to local government restrictions. It has been reported elsewhere that during the pivot to emergency remote instruction, online practical classes were particularly suboptimal and were more of a passive observational experience.29 This was reflected more in Table 3, which reports the responses in terms of what could be improved.
Table 2. Cohort 1 Responses on What Was Most Useful about the Virtual Practical Classes
• They can be done at home
• The online Q&A
• I feel all the elements of the virtual practicals work well
• The teams sessions with the demonstrator were very useful. The demonstrators went into more detail in these online sessions than they sometimes did in person. This helped my understanding of the experiments greatly.
• None
• I think the instructional videos were very informative in being able to picture what is happening, especially when we cannot carry out the lab in person. Although if this is continued I feel they will still be very helpful as they aid in understanding the lab manual before carrying out the lab.
• The instructional videos were useful for referring back to when completing the postlab report. I believe they would be useful on Canvas whenever in person practical classes return.
• I found them lacking compared to physical practical classes in both understanding the topic the practical wants to reinforce, and the ability to talk to tutors face to face and query them is a much superior option.
• The discussion in the Q&A sessions made the postlabs much easier as I knew exactly what I needed to do

Table 3. Cohort 1 Responses to "What Could Be Improved about the Virtual Practical Classes?"
• The preparatory videos could be more detailed
• Providing a more detailed description of the analysis that should be included in the post lab. Given that a post lab could be submitted before learning information about that topic in class time, not discussing everything to be analyzed is handicapping students who submit work early.
• It would be very helpful to see the experiment taking place. In the instructional videos the equipment is explained well, but at no point can we see how the experiment actually looks. It can be difficult to visualize what is happening without seeing the experiment take place.
• Actually demonstrate the experiment, not just read the instructions at us.
• Ensure all lecturers know that no mark is awarded for the Q&A section
• Nothing
• Recording the Q&A session with the demonstrator would be useful but it is understandable that it is not done.
• Have the lab done by the instructors
• Better opportunities for the student to interact and have a chance to gain a semblance of practical knowledge. Having to work with a couple of preset values for calculations does not do much for practical knowledge.
• Maybe longer video of said demonstration if we're not physically allowed to do it?
• Perhaps a longer video of the whole process of each practical, although I would hope that once covid restrictions are lifted we would gain practical experience through in person sessions rather than virtual.

The perennial issue of lab session timetabling being out of sync with the taught content for some students was encountered; this occurs in normal years due to limited equipment, meaning that experiments are done on rotation, and thus some students may not yet have had lectures on the theory, though they are provided with sufficient background material.
For instance, the lab manual procedure section includes a step-by-step approach on how to undertake the data analysis, including equations and additional references. Students were also encouraged to seek clarification and guidance from the teaching assistants and to ask any specific questions in relation to the data analysis and postlab. This was one of the main purposes of the Q&A session. In terms of recording the Q&A sessions, this was not done, as previous research has reported that students were less likely to ask questions when a session was recorded, a view which associated teaching staff also observed.30 This had been discussed with students in advance, thereby explaining their acceptance and understanding of why this was the case.
In terms of the comments regarding the instructional videos, as explained in the Methodology, the intention was for these to be explanations of the apparatus and a talk-through of the procedure of the experiment. Thus, these videos should be concise, since it has been reported that videos lasting less than 6 minutes were more engaging than videos lasting more than 12 minutes.31 From that same study, student engagement decreases significantly after 6 minutes and again after 9 minutes, and student engagement time fell to 50% after 14 minutes.31 All instructional videos were therefore less than 6 minutes in duration. The videos did include all steps (one comment mentions the reading of instructions); as mentioned in the Methodology, to undertake the full experiment would have required changes to parameters (typically a repetitive process of changing, for example, a flow rate), each of which would have required equilibration time before making observations and recording results. Such a video would easily have exceeded 6 minutes and, most likely, 14 minutes, which would have made it less engaging.
Unfortunately, there seems to be some disconnect from the intended outcomes, in that some students expected more in terms of "practical" skills or more use of equipment; many chemical engineering experiments involve modifying settings and observing readings, so the major focus is data handling, interpretation/analysis, and communication. It is still good that such concerns were raised, as it indicates that those outcomes need to be better signposted. It is also an indication that other, more hands-on experiments and practical skills content should be considered in future. These were already considerations prior to the pandemic, but the nature of these comments reinforced the need to improve the student perception of practical skills. This led to the introduction of the practical skills assessment, which Cohort 2 experienced.
Cohort 2 also had an opportunity to respond to two open-ended questions on these practical skills sessions: one question concerned what was most useful, and the other asked what further improvements could be made. The responses to these can be found in Tables 4 and 5, respectively. For Cohort 2 there was a greater focus on practical skills and how to assess them. Of course, this was the first time this style of delivery was attempted at QUB, and there are improvements which can be made, most notably from the student evaluation that they would prefer to work in even smaller groups (it was done in groups of 3) and to have more time for the session. Both are reasonable expectations and can be accommodated during further rollout. With respect to the comment about remembering exact values: after graduation, chemical engineers working in industry would be expected to consult an operating procedure in advance of undertaking a task, and thus the intention of the practical skills assessment was to mimic this requirement and hence help prepare students for their future careers. This point will be reinforced in future academic years.
It is clear from the feedback, including the open questions, that students appreciated the more direct engagement with the teaching assistant. An important outcome of this for QUB (highlighted by Figure 7) is to try to replicate this style for other lab sessions to maintain these high levels of engagement, while being cognizant of the fact that it is also important to have some variety in the assessment.32 It was encouraging to note that the students also scored highly overall in the practical skills assessment. One of the most rewarding open comments from the student survey in Table 4 was: "There was more freedom to apply your knowledge. I really understood the LLE practical while doing it, whereas with others I usually only understood the concept after the practical." This was particularly pleasing, as it demonstrated and captured that the main aims of the changes were being fulfilled.

Table 4. Cohort 2 Responses to "What Did You Find Most Useful or Enjoyable about the Practical Skills Assessment for Liquid/Liquid Extraction?"
• Skills were assessed and marks were awarded fairly; in other laboratories this year we have had a quick demonstration with the equipment, were not able to use it ourselves, then asked if we had questions; most of the occasions we did not, yet were still significantly marked down for our lab performance, which I understand unfair. This was not the case for LLE.
• More weighting in marks for actual practical skills
• Due to the practical skills being assessed it forced us to prepare more beforehand for the lab and have a better understanding of the process. It also built confidence as we were completing the experiment independently.
• The lab itself was more directly engaging
• Much more interaction compared to other laboratories
• As the performance was weighted more, it took a lot of pressure off for the postlab and ensured that the content was fully understood before entering the lab.
• There was more freedom to apply your knowledge. I really understood the LLE practical while doing it, whereas with others I usually only understood the concept after the practical.
• More engagement with demonstrator

■ LIMITATIONS
The first limitation is the response rate of around 50% from both Cohort 1 and Cohort 2, so one must be aware that the responses do not represent the entire student body; even had the response rate been 100%, it would only be representative of the local population at QUB. It should also be noted that not all of those who responded to the survey replied to every question. It is a relatively small sample size (Cohort 1 was 52 students, and Cohort 2 was 53 students), reduced further by the 50% response rate. The evaluation also focused on a set number of experiments (10) that were already part of the curriculum and are likely to be different elsewhere, so the findings of this evaluation should not be generalized.33 In hindsight, the semantic differential scales applied in Figures 3 to 5 and Figure 7 could have been better designed to include more clearly defined polar adjectives and a neutral term. However, it would not be possible to revisit the impact of these interventions with the participants due to the passage of time, now more than 2 years for Cohort 1 and more than 1 year for Cohort 2. In any case, the participants responded to the survey using the defined statements as they were and stated their own opinions in relation to each resource.
Furthermore, these interventions were developed partly in response to periods of remote teaching, as well as the desire to update previous formats of delivery, and thus have not been tested with other populations not exposed to remote teaching and/or other institutions that already had different approaches to practical classes. The interventions, by design, were considered within the Chemical Engineering discipline and therefore may not readily transfer across all science or engineering subjects. Additionally, validity and reliability can only be established for the evaluations gathered during this iterative process, and therefore one must be cognizant of the need to reestablish reliability with every new data set when these or similar interventions are embedded.34

■ CONCLUSION
In this qualitative investigation of the impact of a change of approach on the student experience of practical classes, Cohort 1 students agreed that the generic approach to prelabs did not encourage them to fully prepare for the practical sessions. The newly introduced specific prelab quizzes were based on the experimental manual and relevant health and safety documents, and multiple-choice questions have been reported to be more equitable.27 Student perceptions of benefit and impact were obtained via an anonymous survey. The survey questions comprised five-point semantic differential scales and two open questions. In particular, students were asked to evaluate individual components (e.g., online quizzes, instructional videos). It was observed that most students found each component to be at least moderately useful or better, and more than 75% wanted the resources retained.
■ SUPPORTING INFORMATION
Original prelab, excerpt of exemplar revised prelab, grading rubrics for a sample postlab quiz, postlab reports, poster design, and practical skills assessments (PDF, DOCX)

Figure 3. Cohort 1 student perceptions on contributions of online prelabs, instructional video, and Q&A sessions to student knowledge.

Figure 4. Cohort 2 student perceptions on contributions of online prelabs, instructional video, and skills assessment rubric to student knowledge.

Figure 5.

Figure 6. Cohort 2 student comparison of the new format of lab (with practical skills assessment) compared to the standard format of other laboratories.

Figure 7. Cohort 2 student responses to being asked if they wanted more laboratories like the new format with practical skills assessment.

Table 1. Comparison of Resources and Assessment of Practical Classes Pre- and Postintervention

Table 2. Cohort 1 Responses to "What Did You Find Most Useful or Enjoyable about the Virtual Practical Classes?"

Table 4. Cohort 2 Responses to "What Did You Find Most Useful or Enjoyable about the Practical Skills Assessment for Liquid/Liquid Extraction?"

Table 5. Cohort 2 Responses to "What Could Be Improved about the Practical Skills Assessment for Liquid−Liquid Extraction?"
• Smaller groups if possible
• Nothing
• You should not be expected to remember exact values
• N/A
• Maybe give a bit more time to complete the assessment