Introduction

Remote and virtual instructional laboratories, commonly referred to as online laboratories, have become increasingly prevalent in engineering and science education. These laboratories enable students to conduct experiments remotely through the Internet, allowing them to operate equipment and perform tasks from a distance or use fully virtualized laboratory equipment (De Jong, Linn, & Zacharia, 2013; May, 2020; May et al., 2023). The user-friendly nature and inherent flexibility of these systems can foster positive and engaging learning experiences.

Online laboratory solutions have traditionally been viewed as an alternative option for student experimentation alongside in-class activities (De Jong et al., 2013; de la Torre et al., 2013; Faulconer & Gruss, 2018; Heradio et al., 2016; Hernández-de-Menéndez et al., 2019; Jahnke et al., 2010; Ma & Nickerson, 2006; Mackay & Fisher, 2012; May et al., 2016, 2019; Terkowsky et al., 2010; Toth et al., 2009; Wei et al., 2019). However, the global pandemic in 2020–2021 had a significant impact on laboratory-based learning (May et al., 2021; Reeves et al., 2021), forcing educational institutions to swiftly transition from face-to-face classes to online courses. Such a transition had posed particular challenges for laboratory courses even prior to COVID-19. During the pandemic, online laboratories became the sole option available to provide students with any form of laboratory activities. This unprecedented shift, building on existing research findings, acted as a catalyst for numerous new developments, approaches, and initiatives in this field.

This Special Issue (SI) Call aimed to gather new research findings related to both recent experiences during the COVID-19 pandemic and long-term research endeavors concerning online laboratories and virtual experimentation in educational contexts. As the co-editors of this Special Issue, however, we recognize the importance of placing COVID-19-related experiences within a broader framework that encompasses long-term considerations (Baker, Clarke-Midura, & Ocumpaugh, 2016; Makransky et al., 2020) and relates to existing research. In other words, the fact that scholars have engaged in technical developments, often under time pressure, and conducted on-the-spot classroom evaluations does not guarantee that all of these “new” online laboratory approaches will ultimately lead to significant innovations and research outcomes, despite their value during the unprecedented circumstances of the pandemic. Therefore, researchers and scholars from outside the domain of online learning research need to familiarize themselves with existing contemporary online learning research, which has been established for over 30 years in the educational technology and online learning scientific communities. This connection needs to be made now, post-pandemic, for many of the new online learning approaches, not only within the field of online laboratories. Furthermore, there is already a wealth of international scholarly work and numerous publications on online laboratory design, development efforts, and educational research, such as the works of Bortnik et al. (2017), Ekmekci and Gulacar (2015), Fang and Tajvidi (2018), and Zacharia and Constantinou (2008), to name just a few. At the same time, within the field of online laboratory research, much work still needs to be done in establishing online access to experimental equipment, developing simulations, and assembling the necessary technical infrastructure. In short, technical design has been the primary focus for many years, but this focus has evolved in recent years.

While the technical development of online laboratories has been continuously advancing and experienced a significant acceleration during the pandemic, the associated educational research has gained increasing attention. This is evident in the articles featured in this Special Issue. However, the field of online laboratory research still lacks a comprehensive understanding of the factors that determine how online laboratories truly facilitate learning or potentially impede educational processes. To enhance our knowledge in this domain, we sought contributions for this Special Issue that go beyond technological usability and encompass criteria such as learning effectiveness, efficiency in achieving educational goals, and overall quality of educational design. Therefore, the primary research question underlying this Special Issue and the presented articles was: “How can we measure or evaluate online-based educational laboratories, and what criteria should be addressed in future research endeavors?” Consequently, we aimed to solicit manuscripts focused on “online lab design and research” with a strong emphasis on the research perspective. The included articles not only present the design of online laboratories but also engage in comprehensive discussions of theoretically grounded research on the applications of respective online laboratories. It is our fundamental belief that attention should be directed not only toward the technological development of an online laboratory but also toward its integration into a broader teaching and learning ecosystem, encompassing technological, pedagogical, and social usability.

Contributing to this intricate discourse and presenting an alternative and adaptable method for educational research in the area of online laboratories and learning experience design, we delve into a sociotechnical-pedagogical perspective in the subsequent section. Sociotechnical-pedagogical heuristics serve as a valuable, efficient, and pragmatic approach for assessing the design of online learning environments, including online laboratories. Importantly, the utilization of heuristics in learning experience design is supported by educational research, thereby offering both evidence-based and practical viewpoints to the ongoing discussion (Sect. 2).

Lastly, this editorial provides a comprehensive overview of the papers that have been accepted for inclusion in this Special Issue (refer to Sect. 3).

Sociotechnical-pedagogical heuristics for learner experience design and online laboratory evaluation

Various approaches exist for evaluating the quality of digital educational technology, including online laboratories. One promising but not yet widely used approach is the use of heuristics. In a general sense, heuristics serve as a checklist or rule of thumb to identify significant issues and challenges in digital systems. They are commonly employed to assess the user-friendliness and usability of digital technologies, services, or products. Heuristics have found extensive application in fields such as software development, marketing, and human-computer interaction. Their primary focus lies in the user’s interaction with the technology. A heuristic-based evaluation involves applying a predefined set of items to a specific system in order to identify potential issues. The ultimate goal of this evaluation is to enhance the user experience with the technology by addressing and resolving the identified concerns.

Nielsen’s usability heuristics (Nielsen, 1995) have proven to be effective in identifying usability issues related to system ease of use, efficiency, error frequency, and error severity (Botella, Rusu, Rusu, & Quiñones, 2018; Khajouei et al., 2018). These methods primarily focus on the user’s interaction with digital systems and aim to enhance the user experience, which, in turn, can lead to improved learner engagement. However, evaluating the quality of online laboratories extends beyond the technological dimension. It is essential to consider other dimensions, such as pedagogical usability, when assessing their quality. Pedagogical usability relates to the quality of the learning experience and should be a central focus of evaluation (Lim & Lee, 2007; Moore et al., 2014; Silius & Tervakari, 2003). Notably, a prominent set of pedagogical heuristics has been developed by Nokelainen (2006).

In addition to the technical and pedagogical dimensions, Jahnke et al. (2021) have highlighted the significance of the social dimension in the design of positive and engaging learning experiences within digital environments. The empirical study conducted by these authors underscores the importance of considering the social aspect, as learning is inherently a social endeavor and is deeply intertwined with social group activities (Jahnke, 2015). The social dimension emphasizes the interactions between learners and diverse peers or instructors, recognizing that social relations with both instructors and peers play a vital role in the learning process. It is essential to acknowledge that social interactions and roles are equally crucial in fostering learner-centered approaches to education. Along the same lines, the framework of social, cognitive, and teaching presence by Garrison et al. (2003) reinforces the significance of the social dimension by illustrating how online platforms such as discussion boards and chat functionalities facilitate direct communication and enhance learner interaction. These features enable learners to engage actively with the learning content and with one another, contributing to a richer and more meaningful learning experience.

Finally, the social dimension also encompasses the broader socio-cultural context of learners, aiming to address diversity, equity, and inclusion. Moore and Tillberg-Webb (2023) presented a socio-technical framework for ethics in educational technology, arguing that this is more reflective of actual technology design and implementation processes, in which human choices and values greatly shape resulting designs and decisions. By considering the socio-cultural aspects, educational designers and practitioners can create inclusive learning environments that respect and value the diverse backgrounds, perspectives, and needs of learners. This holistic approach can also help to ensure that the learning experiences provided are accessible and meaningful for all learners, promoting a sense of belonging and fostering an inclusive educational environment. Socio-technical frameworks for technology, such as the Social Construction of Technology (Pinch & Bijker, 1984), Actor Network Theory (Dolwick, 2009; Fenwick & Edwards, 2010), and Cultural-Historical Activity Theory (Engeström, 1999), illuminate how technologies are both situated in social and cultural contexts that inform their development and implementation and shaped by human actors and decisions in social systems. Understanding technologies as social artifacts and processes leads to better centering humans-as-agents in the design and development process, which can foster design decisions and features that reflect human diversity and varying needs, enabling improved learning or use through responsive and flexible designs.

In conclusion, the quality of online education settings is contingent upon three crucial dimensions: technological, pedagogical, and social usability. In the context of online laboratories, this means that a technologically sound online laboratory is necessary but not sufficient on its own. It is imperative to situate online laboratories within the broader learning ecosystem, encompassing learning goals, student activity design, assessment methods, and the facilitation of social presence and interactions mediated through digital tools. Acknowledging the technological dimension ensures that the online laboratory functions effectively and efficiently, providing a seamless user experience. However, it is equally important to consider the pedagogical dimension, which involves aligning the online laboratory with intended learning outcomes, instructional strategies, and appropriate student activities. Additionally, the social dimension must not be overlooked, as fostering social presence and facilitating interactions among learners and instructors enhances engagement and collaboration within the online learning environment. By embracing all three dimensions, educational designers and practitioners can create comprehensive and effective online laboratory experiences. This integrated approach ensures that online labs are not isolated entities but rather integrated components within the broader educational context, optimizing their potential to support meaningful learning and enrich the overall learning experience.

Sociotechnical-pedagogical heuristics

The term “sociotechnical-pedagogical heuristics” was introduced by Jahnke and colleagues in their above-mentioned study published in 2021. That study (Jahnke et al., 2021) found that the existing heuristics proposed by Nielsen (1995) and Nokelainen (2006) are insufficient for identifying the most critical issues in digital environments. As a result, the study introduces a new set of heuristics that comprehensively covers the three dimensions of social, pedagogical, and technological aspects. The findings highlight the need to consider the interplay of social, pedagogical, and technological factors when evaluating digital environments. By incorporating heuristics that address these three dimensions, the researchers aim to provide a more comprehensive framework for detecting and addressing significant issues in digital environments. This new set of heuristics is intended to enhance the evaluation process and improve the overall quality of digital experiences in various contexts.

“Learning experience (LX) encompasses all aspects of a learner’s interaction with: (a) the digital technology/service/space; (b) the pedagogical components, such as course type, learning goals, learning activities, process-based assessment, and learner control; and (c) the social dimension, such as quality of communication, collaboration, sociality, social presence, and social interactivity” (Jahnke et al., 2021).

In their research, the authors compiled a comprehensive list of heuristics based on a review of 193 items found in the literature. Through their investigation, they formulated a set of 14 heuristics and compared them to the heuristics proposed by Nielsen (1995) and Nokelainen (2006). The findings indicated that the newly developed sociotechnical-pedagogical heuristics, abbreviated as STP heuristics, were more effective in identifying issues in online courses than the individual heuristics of Nielsen and Nokelainen.

Table 1 provides an overview of the 14 STP heuristics, which can be categorized into three dimensions: social (2 heuristics), technological (6 heuristics), and pedagogical (6 heuristics), as discussed earlier. For further exploration, a detailed description of the STP heuristic items is accessible online, offering interested readers more in-depth information on each heuristic: https://sites.google.com/view/stp-heuristics/redefined-14-stp-heuristics-final-set.

Table 1 Set of 14 STP heuristics (adapted from Jahnke et al., 2021)

Studying online laboratories with sociotechnical-pedagogical heuristics

The evaluation of online laboratories encompasses various methods, such as post-intervention student surveys, user experience research, and analysis of learner performance. The approach discussed in this editorial, however, involves the application of heuristics, a well-established methodology utilized in diverse research domains, including human-computer interaction. Before describing in more detail how to use these heuristics in the context of online laboratories, we first explain the concept of online laboratories itself, as we have used this term throughout this text without further explanation.

Online laboratories as an instructional tool

Broadly speaking, online laboratories offer educators new opportunities to design innovative and engaging learning experiences for students, surpassing the limitations of traditional in-class laboratory instruction. These online laboratories can be utilized in various ways to enhance student learning. One approach involves having students access a fully virtualized laboratory beforehand to familiarize themselves with experimental procedures prior to in-class activities. Alternatively, guided in-class laboratory sessions can be conducted to introduce students to the laboratory’s environment, teach necessary procedures, and develop a practical understanding of the exercises. Subsequently, students can leverage online laboratory capabilities to engage in self-designed and self-guided experiments that expand beyond the constraints of in-class activities, benefiting from the flexibility of online resources.

The existing literature discusses the affordances, advantages, and disadvantages of online laboratories in engineering curricula. For example, Faulconer and Gruss (2018) provide a comprehensive analysis of the pros and cons of online, remote, and distance science laboratory experiences. Additionally, several review papers and bibliometric analyses have examined online laboratories in engineering and STEM education, offering diverse perspectives on the scholarly discourse surrounding this topic (Brinson, 2015; Heradio et al., 2016; Nikolic et al., 2021).

While acknowledging the critical examination of online laboratory affordances in instructional settings, this Editorial does not specifically focus on that discussion (Bower, 2017). Instead, we discuss the use of the presented heuristics to evaluate online laboratory solutions and applications.

The importance of the instructor for the evaluation process

In contrast to conventional usability studies that primarily focus on user interaction with digital services or spaces, online laboratories present a distinct challenge due to their integration within a broader learning ecology. Unlike standalone websites, online laboratories are situated within a larger context that encompasses course design, learning management systems, and instructor engagement. Consequently, the application of sociotechnical-pedagogical heuristics in evaluating online laboratories requires a collaborative approach that also involves instructors. This can be achieved through asynchronous communication, where evaluators formulate probing questions and engage in dialogue with instructors. By soliciting feedback from instructors, evaluators can address any uncertainties or ambiguities that may arise during the evaluation process. This reflective conversation facilitates a deeper understanding of instructors’ perspectives, enabling a more comprehensive assessment of the usability and efficacy of the online laboratory within the broader learning ecosystem.

In the next section, we will discuss the application of the sociotechnical-pedagogical heuristics to assess one exemplary online laboratory.

Exemplary application of sociotechnical-pedagogical heuristics to evaluate an online laboratory

For the following example, we employed the 14 STP heuristics (see Table 1), as discussed above, to assess the usability and effectiveness of a fully virtualized online laboratory implemented in an undergraduate course in the field of bioengineering at the College of Engineering of the University of Georgia. This online laboratory was integrated into a traditional face-to-face course within the respective undergraduate program. In practice, this meant that the instructor followed a conventional in-person teaching format but utilized the online laboratory for out-of-class assignments. The examined online laboratory is a fully virtual laboratory simulation for tissue engineering developed by the company Labster. The application of the STP heuristics allowed us to examine the user experience, pedagogical effectiveness, and technological aspects of this online laboratory environment. By conducting this evaluation, we aimed to gain insights into the strengths and limitations of this virtual laboratory in supporting student learning and engagement in the context of the given course.

The tissue engineering online simulation

This online laboratory is an online simulation that focuses on the synthesis of tissue for cartilage replacement. Developed by the company Labster, this desktop-based laboratory simulation provides students with a virtual environment where they can learn and apply the necessary elements of tissue engineering. The simulation presents a realistic scenario in which students are tasked with assisting in the treatment of an injured athlete. In the virtual laboratory, students are required to develop a scaffold that can aid in the regeneration of the athlete’s cartilage. Drawing on their knowledge of chemistry and material science, students engage in two crosslinking experiments: the ionic and Michael addition methods. They are given the opportunity to select the polymers they wish to use in these experiments. Upon completion of each experiment, students observe the chemical or physical reactions occurring within the hardening hydrogels. Figure 1 provides a visual representation of the laboratory environment, demonstrating the immersive experience that students engage in using their personal computers. Through this online simulation, students can actively participate in the process of tissue engineering and gain practical insights into the field.

The Labster tissue engineering lab was integrated into a graduate tissue engineering course at the University of Georgia in 2020. It was the first time the lab was introduced in this particular course, which traditionally did not include any hands-on laboratory component. The online lab served as an additional component to the regular course content, providing students with practical experience. As part of a homework assignment, students were required to engage with the virtual laboratory and complete the assigned tasks. Their participation in the online lab activity was incentivized through the allocation of extra points. Prior to the study conducted for this research, the simulation had been used in two separate fall semester courses, offering valuable insights and feedback for its implementation.

Fig. 1 Screenshot of the tissue engineering laboratory simulation (developed by Labster) used in the presented study

Data collection and analysis

To collect data, the research team utilized a shared Google document containing the 14 STP heuristics and their respective subitems. The data collection process involved a two-step approach. In the first step, an expert discussion was conducted between the two researchers who authored this editorial. One researcher served as the heuristics expert, while the other was the expert on online laboratories. The heuristics expert described each item, and the laboratory expert determined whether the specific item was represented in the online laboratory or its associated activity. During this initial round of applying the heuristics, the research team observed that asking questions instead of presenting a list of items was more easily understood by individuals unfamiliar with the heuristics. In response, the team decided to transform all 195 items into questions and proceeded to use the questions in the data collection process.

The second step included the instructor, who had recently used the tissue engineering online laboratory with students. The research team converted the document into a table format, allowing the instructor to view all items, along with the corresponding questions, and provide answers independently and asynchronously. The table had columns for the original item, the related question, and the instructor’s responses (including a brief description, a clear answer indicating whether there were minor or major issues, and optional additional comments). An example of such a table is provided in Fig. 2 for the “social presence” heuristic.
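To make this format concrete, the following is a minimal sketch (in Python) of how one row of such an evaluation table could be represented programmatically; the HeuristicItem class, its field names, and the example entry are hypothetical illustrations for this editorial, not artifacts from the actual study materials.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeuristicItem:
    """One row of the instructor-facing evaluation table (step two)."""
    heuristic: str                  # e.g., "Social presence"
    dimension: str                  # "social", "technological", or "pedagogical"
    item: str                       # the original heuristic item
    question: str                   # the item reframed as a question
    answer: Optional[str] = None    # "yes" or "no"; "no" indicates an issue
    severity: Optional[str] = None  # "minor" or "major", if an issue was found
    comments: str = ""              # optional free-text remarks

# Hypothetical example row for the "social presence" heuristic
row = HeuristicItem(
    heuristic="Social presence",
    dimension="social",
    item="Instructor plays different roles (e.g., expert, mentor, coach).",
    question="Does the instructor play different roles (e.g., expert, mentor, coach)?",
    answer="no",
    severity="major",
    comments="The lab is designed as a purely individual activity.",
)
```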

Fig. 2 Example items, including questions, for the “social presence” heuristic

Once the data collection was complete, the research team performed the data analysis by consolidating the results from both steps. The team counted the total number of issues found for each sub-heuristic, focusing on the instances where the answer was “no” (indicating the presence of an issue). These issues were further categorized as minor or major based on their impact on the learning experience design. The team examined the major issues in detail to ensure a comprehensive analysis. It is important to note that the analysis considered not only the laboratory activity itself but also the overall course design in which the online laboratory was embedded.
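For illustration only, the sketch below tallies the “no” answers per heuristic and splits them into minor and major issues; it assumes rows shaped like the hypothetical HeuristicItem above and is not the team’s actual analysis script.

```python
from collections import Counter

def tally_issues(rows):
    """Count issues ("no" answers) per heuristic, split by severity."""
    minor, major = Counter(), Counter()
    for r in rows:
        if r.answer != "no":
            continue  # "yes" means no issue was found for this item
        bucket = major if r.severity == "major" else minor
        bucket[r.heuristic] += 1
    return minor, major

# Using the hypothetical example row from the sketch above
minor, major = tally_issues([row])
print("Major issues per heuristic:", dict(major))
# -> Major issues per heuristic: {'Social presence': 1}
```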

Results

In total, 19 major issues were detected across seven different STP heuristics for the tissue engineering virtual laboratory: eight major issues of social usability, six of technological usability, and five of pedagogical usability (see Table 2). In the following, we briefly discuss the detected issues under the respective heuristic items.

Table 2 Results for the 14 STP heuristics used to examine the tissue engineering laboratory

Heuristic 1: social presence (social usability)

Heuristic items:

  • The course provides learners with opportunities to access extended feedback from instructors, experts, peers, or others through e-mail or other Internet communications.

  • Instructor plays different roles (e.g., expert, mentor, coach, learning-companion).

Description and discussion of the detected issue: The analysis of the collected data revealed an issue concerning social learning and the connectivity between learners. Specifically, it was observed that the inclusion of the virtual laboratory in the overall course design lacked opportunities for peer-to-peer or peer-to-instructor interaction. The virtual laboratory was designed as an individual activity, with the only available feedback being provided by automated systems. According to the STP heuristics, a stronger emphasis on interaction with other learners or the instructor would have been more beneficial in fostering a social learning environment. This finding highlights the importance of critically examining the integration of virtual laboratories within the broader course design to ensure the inclusion of effective social learning elements.

Heuristic 2: (group) activities (social usability)

Heuristic items:

  • Learning activities are active and facilitate engagement via learner-content, learner-learner, and learner-instructor interactions.

  • Activities promote active reflection, collaboration, discussion, and real-world engagement.

  • Course tools promote student engagement and active learning by facilitating interactions with the instructor, course materials, and other learners.

  • Activities engage students in higher-level thinking skills, including critical and creative thinking, analysis, and problem-solving.

  • A wide variety of learning strategies may have to be employed, including memorization, direct instruction, drill-and-practice, deduction, and induction.

  • Learners can make decisions about what sections/content to study through interactive material.

Description and discussion of the detected issue: The analysis of the data showed that one particular heuristic was effective in identifying a significant number of issues. One of the identified issues related to the lack of group interaction, which aligns with a previous observation. It is worth noting that the occurrence of similar issues across different heuristics is not problematic, as the purpose of these heuristics is to comprehensively identify potential issues rather than avoid overlap. Furthermore, the online laboratory itself was found to present limitations in terms of learner engagement in higher-level thinking activities and the application of diverse learning strategies. The laboratory provided a predefined experimentation activity with detailed guidance from the system, which constrained the learner’s ability to engage in independent and exploratory learning. The rigid structure of the laboratory restricted flexibility and opportunities for self-guided experimentation. This constraint also raised concerns regarding the learner’s decision-making opportunities within the laboratory setting. These findings shed light on the need to critically evaluate the level of learner autonomy and decision-making within the laboratory experience.

Heuristic 3: ease of use (technological usability)

Heuristic item:

  • The system allows the learner to leave whenever desired and easily return to the closest logical point in the system.

Description and discussion of the detected issue: One of the identified issues was related to the inability to save progress and resume the activity at a later time. The online laboratory required the activity to be completed in a single session without the option to save progress. This limitation hindered the learners’ flexibility and convenience, as they were unable to pause the activity and return to it at a more convenient time. This finding highlights the importance of providing learners with the option to save their progress and resume the activity at their own pace, promoting a more learner-centered and flexible experience.

Heuristic 5: ecosystem (technological usability)

Heuristic items:

  • If the course includes links to external resources, the links are kept up to date.

  • The syllabus should be listed on the main navigation menu for quick access to it.

  • The course provides access to all resources necessary to support effective learning.

Description and discussion of the detected issue: The identified items shed light on the lack of connection between the laboratory activity and the overall course design. The laboratory activity was treated as an isolated component, separate from other relevant course resources. Students were expected to independently establish connections between the laboratory activity and the rest of the course materials without explicit guidance or support. This disconnected approach may have hindered students’ ability to fully integrate and apply their knowledge from the laboratory activity within the broader context of the course. Creating a more cohesive and interconnected learning experience by aligning the laboratory activity with other course resources could enhance students’ understanding and application of the acquired knowledge.

Heuristic 8: accessibility (technological usability)

Heuristic items:

  • Images and graphics contain alternate text or descriptive captions.

  • For accessibility, provide a means for the learner to access the text of the narration.

Description and discussion of the detected issue: The identified items highlighted the issue of limited accessibility, particularly for learners with visual impairments. The virtual laboratory environment did not adequately accommodate the needs of visually impaired learners, making it difficult for them to fully engage with the content and navigate the interface effectively. Additionally, the absence of descriptive captions within the virtual environment further compounded the accessibility challenges. Descriptive captions can provide important visual information in alternative formats, enabling learners with visual impairments to access and comprehend the content more effectively. Enhancing the accessibility features within the virtual laboratory, such as providing alternative text descriptions and implementing descriptive captions, would contribute to a more inclusive learning experience for all learners, regardless of their visual abilities.

Heuristic 10: material delivery/organization (pedagogical usability)

Heuristic items:

  • Information and instructions are provided regarding how the tools support the learning objectives or competencies.

  • Information and instructions are provided regarding how the tools required/recommended for use in the course support the learning objectives or competencies.

Description and discussion of the detected issue: The identified items underscored a lack of alignment between the laboratory activity and the overall course design and learning objectives. While the content of the laboratory activity itself was relevant to the concepts covered in the course, there were minimal efforts made to establish strong connections between the specific learning objectives of the laboratory and the broader goals of the course. The laboratory activity appeared somewhat disconnected from the larger context of the course, which may have limited its effectiveness in reinforcing and enhancing the overall learning outcomes. Establishing clearer links between the laboratory’s learning objectives and the course goals would help ensure a more cohesive and integrated learning experience, enabling students to better grasp the practical applications of the course concepts within the laboratory setting.

Heuristic 11: assessment (pedagogical usability)

Heuristic items:

  • Wherever appropriate, higher order assessments (e.g., analysis, synthesis, and evaluation) are provided rather than lower order assessments (e.g., recall and recognition).

  • Teacher gives active, specific, and consistent feedback and feedforward to student learning progress.

  • Assessment provides sufficient feedback to the learner.

Description and discussion of the detected issue: The identified items shed light on assessment-related issues from multiple perspectives. Firstly, in terms of the experimentation activity, the laboratory lacked higher-order assessments that could challenge students to engage in critical thinking and demonstrate a deeper understanding of the subject matter. The assessments provided within the laboratory were limited in their scope and did not effectively promote higher-level cognitive skills. Additionally, the absence of teacher interaction as part of the online laboratory activity resulted in a lack of specific teacher feedback in the form of assessment. Students relied solely on the generic feedback provided by the online laboratory, which might not be tailored to their individual needs or support personalized learning. The feedback provided within the laboratory environment did not offer the level of specificity and relevance that could contribute to a more effective learning experience. These assessment-related issues highlight the importance of incorporating diverse and meaningful assessments that go beyond basic tasks and allow students to demonstrate higher-order thinking skills. Furthermore, establishing opportunities for teacher interaction and personalized feedback within the online laboratory setting can enhance the assessment process and provide students with more targeted guidance and support.

Discussion

The results of the above-described application of sociotechnical-pedagogical heuristics in the context of online laboratories indicate that STP heuristics are a useful approach for detecting major issues of online laboratories related to social, technological, and pedagogical usability. The application of the 14 sociotechnical-pedagogical heuristics to the learning experience design of online laboratories proved valuable in detecting potential major issues. Through the systematic procedure described in this example, we were able to pinpoint specific issues that had previously only been suspected on the basis of anecdotal evidence. This highlights the effectiveness of the STP heuristics in providing a structured framework for evaluation. Applying the 195 items as questions, rather than as standalone items, proved highly relevant in facilitating the instructor’s understanding of usability aspects. Framing the items as questions provided a clearer context and made it easier for the instructor to comprehend and respond to the evaluation criteria, ensuring that the evaluators fully grasped the intended meaning of each item. Involving the instructor in this collaborative way also helped reveal aspects that might have been overlooked in an individual evaluation and contributed to a more robust evaluation process.

The application of the heuristics also revealed areas for improvement in the online laboratory that had not been considered before. This study served as an initial exploration of the design efforts needed to enhance the learner experience, and its findings offer valuable insights and direction for future design enhancements. The STP heuristics can also serve as a tool for formative evaluation: by applying them, designers and instructors can continuously assess and improve the online laboratory experience. The heuristics not only help identify existing issues but also guide the formative evaluation process, enabling iterative design improvements and ongoing refinement of the learner experience.

While this exemplary application of heuristics provided valuable insights into the evaluation of online laboratories, further work is needed to broaden the scope and assess a wider range of online labs. Conducting evaluations on a larger scale and encompassing different types of online laboratories would provide a more comprehensive understanding of the strengths and limitations of the heuristics in various contexts. This would contribute to the refinement and adaptation of the evaluation approach, ensuring its applicability to a broader range of online labs and facilitating continuous improvement in the field.

After this excursion into a specific and not yet widely used tool for assessing and evaluating online laboratories, we now return to this Special Issue’s specific content. In the next section, we briefly describe the articles included in this issue.

Special issue results and lessons learned

The articles included in this Special Issue center around the advancements in pedagogy and instruction, instructional design, and learning design & technologies within the realm of online or virtual laboratories. Each article presents findings and conclusions based on empirical investigations, offering valuable insights into effective practices, methodologies, and technologies in this domain:

Harish Thampy, Sarah Collins, Elora Baishnab, Jess Grundy, Kurt Wilson, and Timothy Cappelli focus in their article “Virtual clinical assessment in medical education: an investigation of online conference technology” on online laboratories in medical education. Due to the COVID-19 pandemic, medical education institutions had to adapt their clinical assessments to comply with social distancing measures. The research team conducted a study to evaluate the effectiveness of an online virtual Objective Structured Clinical Examination (OSCE) as a potential solution. Their work involves a qualitative analysis of decision-making processes, consultations, and perspectives of key stakeholders (students, examiners, simulated patients, and faculty staff). The study provides guidance for future online assessment applications, highlights effective practices, and sets the stage for further research on technology in healthcare education and practice.

The study by Eunbyul Yang, Sanghoon Park, Jeeheon Ryu, and Taehyeong Lim, titled “How does Dental Students’ expertise influence their clinical performance and Perceived Task load in a virtual Dental Lab?”, also addresses medical education. Their work aimed to introduce a virtual dental lab for supporting virtual clinical examinations in a dentistry program and to investigate the influence of students’ expertise levels on their clinical performance and perceived task load in the virtual dental lab. A total of 93 students participated and were divided into groups based on their expertise levels. They performed virtual reality simulation tasks related to dental caries detection and diagnosis. The outcome variables included clinical examination performance (total dwell time and examination time) and perceived task load. The results indicated that expertise level had a significant impact on examination performance, except for anterior maxillary teeth.

In their article “A multimodal analysis of college students’ collaborative problem solving in virtual experimentation activities: a perspective of cognitive load”, Xu Du, Miao Dai, Hengtao Tang, Jui-Long Hung, Hao Li, and Jinqiu Zheng discuss online education in the context of engineering education. According to the authors, the effectiveness of online courses, particularly engineering courses with experimentation activities, is still a subject of debate. One of the main challenges is fostering collaborative problem-solving skills for novice students, as online collaboration can increase their cognitive load. To address this issue, their research focused on understanding novice engineering students’ cognitive load and its impact on their performance in collaborative problem-solving during virtual experimentation activities. The study aimed to provide a detailed and multimodal perspective on how cognitive load influences student performance.

“Exploring collaborative problem-solving in virtual laboratories: a perspective of socially shared metacognition” is the second article by Tang in this issue. Here, the author group Hengtao Tang, Okan Arslan, Wanli Xing, and Tugba Kamali-Arslantas discusses socially shared metacognition. According to the authors, previous studies have examined the isolated effects of each of its dimensions on problem-solving, but a comprehensive understanding is still lacking. Their study utilized learning analytics techniques to gain insights into socially shared metacognition during collaborative problem-solving in virtual laboratories. The analysis revealed four distinct clusters. Statistical analysis was then conducted to explore the relationship between the clusters and the outcome of collaborative problem-solving, as well as the difficulty level of problems. The findings of this study have theoretical implications for advancing our understanding of socially shared metacognition in virtual laboratory settings.

In “Remote labs in higher engineering education: engaging students with active learning pedagogy”, Antoine Van den Beemt, Suzanne Groothuijsen, Leyla Ozkan, and Will Hendrix turn the focus to remote laboratories and student engagement. They argue that limited research exists on effective pedagogies for fostering engagement in remote labs. Their paper aims to explore how an active learning pedagogy in remote labs supports student engagement in higher engineering education, with the ultimate goal of enhancing students’ ability to transfer knowledge from theory to practice. Findings indicate that remote labs, which offer flexibility in terms of time and location, require students to regulate their learning and schedule experiments independently. However, open-ended lab assignments promote engagement by creating a sense of curiosity and the need to acquire knowledge. The study also highlights the importance of structured arrangements for lab assignments, teamwork to support peer learning and discussion, progress meetings focused on feedback and formative assessment, and reflective reports.

Davy Tsz Kit Ng, Jiahong Su, and Ross Chi Wui Ng turn to an area of education that is otherwise not often discussed. In “Fostering non-aviation undergraduates’ aviation literacy in an online aviation laboratory: effects on students’ perceptions, motivation, industry optimism”, the authors provide a study that examines the learning perceptions of university students who took part in a series of online aviation career exploration activities during the pandemic in Hong Kong and China. These activities included virtual visits, career talks by aviation professionals, hands-on flight simulation exercises, and online discussions in a virtual lab environment. A mixed-methods approach was utilized, incorporating a motivational survey, teacher observations, and semi-structured interviews to understand students’ perceptions of their learning experiences. The findings indicate that engaging in the flying laboratory activities effectively motivated students to learn about aviation and enhance their knowledge in the field. The article provides recommendations for online engineering educators to leverage emerging technologies in teaching aviation for future career readiness.

As the articles in this Special Issue show, the development of online laboratories has seen significant progress, especially during the pandemic. However, research on the educational aspects of online laboratories still lacks a comprehensive understanding. This Special Issue aimed to address this gap by focusing on factors that contribute to effective learning and educational design beyond technological usability. Along these lines, the included articles not only discuss the design of online laboratories but also provide theoretically grounded research on their applications. Our goal with this Special Issue and this editorial was to emphasize the integration of online laboratories within a broader teaching and learning ecosystem that considers technological, pedagogical, and social aspects. This task is complex in nature and benefits from the integration of diverse perspectives from different educational fields as well as research approaches. We hope that you as a reader benefit from this issue’s content and that it provides valuable insights for your own research and practice.