Article

The Construction of an Evaluation Index System for Assistive Teaching Robots Aimed at Sustainable Learning

1 Faculty of Education, Fujian Normal University, Fuzhou 350117, China
2 School of Economics and Management, Xiamen University of Technology, Xiamen 361024, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sustainability 2023, 15(17), 13196; https://doi.org/10.3390/su151713196
Submission received: 3 August 2023 / Revised: 29 August 2023 / Accepted: 30 August 2023 / Published: 1 September 2023
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

A typical example of a robot used in education is the assistive teaching robot, which has considerable potential to enhance teaching and learning and to promote sustainable learning. However, formalized selection and evaluation procedures for robotic teaching assistants are lacking. To address this need, this paper presents a function evaluation system framework for assistive teaching robots comprising four dimensions: system structure, appearance interface, teaching function, and auxiliary support. The DANP method was employed to examine the extent of influence of the various indicators and to determine the critical components of the function evaluation system for assistive teaching robots. The analysis identified two crucial factors in this evaluation system: teaching function and auxiliary support, which are also key elements in promoting sustainable learning. Moreover, recommendations are made for designing and selecting suitable assistive teaching robot products, aiming to serve as a reference framework for future product development and for implementing educational activities within school settings, thereby further contributing to the realization of sustainable learning.

1. Introduction

With the arrival of ChatGPT 4.0, people have felt the power and convenience of natural language processing technology. The popularity and advancement of artificial intelligence technology have been significantly aided by the open availability and promotion of ChatGPT [1]. As a large language model based on deep learning technology, ChatGPT is regarded as a significant advancement in artificial intelligence techniques. Moreover, since robots belong to the broad realm of AI technology applications, they naturally benefit from this technological progress [2]. Robots designed for structured, hands-on disciplines such as Science, Technology, Engineering, and Mathematics (STEM), with the aim of developing students’ analytical, creative, and practical skills, are called educational robots. These robots have the qualities of teaching applicability, openness, scalability, and friendly human–computer interaction [3]. Their introduction into educational settings that focus on structured learning and the development of practical skills can contribute substantially to the development and transformation of education [4] and also help promote the achievement of sustainable learning [5]. For example, in English classes, robots are used to help students practice speaking, listening, and grammar, and to provide personalized learning advice [6]; Cozmo robots, designed by Anki, provide students with an open programming environment [7]; and Google Classroom has begun to use educational robotics to provide students with online learning tools and resources [8]. These robots can provide a more personalized learning experience by automatically adapting the learning material and level of difficulty to students’ performance and needs as they interact with them. This approach not only makes it easier for students to acquire knowledge but also enhances their ability to learn independently and continuously. Social education has long promoted a commitment to developing people’s willingness and ability to learn, so that they can continually adapt to new demands for knowledge and skills and improve themselves. The integration of technology through artificial intelligence can bring new opportunities and challenges to the education field, which focuses on structured learning and developing practical skills [9], offering new possibilities for the development of sustainable learning and helping society to support the growth of the younger generation.
At the same time, it is important to note that while educational robots can enhance students’ interest and ability to learn, they cannot completely replace the human interaction and emotional communication provided by human teachers [10]. Educational robots excel at providing personalized learning support, helping students understand concepts, answering questions, and automating assessment and feedback. However, the scenarios in which robots can be used are limited when it comes to complex emotional support, ethics and moral education, and other domains [10]. These domains typically require deep human interaction and emotional communication from human teachers, as well as understanding and respect for individual differences.
Since they make up the majority of robots used in educational applications, assistive teaching robots receive most of the attention from users and manufacturers [11]. Assistive teaching robots are robots that assist teachers with classroom support or repetitive tasks [12]. Humanoid robots such as NAO, SAYA, Bioloid, ASIMO, Eddie, and RoboTutor play a significant role as teaching assistants in academic environments [13,14]. At present, robots come in many different varieties, with a wide range of products on the market and very different areas of application. From the point of view of robot manufacturing, although the market is flourishing, there is obvious homogenization in underlying technology, design, functions, and so on [15,16], which leaves users wondering which robot to use in teaching practice or tempted simply to follow the trends. As a result, the market’s diverse needs cannot be met effectively, which hampers the development and proper use of educational robots and does little to enhance learners’ capacity for sustainable learning. More options are available due to the growing use of assistive teaching robots in education today [17]. While the use of educational robots in relatively structured and explicit educational environments, such as STEM fields and basic subjects, can have a positive impact, we still need to be aware of their limitations. When evaluating the use of educational robots, we need to focus on both their positive and potential negative impacts. An evaluation framework that supports efficient development and assessment is therefore needed to guide selection. However, the current literature lacks a framework with broad coverage and high applicability, and its functional evaluation criteria remain fragmented, so this issue has not been fully addressed [12,18,19].
To close this research gap, this study proposes a functional evaluation framework with broad applicability. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) and the Analytic Network Process (ANP), combined as the DANP method, form this study’s hybrid multi-criteria decision-making (MCDM) model for capturing the interdependencies between evaluation dimensions and criteria [20,21,22]. The use of robots in the classroom has been studied before: the NAO robot has been used to help children learn English [23], Zhexenova et al. (2020) [24] found that students’ writing skills and confidence improved after interacting with the CoWriter robot, and other studies have introduced the SAYA robot into the classroom to support a teleclassroom system [25]. In general, these studies have focused on specific applications, and there is still a lack of methods to support their development and evaluation. At the same time, because continuous learning is key to personal growth and development and is closely related to the sustainable development of society and the long-term goals of social education, the ability of educational robots to promote continuous learning has become a major concern. It is therefore necessary to organize and develop a set of functional evaluation metrics to direct the development of assistive teaching robot products and to guide the selection of robots in schools.
The following sections make up the remaining portions of this study. Section 2 describes the creation of the evaluation metrics and reviews the pertinent literature. Section 3 explains how the influential network relationship map (INRM) was derived and how criterion weights were generated using the DANP method. The data are analyzed in Section 4. The relevant discussion is found in Section 5, and some concluding remarks are presented in Section 6.

2. Literature Review

The advent of the digital era and developments in information technology and engineering science have led to the widespread use of robots and related products in various industries. Robots can perform tedious, complicated, and even dangerous tasks more accurately than people [26]. In recent years, robots have also begun to be used progressively in education [27]. Robotics and its related components and products intersect with the sustainable development of education [28]: by providing personalized learning support [8], assessment of learning outcomes, and massive learning resources [29,30], robots improve students’ learning effectiveness within a shorter period of time [31], thereby actively promoting the sustainable development of education across society. Correspondingly, improvements in educational effectiveness give impetus to the continuous iteration and rapid development of robotics [32]. These two elements mutually stimulate and reinforce each other, creating a symbiotic relationship that drives progress.

2.1. Current Status of Research

Educational robots are intelligent systems or agents that integrate information technology and innovative technologies such as computing, sensing, networking, and artificial intelligence to achieve digital modeling and computation of knowledge systems, educational participants, educational scenarios, and educational processes [33]. With the ability to assist with teaching, manage teaching, deliver teaching, and even lead teaching [34], educational robots can create an engaging and interactive learning environment by offering students intriguing activities and unique opportunities for class participation [35]. Interacting with educational robots can meet the different learning needs of students, as the robots can provide personalized educational content and support [8]. As a result, educational robots are seen as a valuable tool for raising students’ motivation, interest, and academic performance [36].
With the rapid development of artificial intelligence and natural language processing technologies, educational robots in teaching can be broadly classified into three categories: assistive teaching robots, companion robots, and teaching tools [37].
Robots can perform on a par with or even better than humans in perception, memory, arithmetic, continuous work, and concentration [38]. By storing student learning data in the system, robots can pace teaching according to a student’s proficiency, helping human teachers with labor-intensive repetitive tasks. Robots are thus most frequently used as teaching assistants in educational settings. Using assistive robots in the teaching and learning process helps to take over some of the work teachers typically do. When we look at students’ access to knowledge, it is easy to see that the opportunities for one-to-one interaction between students and teachers are quite limited in the traditional classroom environment. This limitation restricts students’ individuality and their ability to develop a deeper understanding of knowledge. In a class with many students, the teacher has to cover a large amount of content in a limited amount of time, making it difficult to provide enough individualized instruction and interaction for each student. Assistive teaching robots can help alleviate teacher shortages [38]. According to Robaczewski and other scholars, NAO robots have been widely employed in education and can aid novices in learning through several built-in programs [39]. They also observed that NAO robots possess motor, functional, and emotional qualities that enable them to perform well as assistive robots. Some researchers used the android robot SAYA and the Madatech robot to compare the effectiveness of robot-assisted teaching within the same course [40]; they ultimately discovered that, with the help of the robot, most students could actively participate in learning activities and interact with the robot assistants throughout the course. By employing the ASIMO robot to teach storytelling to students, Costa and other scholars discovered that the humanoid robot’s gaze behavior significantly aided the development of the students’ narrative abilities [41]. Some scholars use robots as a tool to assist classroom teaching, using the robot’s speech recognition and facial expression recognition functions to help students and teachers complete classroom interactions.
In education, the nature of teaching and learning is much more than the transfer of knowledge. One of the aims of education is to develop students’ generic skills, which include critical thinking, creative thinking, problem-solving [42], and a sense of self-directed learning. Although robots excel in perception, memory, persistence, and concentration, they rarely cover all dimensions of education, such as emotions, morality, and social interaction [43]. This is precisely why robots can only be used as tools to assist teachers in the teaching and learning process, where teachers are able to impart human values, inspire students to think creatively, and guide them to become socially responsible citizens. Assistive teaching robots support teachers in terms of knowledge transfer and practice correction, but their greater significance lies in freeing up teachers’ time and energy to focus on those aspects of education that require the involvement of human emotion, intelligence, and flexibility. Robots do not exist to replace teachers [44], but rather to play to their strengths in education and provide more support to teachers, enabling them to guide the development of their students in a more profound way.
Meanwhile, in modern society, knowledge is updated at a rapid pace and students need to have the ability to learn continuously and adapt to changes [45]. Assistive teaching robots help to develop students’ sense of independent learning and continuous learning ability through personalized learning plans, continuous learning tracking, and resource support [46]. This is a gradual process toward lifelong learning, enabling students to adapt to future changes and challenges. Therefore, the use of assistive teaching robots in education is not only to improve efficiency, but also to better meet the diversity of education and the comprehensive needs of students. As part of educational technology, they work with teachers to create a more dynamic and interactive learning environment, helping to cultivate future talents with comprehensive skills.
While robotics facilitates sustainable learning [47], the number of situations in which assistive teaching robots are used in the classroom is growing, necessitating suitable methods to create and assess these robots [19]. Research on systems for evaluating robot functions offers little information regarding the options for assistive teaching robots; most of the existing work concentrates on industrial robots [48]. Therefore, it is hoped that, by drawing on the development of evaluation systems for industrial robots, an evaluation system that is practical and appropriate for evaluating educational robots can be explored.
It is not easy to choose the best industrial robot because of the increasing complexity of robotic systems and the growing diversity of robots with varied roles, features, and specifications on the market [49]. Most selection techniques for creating functional evaluation metrics in the industrial robot literature use the Multi-Criteria Decision-Making (MCDM) model. MCDM refers to decision situations with a finite or infinite set of alternatives and conflicting criteria that cannot all be satisfied at once; the fundamental goal of this strategy is to arrive at a consensus-based value judgment [50,51]. The Analytic Hierarchy Process (AHP) is among the most well-known and significant multi-criteria techniques. Numerous factors, including positioning accuracy, cost, flexibility, load capacity, the human–machine interface, and vendor service quality [48], must be taken into account during industrial robot selection [52,53]. Therefore, the selection of industrial robots can generally be considered a multi-criteria decision problem, and manufacturers and business organizations can use the MCDM approach to resolve various robot selection and evaluation problems [54]. AHP, one of the most widely used MCDM techniques, was first put forth by Saaty (1980) [55] for assessing, rating, ranking, and evaluating decision alternatives. The fundamental steps of the AHP method include laying out a complex decision problem as a hierarchy, using pairwise comparison techniques to estimate the relative importance of each element at each level, and finally integrating these judgments to create an overall evaluation for choosing a decision solution [56].
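As a rough illustration of the pairwise-comparison step described above, the following Python sketch derives priority weights from a single comparison matrix using the principal-eigenvector approach commonly associated with AHP; the 3×3 matrix and its values are invented for illustration and are not taken from any study cited here.

```python
# Illustrative AHP step: priority weights from one pairwise comparison matrix
# via the principal eigenvector. The 3x3 matrix below is invented.
import numpy as np

P = np.array([
    [1.0, 3.0, 5.0],   # criterion 1 compared with criteria 1, 2, 3
    [1/3, 1.0, 2.0],   # criterion 2
    [1/5, 1/2, 1.0],   # criterion 3
])

eigvals, eigvecs = np.linalg.eig(P)
k = int(np.argmax(eigvals.real))            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                             # normalized priority weights

CI = (eigvals[k].real - P.shape[0]) / (P.shape[0] - 1)   # consistency index
print("weights:", np.round(w, 3), "CI:", round(CI, 3))
```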
By determining the weights of various levels of criteria through AHP to assess the performance of different robot designs, Goh (1997) [57] and Geng (2013) [58] evaluated the value of various industrial robot alternatives. An integrated model combining the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD) was presented to assess, from the standpoint of needs, whether the functional deployment of robots in industry improves performance [56]. Wang et al. (2022) [59] used a linguistic assessment scale based on a spherical fuzzy set, SF-AHP, to allow decision makers to express the significance of criteria freely in decision making. Kapoor and Tak (2005) [60] proposed a method of using “fuzzy linguistic variables” instead of numbers to solve the industrial robot selection problem by combining the Analytic Hierarchy Process (AHP) and Multi-Attribute Utility Theory (MAUT), two conventional multi-criteria decision-making techniques. They used a fuzzification process that ties linguistic variables to the values of membership functions and develops suitable decision rules. Defuzzification, the final step in the process, converts fuzzy outputs into exact values and produces results as fuzzy scores.
AHP is frequently utilized as an MCDM technique for evaluating industrial robot functions, but it is predicated on the idea that the dimensions and criteria of a system are independent [61]. In practice, the dimensions or criteria of an evaluation system rarely function independently; instead, they frequently interact. The Analytic Network Process (ANP), a development of the AHP methodology, has successfully resolved several real-world decision problems, including project selection, product planning, supply chain management, and optimal scheduling. A better-organized functional evaluation and selection metrics system is needed to choose the best robot for a particular domain and to analyze fuzzy selection choices. Combined with the ANP approach, DEMATEL can serve producers, decision makers, and users as a field decision-making guide.

2.2. The Functional Evaluation Index System of Assistive Teaching Robots

In order to select suitable robots for teaching and to evaluate the effectiveness of robots in facilitating sustainable learning, it is necessary to consider robotics in different educational scenarios [30] and the corresponding teaching standards, teaching objectives, content, and teaching environments. Even though earlier studies have emphasized educational level and learning subjects as two crucial factors in selecting educational robots, there is still a lack of an implementable evaluation framework to guide future research [21,22,23,24,25,26,27]. Most studies have concentrated on theory, design, development, practice, and reflection, but there has yet to be much academic research on the functional analysis and evaluation of educational robots. Some scholars have argued that, given constrained educational budgets, the educational purpose of assistive teaching robots should be the primary focus [38]; Chang et al. (2010) [62] argue for focusing on the appearance and body construction of teaching assistant robots, with movable robotic arms or suitable display positions designed to enable a variety of interactions between teachers and students; Tsiakas et al. (2018) [63] divided the functional evaluation metrics for assistive teaching robots into perceptual functions, behavioral control, classroom assistance, and personalization to assess the effects of robot appearance, verbal and nonverbal behavior, service, and communication style on educational instruction. To further improve the autonomous capabilities of the assistive teaching robot, Cooney and Leister (2019) [38] built a more comprehensive functional evaluation system through an exploratory experimental study of the robot’s capabilities. Other researchers used a structural equation modeling approach to build a quantitative instructional design model and proposed a constructivism-based functional design framework for robot-assisted instruction [64]. However, related research proposes only a few surface-level evaluation dimensions and does not consider the network relationships of the assessment system or the influence relationships between evaluation indicators.
The quality evaluation index system for the assistive teaching robot was developed in this study based on a substantial amount of domestic and international literature on the test evaluation and related design of the assistive teaching robot in four aspects: system structure, appearance interface, teaching function, and auxiliary support. The 17 secondary indicators under the dimension level are as follows:

2.2.1. System Structure (X1)

A robotic system comprises hardware, operating system software, and a physical body. The robot is a typical system, and although it should be highly reliable in theory, Merlet (2009) claimed that variations in how the mechanical components of the robot were made inherently cause uncertainty in the robot. As a result, some analytical methods are required to guarantee that the robot is reliable in a pedagogical use environment [65]. Functional diversity and ease of use can improve teaching effectiveness and expand the range of services teaching assistants offer, making the robot more manageable and structurally compact. According to Yoshino and Zhang (2018) [66], the reliability and security of the system should be the primary considerations when evaluating the indicators, because they can improve the user experience when the operation of the robot system is stable, the authentication service system is comprehensive, and the firewall has security protection. The user experience is impacted by how quickly system functions respond, according to Yu et al. (2020) [67] and a related article published by the China Education Equipment Industry Association.
Furthermore, easy-to-use systems and programs can maintain their appeal to students and sustain their interest and motivation to learn. Using the ROBOSEM robot as an example to emphasize the relevance of switching between the remote operation and autonomous control features provided by assistive teaching robots, Park et al. (2011) [68] underlined that system flexibility may lead to a superior qualitative experience. As a result, according to the literature mentioned above, the system structure dimension’s metrics include system reliability, security, flexibility, functional diversity, and operational convenience.

2.2.2. Appearance Interface (X2)

Robots have been divided by Fong and other scholars into four categories based on their appearance, namely anthropomorphic, exaggerated, caricatured, and functional [12]. It turns out that children under the age of 9 pay close attention to the appearance of robots. Children’s attitudes toward robots that resemble humans are better than their attitudes toward robots that are simply machines [69], but support declines dramatically for robots that too closely resemble humans. Ryu et al. (2007) [69] explored the design function of assistive teaching robots and found that the robot’s physical characteristics and role models significantly impacted learners. As a result, an esthetically pleasing, comprehensive, and compact design of the outer structure is a critical assessment criterion. According to Yang and Wang’s (2020) [70] hypothesis, the interactive interface is the element that people find most intuitive in judging the level of service quality and customer satisfaction, and the ease with which an interface’s features can be used directly impacts the user experience. According to a related article by Tsiakas et al. (2018) [63], custom extension plugins can increase the diversity of the educational process. According to the Technical Specification for Teaching Robots in Primary and Secondary Schools, teaching assistants should have a modular structure and flexibly mountable and expandable external device interfaces. To better assess the appearance interface, the degree of expansion of the equipment is also taken into account in this study.

2.2.3. Teaching Function (X3)

Huijnen et al. (2017) [71] suggest that assistive robots need to provide students with content that is compatible with their learning methods and study habits, improve student learning through personalized instruction, and get them more actively involved in learning. Appropriate use of robotics in the teaching and learning process is expected to improve the effectiveness of teaching and learning, enhance students’ motivation and interest in learning [72], and ultimately contribute to the promotion of lifelong, sustainable learning. Therefore, the teaching function of assistive teaching robots has become one of the most important dimensions for measuring their functionality. Many schools have introduced educational robots as teaching aids, such as the Pepper robot, which is able to interact with students, introduce course content, and answer questions [73]. Teaching functionality, interface appearance, and assistive support are the main factors that determine the quality of an assistive teaching robot. As a result, the instructional design content that the assistive teaching robot holds must follow the rules of effective teaching and learning, such as standardization, applicability, and completeness. According to Wu et al. (2015) [74], because educational robots must have higher levels of intelligence and a greater variety of knowledge, assistive teaching robots must be able to create various lesson plans to suit curriculum variations. To be an effective teaching assistant, robots should assist teachers in the classroom by adjusting the pace of teaching and learning to create a more efficient and effective teaching environment [75]. They should also be able to meet the needs and preferences of different students according to their diverse characteristics. According to the literature listed above, the teaching function dimension includes teaching content standardization, teaching content applicability, teaching content integrity, and teaching process control.

2.2.4. Auxiliary Support (X4)

According to Hong and Huang (2016) [76], the RALL system framework, which offers teaching aids, scripted writing materials, scripted materials that can be easily imported into multimedia resources, and scripted presentations for anthropomorphic robots, can improve the resource management capabilities of the assistive teaching robot. According to Cross et al.’s (2009) [77] theory, a robot’s expression, language, actions, emotions, and ability to communicate pain are distinctive and dynamic in human–robot interaction; this capacity for verbal engagement, in particular, affects the entire classroom. According to Louie et al. (2017) [75], assistive teaching robots are created to address the need for teacher support in schools, particularly in circumstances where teachers struggle to handle things on their own and where teaching assistants are required to perform repetitive tasks and manage sizable amounts of redundant data resources in place of teachers. Park et al. (2011) [68] used the ROBOSEM assistive teaching robot to pronounce words and assess pronunciation, identify students with RFID tags or markers, and record student credits while retrieving portfolios and capturing voices through its microphone; it also provides interaction and real-time feedback to students to enhance the learning experience. As a result, regulatory capability, voice interaction capability, resource management capability, and learning support capability are all part of auxiliary support.
Table 1 below lists the dimensions and the sources of the indices used in this article.

3. Methods

To provide a system of robotic solutions and selection indicators that meet different educational needs and are oriented towards sustainable learning, this study draws on the relevant research of Hsu (2012) [80] and Govindan (2015) [81] in this section. The DEMATEL method is extended by incorporating the Analytic Network Process (ANP), i.e., the DANP method. This approach allows for a more detailed quantification of factor weights using a comprehensive influence matrix. In addition, the study investigates the relationships among the elements within the system, providing a deeper understanding of their interconnections. The network relationships between different assessment dimensions are made explicit, which extends the analytical reach of the DEMATEL methodology and simplifies the pairwise comparisons of the ANP method. The steps of the computation process are described as follows.

3.1. The DEMATEL Method

This study uses the DEMATEL method to determine the interdependence among the variables/criteria and to distill the features that represent the underlying system and its development patterns. The interrelationships between the elements needed to create the INRM can also be determined through it [82,83]. The steps of the method are described as follows:
Step 1: Calculate the average direct-relation matrix $\boldsymbol{A}$. For the expert questionnaire, respondents were asked to rate the degree of direct influence $a_{ij}$ between each pair of criteria on a 0–4 scale corresponding to “no impact”, “low impact”, “average impact”, “strong impact”, and “very strong impact”. The ratings given by the $f$ respondents for each pair of criteria were then averaged to produce the average direct-relation matrix $\boldsymbol{A}$ in Equation (1), where $n$ is the number of criteria:

$$\boldsymbol{A}=\begin{bmatrix} a_{11} & \cdots & a_{1j} & \cdots & a_{1n}\\ \vdots & & \vdots & & \vdots\\ a_{i1} & \cdots & a_{ij} & \cdots & a_{in}\\ \vdots & & \vdots & & \vdots\\ a_{n1} & \cdots & a_{nj} & \cdots & a_{nn} \end{bmatrix} \quad (1)$$
Step 2: The product of $v$ and $\boldsymbol{A}$ yields the initial direct influence matrix $\boldsymbol{S} = [s_{ij}]_{n \times n}$, where $n$ is the total number of influencing factors in the system and $a_{ij}$ represents the strength of the influence of factor $i$ on factor $j$:

$$\boldsymbol{S} = v \times \boldsymbol{A} \quad (2)$$

$$v = \frac{1}{\max\left(\max\limits_{1 \le i \le n}\sum_{j=1}^{n} a_{ij},\; \max\limits_{1 \le j \le n}\sum_{i=1}^{n} a_{ij}\right)} \quad (3)$$
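To make Steps 1 and 2 concrete, the following Python sketch averages several expert matrices into $\boldsymbol{A}$ and normalizes it into $\boldsymbol{S}$; the expert ratings are randomly generated stand-ins, not the questionnaire data of this study.

```python
# Sketch of DEMATEL Steps 1-2: average the expert matrices into A (Eq. 1),
# then normalize into the initial direct influence matrix S (Eqs. 2-3).
# The three 4x4 expert matrices are random stand-ins for 0-4 questionnaire scores.
import numpy as np

rng = np.random.default_rng(0)
expert_matrices = []
for _ in range(3):
    M = rng.integers(0, 5, size=(4, 4)).astype(float)
    np.fill_diagonal(M, 0.0)        # a factor is not rated against itself
    expert_matrices.append(M)

A = np.mean(expert_matrices, axis=0)                       # Eq. (1)
v = 1.0 / max(A.sum(axis=1).max(), A.sum(axis=0).max())    # Eq. (3)
S = v * A                                                  # Eq. (2)
print(np.round(S, 3))
```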
Step 3: Equation (4) is used to generate the total influence matrix $\boldsymbol{Z}$, whose element $z_{ij}$ represents the total (direct and indirect) influence of criterion $i$ on criterion $j$:

$$\boldsymbol{Z} = \boldsymbol{S} + \boldsymbol{S}^{2} + \cdots + \boldsymbol{S}^{k} = \boldsymbol{S}\left(\boldsymbol{I}-\boldsymbol{S}\right)^{-1}, \quad k \to \infty \quad (4)$$

where $\boldsymbol{Z} = [z_{ij}]_{n \times n}$ and $\boldsymbol{I}$ is the identity matrix.
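A minimal sketch of Step 3, assuming a small normalized matrix $\boldsymbol{S}$ with invented values:

```python
# Sketch of DEMATEL Step 3: total influence matrix Z = S (I - S)^(-1),
# the closed form of the convergent series S + S^2 + ... . S is invented.
import numpy as np

S = np.array([
    [0.00, 0.20, 0.15],
    [0.10, 0.00, 0.25],
    [0.05, 0.30, 0.00],
])

Z = S @ np.linalg.inv(np.eye(len(S)) - S)   # Eq. (4)
print(np.round(Z, 3))
```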
Step 4: The column sums $c_j$ and row sums $r_i$ of the matrix $\boldsymbol{Z}$ are calculated as follows:

$$\boldsymbol{c} = (c_j)_{n \times 1} = \left(\left[\sum_{i=1}^{n} z_{ij}\right]_{1 \times n}\right)^{\prime} \quad (5)$$

$$\boldsymbol{r} = (r_i)_{n \times 1} = \left[\sum_{j=1}^{n} z_{ij}\right]_{n \times 1} \quad (6)$$

The element $c_j$ of the vector $\boldsymbol{c}$ represents the overall influence that criterion $j$ receives from the other criteria. Similarly, $r_i$ indicates how strongly factor $i$ influences the other criteria, both directly and indirectly.
Step 5: Both the criteria-level matrix $\boldsymbol{Z}_C$ and the dimension-level matrix $\boldsymbol{Z}_D$ are derived from $\boldsymbol{Z}$: $\boldsymbol{Z}_C$ partitions $\boldsymbol{Z}$ according to the criteria within each dimension, and $\boldsymbol{Z}_D$ is produced by averaging the criteria’s degrees of influence within each pair of dimensions. Denoting the dimensions by $D_1, D_2, \ldots, D_n$ and the criteria of dimension $D_p$ by $C_{p1}, \ldots, C_{pm_p}$, the criteria-level matrix has the block form

$$\boldsymbol{Z}_C = \begin{bmatrix} \boldsymbol{Z}_C^{11} & \boldsymbol{Z}_C^{12} & \cdots & \boldsymbol{Z}_C^{1n}\\ \boldsymbol{Z}_C^{21} & \boldsymbol{Z}_C^{22} & \cdots & \boldsymbol{Z}_C^{2n}\\ \vdots & \vdots & & \vdots\\ \boldsymbol{Z}_C^{n1} & \boldsymbol{Z}_C^{n2} & \cdots & \boldsymbol{Z}_C^{nn} \end{bmatrix} \quad (7)$$

where the block $\boldsymbol{Z}_C^{pq}$ collects the influences of the criteria of dimension $D_p$ on the criteria of dimension $D_q$.
Step 6: Obtain the INRM. Here $r_i + c_i$ depicts the intensity with which factor $i$ gives and receives influence (its centrality), while $r_i - c_i$ displays the net influence of factor $i$ on the other components. If $r_i - c_i$ is positive, factor $i$ is a cause component; if $r_i - c_i$ is negative, factor $i$ is an effect component. The data set $(r_i + c_i, r_i - c_i)$ can therefore be mapped to create the INRM.
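The row and column sums and the INRM coordinates of Steps 4 to 6 can be computed as in the following sketch; the matrix $\boldsymbol{Z}$ and the factor labels are placeholders, not the study’s data.

```python
# Sketch of DEMATEL Steps 4-6: row sums r, column sums c, and the
# (r + c, r - c) coordinates used to draw the INRM. Z and the factor
# labels are invented placeholders.
import numpy as np

Z = np.array([
    [0.05, 0.28, 0.24],
    [0.14, 0.09, 0.33],
    [0.11, 0.37, 0.08],
])

r = Z.sum(axis=1)    # influence given by each factor, Eq. (6)
c = Z.sum(axis=0)    # influence received by each factor, Eq. (5)

for name, prominence, relation in zip(["F1", "F2", "F3"], r + c, r - c):
    role = "cause factor" if relation > 0 else "effect factor"
    print(f"{name}: r+c = {prominence:.3f}, r-c = {relation:+.3f} ({role})")
```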

3.2. Getting ANP Weights Using the DEMATEL Method

This study uses the DEMATEL approach to establish the interrelationships between the elements required to produce the INRM. The steps are briefly listed below. ANP is the procedure for creating unweighted supermatrices and for determining factor weights. Unlike AHP, ANP considers the connections and interdependencies between multiple aspects or criteria [84]. The original ANP strategy, however, suffers from three major problems. First, one must assume a relational structure of the evaluation system before utilizing the network analysis method. Second, the complexity of the questionnaire makes it difficult to compare two indicators simultaneously [85], which may make the results challenging to grasp. Third, assigning each cluster exactly the same weight is implausible, given that the level of influence varies between dimensions or clusters [86].
It should be noted that the DEMATEL-based ANP approach can be used to address these three problems. The unweighted supermatrix is then normalized using the ANP method after the DEMATEL method determines the extent of influence between dimensions. These are the precise steps:
Step 7: Obtain the unweighted supermatrix. The matrix $\boldsymbol{Z}_C^{\delta}$ is obtained by normalizing $\boldsymbol{Z}_C$ block by block:

$$\boldsymbol{Z}_C^{\delta} = \begin{bmatrix} \boldsymbol{Z}_C^{\delta 11} & \boldsymbol{Z}_C^{\delta 12} & \cdots & \boldsymbol{Z}_C^{\delta 1n}\\ \boldsymbol{Z}_C^{\delta 21} & \boldsymbol{Z}_C^{\delta 22} & \cdots & \boldsymbol{Z}_C^{\delta 2n}\\ \vdots & \vdots & & \vdots\\ \boldsymbol{Z}_C^{\delta n1} & \boldsymbol{Z}_C^{\delta n2} & \cdots & \boldsymbol{Z}_C^{\delta nn} \end{bmatrix} \quad (8)$$

For instance, a submatrix $\boldsymbol{Z}_C^{pq}$ of $\boldsymbol{Z}_C$, with rows indexed by the criteria $c_{p1}, \ldots, c_{pm_p}$ of dimension $D_p$ and columns indexed by the criteria $c_{q1}, \ldots, c_{qm_q}$ of dimension $D_q$, is normalized to $\boldsymbol{Z}_C^{\delta pq}$ by dividing each entry by its row sum:

$$z_i^{pq} = \sum_{j=1}^{m_q} z_{ij}^{pq}, \quad i = 1, 2, \ldots, m_p \quad (9)$$

$$\boldsymbol{Z}_C^{\delta pq} = \begin{bmatrix} z_{11}^{pq}/z_1^{pq} & \cdots & z_{1j}^{pq}/z_1^{pq} & \cdots & z_{1m_q}^{pq}/z_1^{pq}\\ \vdots & & \vdots & & \vdots\\ z_{i1}^{pq}/z_i^{pq} & \cdots & z_{ij}^{pq}/z_i^{pq} & \cdots & z_{im_q}^{pq}/z_i^{pq}\\ \vdots & & \vdots & & \vdots\\ z_{m_p 1}^{pq}/z_{m_p}^{pq} & \cdots & z_{m_p j}^{pq}/z_{m_p}^{pq} & \cdots & z_{m_p m_q}^{pq}/z_{m_p}^{pq} \end{bmatrix} = \begin{bmatrix} z_{11}^{\delta pq} & \cdots & z_{1m_q}^{\delta pq}\\ \vdots & & \vdots\\ z_{m_p 1}^{\delta pq} & \cdots & z_{m_p m_q}^{\delta pq} \end{bmatrix} \quad (10)$$

The unweighted supermatrix $\boldsymbol{W}$ is then obtained by transposing the normalized matrix $\boldsymbol{Z}_C^{\delta}$, block by block:

$$\boldsymbol{W}^{pq} = \left(\boldsymbol{Z}_C^{\delta pq}\right)^{\prime} \quad (11)$$

Throughout Equations (7)–(11), $D_i$ denotes the $i$-th dimension and $C_{ij}$ denotes the $j$-th criterion of the $i$-th dimension.
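Assuming the criteria-level total influence matrix $\boldsymbol{Z}_C$ and the number of criteria per dimension are known, Step 7 can be sketched as follows; the 5×5 matrix and the split of the criteria into two dimensions of sizes 2 and 3 are invented for illustration.

```python
# Sketch of Step 7: block-wise row normalization of Z_C and transposition
# into the unweighted supermatrix W. The 5x5 matrix and the [2, 3] split of
# criteria into two dimensions are invented for illustration.
import numpy as np

Z_C = np.random.default_rng(1).random((5, 5))
dims = [2, 3]                               # number of criteria per dimension
edges = np.cumsum([0] + dims)               # block boundaries: 0, 2, 5

Z_C_norm = np.zeros_like(Z_C)
for p in range(len(dims)):                  # dimension of the block's rows
    for q in range(len(dims)):              # dimension of the block's columns
        block = Z_C[edges[p]:edges[p + 1], edges[q]:edges[q + 1]]
        Z_C_norm[edges[p]:edges[p + 1], edges[q]:edges[q + 1]] = (
            block / block.sum(axis=1, keepdims=True)        # Eqs. (9)-(10)
        )

W = Z_C_norm.T                              # Eq. (11): unweighted supermatrix
print(np.round(W, 3))
```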
Step 8: Derive the weighted supermatrix. As in Step 5, the dimension-level matrix $\boldsymbol{Z}_D$ is generated by averaging the degrees of influence of the criteria between each pair of dimensions:

$$z_D^{nm} = \frac{\sum_{i=1}^{i_n}\sum_{j=1}^{j_m} z_{ij}}{i_n \times j_m} \quad (12)$$

where $i_n$ is the number of criteria in dimension $n$ and $j_m$ is the number of criteria in dimension $m$. The resulting matrix and its row sums $d_i$ are

$$\boldsymbol{Z}_D = \begin{bmatrix} z_D^{11} & z_D^{12} & \cdots & z_D^{1n}\\ z_D^{21} & z_D^{22} & \cdots & z_D^{2n}\\ \vdots & \vdots & & \vdots\\ z_D^{n1} & z_D^{n2} & \cdots & z_D^{nn} \end{bmatrix}, \qquad d_i = \sum_{j=1}^{n} z_D^{ij} \quad (13)$$

Each row of $\boldsymbol{Z}_D$ is divided by its row sum to give the normalized dimension-level matrix

$$\boldsymbol{Z}_D^{\delta} = \begin{bmatrix} z_D^{11}/d_1 & z_D^{12}/d_1 & \cdots & z_D^{1n}/d_1\\ z_D^{21}/d_2 & z_D^{22}/d_2 & \cdots & z_D^{2n}/d_2\\ \vdots & \vdots & & \vdots\\ z_D^{n1}/d_n & z_D^{n2}/d_n & \cdots & z_D^{nn}/d_n \end{bmatrix} = \begin{bmatrix} z_D^{\delta 11} & z_D^{\delta 12} & \cdots & z_D^{\delta 1n}\\ z_D^{\delta 21} & z_D^{\delta 22} & \cdots & z_D^{\delta 2n}\\ \vdots & \vdots & & \vdots\\ z_D^{\delta n1} & z_D^{\delta n2} & \cdots & z_D^{\delta nn} \end{bmatrix} \quad (14)$$

The weighted supermatrix $\boldsymbol{W}^{\delta}$ is created by weighting the unweighted supermatrix $\boldsymbol{W}$ with $\boldsymbol{Z}_D^{\delta}$:

$$\boldsymbol{W}^{\delta} = \boldsymbol{W} \times \boldsymbol{Z}_D^{\delta} \quad (15)$$
Step 9: Determine the DANP weights. The weighted supermatrix $\boldsymbol{W}^{\delta}$ is multiplied by itself repeatedly until it converges and stabilizes; the columns of the limiting supermatrix give the DANP weights:

$$\boldsymbol{W}^{*} = \lim_{\lambda \to \infty} \left(\boldsymbol{W}^{\delta}\right)^{\lambda} \quad (16)$$
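The following sketch strings Steps 7–9 together under the same invented data: it block-normalizes $\boldsymbol{Z}_C$, averages it into $\boldsymbol{Z}_D$, weights the supermatrix block-wise so that each column still sums to one (a common reading of Equation (15) in the DANP literature; the exact matrix ordering intended here is an assumption), and raises the result to a high power to approximate the limit in Equation (16).

```python
# Sketch of Steps 7-9 end to end: unweighted supermatrix, dimension-level
# weighting, and the limiting supermatrix whose columns give the DANP weights.
# Z_C, the [2, 3] dimension split and the block-wise reading of Eq. (15) are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
Z_C = rng.random((5, 5))
dims = [2, 3]
edges = np.cumsum([0] + dims)
n_dim = len(dims)

# Step 7: unweighted supermatrix W (block row-normalization, then transpose)
Z_C_norm = np.zeros_like(Z_C)
for p in range(n_dim):
    for q in range(n_dim):
        blk = Z_C[edges[p]:edges[p + 1], edges[q]:edges[q + 1]]
        Z_C_norm[edges[p]:edges[p + 1], edges[q]:edges[q + 1]] = (
            blk / blk.sum(axis=1, keepdims=True)
        )
W = Z_C_norm.T

# Step 8: dimension-level matrix Z_D (Eq. 12), row-normalized (Eq. 14),
# then used to scale the blocks of W (Eq. 15) so columns stay stochastic
Z_D = np.array([[Z_C[edges[p]:edges[p + 1], edges[q]:edges[q + 1]].mean()
                 for q in range(n_dim)] for p in range(n_dim)])
Z_D_norm = Z_D / Z_D.sum(axis=1, keepdims=True)

W_weighted = np.zeros_like(W)
for p in range(n_dim):
    for q in range(n_dim):
        W_weighted[edges[p]:edges[p + 1], edges[q]:edges[q + 1]] = (
            Z_D_norm[q, p] * W[edges[p]:edges[p + 1], edges[q]:edges[q + 1]]
        )

# Step 9: approximate the limit supermatrix (Eq. 16); any column gives the weights
W_limit = np.linalg.matrix_power(W_weighted, 64)
weights = W_limit[:, 0]
print(np.round(weights, 4), "sum =", round(float(weights.sum()), 4))
```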

4. Data Analysis

4.1. Relationship between Metric Dimensions and Indicators

According to Table A1 (see Appendix A), a questionnaire was developed to gauge the initial direct influence matrix by comparing the level of influence between any two indicators. A total of 10 experts, including university professors, specialists in robotics-related fields, and primary and secondary school teachers familiar with assistive robotics, were invited to form an advisory group and score the influencing factors at each level. Using the principle of maximum membership, this study aggregated the results to determine the level of influence between each pair of indicators. After compiling the questionnaire responses from the 10 experts, the direct impact matrix was produced by applying Equation (2) to convert the fuzzy triangular numbers of the impact factors of the functional assessment indicators into precise values, as shown in Table A1. Even though the 10 experts do not represent all relevant stakeholders, the results show good consistency and can, to some extent, reflect reality.
The normalized direct-relation matrix is obtained from Equations (2) and (3), and the total influence matrix is then calculated using Equation (4) (refer to Table 2). The influence matrix Z within each dimension is then averaged to yield the total influence matrix Z_D for each dimension, as shown in Table 3. Equations (5) and (6) of Step 4 were applied to produce the sums of given and received influences between criteria and dimensions, as shown in Table 4.

4.2. Identification and Statistics of Functional Evaluation Indicators for Assistive Teaching Robots

The total influence matrix of the functional assessment indicators can be created using the calculation in Step 5. The influencing degree, influenced degree, centrality, and cause degree of each factor at every level are calculated, as shown in Table 5.
The main influencing factors for the functional evaluation of the assistive teaching robot can be initially identified from the centrality ranking in Table 5 (continued); they include functional diversity (X14), operational convenience (X15), device expansion (X24), learning support capability (X44), teaching process control (X34), and resource management capability (X43). To demonstrate more intuitively how the influencing factors in the functional evaluation of the assistive teaching robot influence, and are influenced by, one another, this study combines the data in Table 5, projects the cause degree and centrality of each key influencing factor into a two-dimensional coordinate system, and obtains the cause–effect diagram of the influencing factors shown in Figure 1. Meanwhile, the dimensional and systematic influence network relationship map (INRM) can be drawn from Table 5, as shown in Figure 2.
According to the four-quadrant diagram of causality, the factors in the first quadrant are the driving factors of the entire evaluation model; they have the most significant impact on the functional evaluation index system and play an important role in the whole system [87]. The factors in the second quadrant are known as supporting factors (voluntariness), as they play a supporting role in the model. The factors in the third quadrant are known as independent factors, as they are relatively independent of the rest of the model; none of the factors in this study are located in the third quadrant, demonstrating the interdependence of all the factors in the functional index system. The factors in the fourth quadrant are known as threshold factors. Table 6 presents the association between the variables in the four quadrants.
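A minimal sketch of this quadrant classification is given below; it assumes the split on the centrality axis is made at the mean centrality (an assumption, since the threshold is not stated explicitly) and uses invented factor values rather than those of Table 5.

```python
# Sketch of the four-quadrant classification behind Figure 1: factors are
# placed by centrality (r + c) and cause degree (r - c). The mean-centrality
# split and all factor values are illustrative assumptions.
factors = {
    "X11": (1.90, 0.12),   # (centrality, cause degree)
    "X15": (2.05, -0.08),
    "X21": (1.60, 0.05),
    "X43": (1.95, -0.15),
}

mean_centrality = sum(c for c, _ in factors.values()) / len(factors)

def quadrant(centrality: float, cause: float) -> str:
    if cause >= 0:
        return "I (driving)" if centrality >= mean_centrality else "II (supporting)"
    return "IV (threshold)" if centrality >= mean_centrality else "III (independent)"

for name, (cen, cau) in factors.items():
    print(f"{name}: quadrant {quadrant(cen, cau)}")
```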
The unweighted supermatrix W p q of the 17 secondary influencing factors was constructed in this study and is displayed in Table 7 to more precisely quantify the weights of the influencing factors for the functional evaluation of the assistive teaching robot.
Based on this, Equation (6) was used to calculate and weight the comprehensive influence matrix $\boldsymbol{Z}_C^{\delta}$ of the four first-level influence factors. The weights of every part of the system were then derived by performing limit operations on the weighted supermatrix of the influence factors to assess the assistive teaching robot’s function, as shown in Table 8.
Although most other research constructed robot indicators using the ANP approach [88,89], this study opted to employ the DANP method to determine the significance of the indicators. The DANP technique uses the DEMATEL and ANP results, in addition to examining the interactions between the internal elements of the system, to calculate the criteria weights. The degree of influence ($\boldsymbol{Z}_C$) between each indicator and dimension is obtained from DEMATEL, and $\boldsymbol{Z}_C$ is normalized by considering the intensity of the effect of each indicator within its dimension. As a result, the weighted supermatrix (Step 8 of Section 3) takes into account each criterion’s share within its dimension and the extent to which it influences the other dimensions. The weights of the indicators can therefore be obtained first, and the weight of each dimension is then determined by adding up the weights of the indicators within that dimension. Consistent results can be obtained while avoiding the time-consuming pairwise comparisons of the original ANP. Table 9 and Table 10 show the component weight values and the total influence of each component. This study has thus quantified the weights of the factors affecting the functional evaluation of the assistive teaching robot.
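As a small illustration of this aggregation, the following sketch sums placeholder criterion weights (invented values, not those of Table 9 or Table 10) within each dimension to obtain dimension-level weights.

```python
# Sketch of the final aggregation: criterion weights from the limit
# supermatrix are summed within each dimension to obtain dimension weights.
# All weight values below are invented placeholders.
criterion_weights = {
    "X11": 0.052, "X12": 0.046, "X13": 0.048, "X14": 0.056, "X15": 0.058,
    "X21": 0.044, "X22": 0.046, "X23": 0.050, "X24": 0.058,
    "X31": 0.063, "X32": 0.068, "X33": 0.066, "X34": 0.076,
    "X41": 0.069, "X42": 0.060, "X43": 0.066, "X44": 0.074,
}

dimension_weights: dict[str, float] = {}
for criterion, weight in criterion_weights.items():
    dimension = criterion[:2]                    # "X1" ... "X4"
    dimension_weights[dimension] = dimension_weights.get(dimension, 0.0) + weight

for dim, w in sorted(dimension_weights.items(), key=lambda kv: -kv[1]):
    print(dim, round(w, 3))
```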

5. Discussion

5.1. Cause Factor Analysis

An analysis of how the influencing elements interact with one another and of their positions within the system (Table 5 and Figure 1) shows that the appearance interface (X2) has the largest cause degree ($r_i - c_i$) value (0.160) and is consequently the most influential dimension. This indicates the importance of the appearance interface in the operation of the educational robot, aligning with Ryu et al.’s (2007) [69] assertion that a robot’s external features significantly influence learners. At the same time, the appearance interface strongly affects the other dimensions and could have the most significant impact on the overall functional evaluation system [12]. The main cause factors (cause degree values greater than 0) among the influencing factors of the functional evaluation of the assistive teaching robot are system reliability (X11), clarity of interface interaction (X22), degree of equipment expansion (X24), system security (X12), system flexibility (X13), simplicity of interface presentation (X23), functional diversity (X14), the esthetics of the exterior structure (X21), and voice interaction capability (X42). These findings indicate that the above factors are more likely to influence the other factors in the system during the application of the assistive teaching robot, playing a fundamental and safeguarding role.
Additionally, the system structure (X1) has the highest centrality ($r_i + c_i$) value (1.941), which indicates that it has the greatest overall degree of interaction within the system and the strongest ability to influence the indicators of the other dimensions. Furthermore, Yoshino and other researchers (2018) noted that a system’s general stability enhances the user’s experience when using it [66,67], which is consistent with the findings of this study. We are aware that, when selecting a robot that is not yet fully understood, the attractiveness of its appearance can affect people’s initial willingness to use it, which is referred to as the primacy effect. As the length of use increases and the use deepens, the stable experience that the robot itself provides plays a crucial role in long-term use [66]. As a result, the system structure will influence the user experience brought by the other dimensions in the process. System security and flexibility are necessary for delivering high-quality services; device expansion, interface presentation clarity, and functional diversity offer crucial hardware and software guarantees for the assistive teaching robot; and whether or not the user will accept the assistive teaching robot depends on the esthetic appeal of the exterior structure and its voice interaction capability. User acceptance of the assistive teaching robot is directly correlated with system reliability and the clarity of interface interaction. Ultimately, the robot’s appearance interface and its system structure will most strongly influence the decision makers’ and users’ choice of robot.

5.2. Outcome Factor Analysis

Similarly, it can be deduced that resource management capability (X43), regulatory capability (X41), operational convenience (X15), teaching content applicability (X32), teaching content standardization (X31), simplicity of interface presentation (X23), learning support capability (X44), teaching content integrity (X33), and control of the teaching process (X34) are the outcome factors (cause degree values less than 0) among the factors influencing the functional evaluation of the assistive teaching robot. These components, which primarily concern the teaching function and auxiliary support, are directly affected by the system structure and appearance interface components of the evaluation system. The size of the interface and the system design influence the richness of teaching-related functions that can be provided on the operating interface. In a related study, Yang and Wang (2020) [70] also suggested that the neatness of the interface features affects the range of functions and assistive support that assistive teaching robots can offer users. Teaching function and auxiliary support are two crucial aspects of the functional evaluation system of assistive robots and are key elements in promoting sustainable learning.

5.3. Importance Analysis of Influencing Factors

The weight values of the factors influencing the functional evaluation of the assistive teaching robot in Table 10, ranked from highest to lowest, order the dimensions as teaching function (X3), auxiliary support (X4), system structure (X1), and appearance interface (X2). Among the criteria, five indicators, namely control of the teaching process (X34), learning support capability (X44), regulatory capability (X41), teaching content applicability (X32), and resource management capability (X43), have a significant influence on the application evaluation of assistive teaching robots. This finding is in line with the need for assistive teaching robots to have a higher level of intelligence and a wide range of knowledge for the teaching field [74], while the esthetics of the exterior structure (X21), clarity of interface presentation (X22), and system security (X12) are the least influential indicators in the system.
These results support the findings of the DEMATEL analysis, which identified the teaching function and auxiliary support as two crucial parts of an assistive teaching robot’s functional evaluation system. The teaching function and auxiliary support are vital to the functional evaluation of an assistive teaching robot, yet these two dimensions are easily influenced by both the system structure and the appearance interface; although this contrasts with the centrality ranking of the factors presented in Table 5, it underscores that dependence. The finding that robots with diverse teaching resources, standardized content settings, personalized learning for teachers and students, and convenient support are readily preferred by users is also consistent with the expectations of customized teaching that teaching assistants can provide in related studies [71,75]; the four dimensions intertwine to influence the selection of an assistive teaching robot as a whole. Given that teaching is the main application scenario for teaching robots, the design of teaching functions and the reserve of teaching resources require special consideration during the design and development process, as shown by the ranked factors in Table 5 and the computation of the index weights in Table 9. For the assistive teaching robot to be a valuable tool for teachers and a helpful learning partner for students, it is also important to consider how the system is built and how the interface’s appearance is designed during development.

5.4. Theoretical and Practical Implications

In this study, based on integrating the existing literature and verifying the validity of the evaluation indicators, we successfully constructed a set of functional evaluation indicator systems for assistive teaching robots to promote sustainable learning. By applying the DANP method of hybrid, multi-criteria decision making, we deeply explored the associations among the indicator dimensions and their mutual influence relationships in the complex evaluation system. This not only provides an implementable and widely adaptable evaluation framework for related research, but also provides a useful reference for the analysis and evaluation of educational robot functions in educational environments that emphasize structure and operability, presenting the network associations of the evaluation system and the complex influences among evaluation indicators from a new perspective.
In practical applications, this study provides a practical evaluation tool within a specific educational field, especially for relatively structured and explicit educational environments such as STEM fields and basic subjects. In these educational environments, the functionality of assistive teaching robots is compatible with the nature of the subject, and their features such as personalized support, content delivery, and learning pace adjustment can significantly enhance teaching and learning.
According to the data analysis findings, it is evident that auxiliary support and teaching functions are the main criteria for judging the functionality of the assistive teaching robot. The design and development of assistive teaching robots can be guided by the core issues of evaluation systems to meet the needs of teachers and students in educational fields that emphasize structured learning and the development of practical skills. The teaching function includes the teaching methods, techniques, and strategies that the assistive robot can provide to the teacher, as well as the ability to personalize the teaching to the student’s needs. Auxiliary support focuses on the support and assistance that the robot can provide in the teaching process beyond what is necessary for the course to proceed, such as answering questions and solving problems, and recommending learning resources. The robot assistant should be able to obtain the latest teaching resources and knowledge in a timely manner and integrate them into the teaching content to meet the evolving learning needs and knowledge updating requirements. Therefore, the evaluation system can provide guidance for the design and development of assistive robots in terms of focusing attention, as well as a basis for selecting robots that promote sustainable competence enhancement to meet the different learning needs of teachers and students, and to support the whole society in developing sustainable learning competence of the next generation of young people.
The final results of the study may, in certain cases, provide targeted recommendations for the development of appropriate robotic products to support the use of assistive teaching robots in instruction and education. The relevant results from the data analysis can address the homogenization problem in educational robot production and provide a more robust logical basis for the robots selected by manufacturers, decision makers, and users. When developing and designing assistive teaching robots, the emphasis should be on providing practical teaching functions and tailored assistive support to meet the needs of teachers and students for sustainable learning in education. Even though an assistive teaching robot’s operator interface, appearance, and practicality affect the user experience, a greater focus should be placed on its individualized assistive support and practical teaching functions. If assistive teaching robots are to be successful in the classroom, they must have access to a wide range of teaching resources, standardized teaching topic environments, and the ability to support individualized learning for teachers and students. Robot manufacturers should concentrate on enriching teaching resources, monitoring the quality of teaching content, and improving personalized learning support capabilities during the development process. At the same time, decision makers and users can use these factors as the primary basis for selecting robots.

6. Conclusions

This study has built a functional evaluation indicator system for assistive teaching robots aimed at sustainable learning, which can direct product development and serve as a guide for schools in selecting robots. Relevant functional evaluation is increasingly crucial as artificial intelligence technology continuously upgrades and educational robot applications in real-world settings increase in number. The system comprises 17 indicators, including resource management capability, system reliability, and degree of equipment expansion, grouped into four dimensions: system structure, appearance interface, teaching function, and auxiliary support. The weights of the dimensions and indicators that influence the functional evaluation were then determined using the DANP method. Finally, the pertinent influencing, outcome, and cause factors were examined. According to the findings, the teaching function is the key evaluation factor that most significantly impacts the decision of which assistive teaching robot to use.
The research results can provide the following contributions: (1) identify suitable functional evaluation indicators for assistive teaching robots; (2) determine the weights of each dimension and indicator in the evaluation system by using advanced models; and (3) provide targeted suggestions for the development and selection of assistive teaching robots based on expert judgments.

6.1. Suggestions for Future Research and Practice

By analyzing cause–effect relationships and influencing factors, and considering the ranking of influencing factors, this study makes the following recommendations for the design and selection of future assistive teaching robots:

6.1.1. Enrich the Teaching Resource Library and Enhance the Ability to Assist in Teaching

Robots with an extensive library of teaching resources can help improve teaching in educational settings, and the teaching function is the primary determining factor when choosing assistive teaching robots. Teaching robots should follow national or regional education and teaching standards, ensure that lesson content satisfies those requirements, and provide a teaching resource library that may include educational applications, interactive multimedia materials, online courses, and similar content, so as to meet the needs of diverse learners and promote their participation. The integrated teaching resource library should also cover the fundamental and advanced knowledge of the relevant subjects in depth and systematically. Teachers can make full use of the courseware and materials in the library to prepare lessons, saving time and allowing them to focus more on assisting students with their learning. Assistive teaching robots must also offer teaching materials that can be individually adjusted and optimized according to the needs of different learners, in order to promote students' learning progress and improve learning effectiveness. They should likewise be able to manage learning resources effectively, including collecting, classifying, storing, and sharing them, ensuring that students can conveniently access the required materials and providing a diversified choice of resources to meet different students' learning needs. With the rapid development of information technology and the continual updating of educational resources, educational robots, as tools capable of providing rich educational resources, have the potential to create a richer educational experience for learners. By providing students with diverse learning materials, real-time feedback, and personalized support, educational robots are expected to enhance their motivation and interest in learning and thus help strengthen their ability to learn in a sustainable way.
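As one possible concretization of the resource management capability described above, the sketch below models a teaching resource library that collects, classifies, and stores tagged resources and retrieves them by subject and difficulty. The class and field names are hypothetical illustrations and are not taken from any particular robot platform.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    title: str
    subject: str     # e.g., "English grammar"
    difficulty: int  # 1 (introductory) to 5 (advanced)
    kind: str        # "courseware", "interactive media", "online course", ...

@dataclass
class ResourceLibrary:
    items: list[Resource] = field(default_factory=list)

    def add(self, resource: Resource) -> None:
        self.items.append(resource)  # collect and store

    def find(self, subject: str, max_difficulty: int) -> list[Resource]:
        # Classify and share: return resources matching the learner's subject and level.
        return [r for r in self.items
                if r.subject == subject and r.difficulty <= max_difficulty]

library = ResourceLibrary()
library.add(Resource("Past tense drills", "English grammar", 2, "courseware"))
library.add(Resource("Advanced essay workshop", "English grammar", 5, "online course"))
print([r.title for r in library.find("English grammar", max_difficulty=3)])
```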

6.1.2. Combine Core Technologies and Strengthen Auxiliary Support

The auxiliary support that a teaching-assistant robot provides in the classroom can affect the entire teaching process. Because the diversity of robot functions relies on the integration of numerous core technologies, robots need sensors suited to a variety of application scenarios, natural language processing capabilities, and powerful perceptual information processing technologies. For instance, the Alpha Egg from China's iFlytek combines speech recognition and sound source localization technology to enable face-to-face communication with users. Given the variety of data encountered when assistive teaching robots are applied in education, more sophisticated technologies must work together to enhance the pedagogical suitability of educational robots. At the same time, to handle potential misbehavior, the assistive robot needs monitoring and warning functions to maintain the order of the educational process and the safety of the learning environment. In practice, assistive teaching robots can use recommendation algorithms and machine learning technologies to recommend learning content and resources tailored to students' learning needs and interests. Through in-depth analysis of students' learning history and behavior patterns, the robot can understand students' learning tendencies and knowledge needs and provide personalized learning suggestions, thereby stimulating active learning interest and sustained motivation. Technologies that emulate human perception are still under development; one important example is multimodal fusion perception, which enables robots to obtain more comprehensive and accurate information and thereby supports more efficient, scientific decision making by assistive teaching robots.
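To make the personalized recommendation idea above more concrete, the following sketch shows one simple content-based approach: each learning resource and each student's interest profile (estimated from learning history) are represented as vectors over shared topic tags, and resources are ranked by cosine similarity. The topic tags, profiles, and function names are hypothetical illustrations rather than a prescribed implementation.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def recommend(student_profile: np.ndarray, resources: dict, k: int = 2) -> list:
    """Rank resources by similarity between their topic vector and the student's profile."""
    ranked = sorted(resources.items(),
                    key=lambda item: cosine(student_profile, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Topic order: [grammar, listening, speaking, programming] (hypothetical tags).
student = np.array([0.7, 0.2, 0.1, 0.0])  # profile inferred from learning history
catalog = {
    "Irregular verbs quiz":  np.array([1.0, 0.0, 0.0, 0.0]),
    "Dialogue practice set": np.array([0.2, 0.3, 0.5, 0.0]),
    "Intro to block coding": np.array([0.0, 0.0, 0.0, 1.0]),
}
print(recommend(student, catalog))  # grammar-oriented resources are ranked first
```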

6.1.3. Pay Attention to the Design of Robot Appearance and Interaction Interface

The appearance of the robot has a direct, long-term impact on how users behave toward it. To lower the barriers to human–computer interaction and enhance learners' sense of affinity, the robot's appearance, expressions, and movements should be designed specifically for the user's age group. The interactive interface is the first point of contact in human–robot interaction, and smooth interface animations and clear instructions greatly affect the user experience. However, robot products are frequently uniform, both domestically and internationally, in appearance and user interface; robots from large manufacturers look similar and are operated in the same way, which quickly fatigues users. Emphasizing appearance and interactive interface design will help resolve this issue and enable learners to interact with the robot more thoroughly and productively. In addition, robots should be made more appealing and better targeted to learners' specific requirements and cognitive differences at various levels. As related artificial intelligence technologies, such as bionic technology and intelligent manufacturing, mature, anthropomorphic appearance design for assistive teaching robots is likely to become a significant development trend.

6.2. Limitations

A hybrid of the DEMATEL and ANP methods was used to build and analyze the indicators in this study. Because the DEMATEL method relies on expert analysis and selection of the relationships among indicators to confirm the interdependence of the variables/criteria, the data obtained carry some subjectivity. This limitation could be mitigated if future research incorporated fuzzy set comparison analysis.
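As an illustration of how the proposed fuzzy extension could soften the subjectivity of crisp expert scores (an assumed, commonly used formulation rather than a method applied in this study), each expert k's linguistic rating of the influence of criterion i on criterion j can be encoded as a triangular fuzzy number, averaged over the K experts, and defuzzified, for example with a simple centroid, before the DEMATEL computation:

```latex
\tilde{z}_{ij}^{(k)} = \bigl(l_{ij}^{(k)}, m_{ij}^{(k)}, u_{ij}^{(k)}\bigr), \qquad
\tilde{z}_{ij} = \frac{1}{K}\sum_{k=1}^{K} \tilde{z}_{ij}^{(k)}, \qquad
z_{ij}^{\mathrm{crisp}} = \frac{l_{ij} + m_{ij} + u_{ij}}{3}.
```

More refined defuzzification schemes (e.g., CFCS) could replace the simple centroid, after which the crisp matrix would enter the standard DEMATEL and ANP pipeline.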

Author Contributions

Conceptualization, P.-Y.S.; methodology, P.-Y.S. and Q.-G.S.; investigation, P.-Y.S. and Z.-Y.Z.; data curation, Z.-Y.Z., Q.-G.S. and P.-Y.L.; writing—original draft preparation, P.-Y.S. and Z.-Y.Z.; writing—review and editing, P.-Y.S., Z.-Y.Z. and Z.L.; supervision, Q.-G.S. and Z.L.; funding acquisition, P.-Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Fujian Province Social Science Planning Basic Research Annual Project in 2022 (FJ2022B033), the Key Special Project for the High-Quality Development of Basic Education in Fujian Province Education Science Planning (FJWTZD21-06), and the Fujian Social Science Foundation (FJ2022BF015).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors are thankful for the help and technical support of other authors and appreciate the financial support provided by the foundations.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Questionnaire on the impact of each indicator.
[Blank 17 × 17 rating grid: experts rate the level of influence of each row indicator (X11–X44) on each column indicator (X11–X44); the diagonal cells are blacked out.]
Note: X11: System Reliability, X12: System Security, X13: System Flexibility, X14: Functional Diversity, X15: Operational Convenience, X21: Esthetics of the Exterior Structure, X22: Clarity of Interface Presentation, X23: Simplicity of Interface Interaction, X24: Degree of Equipment Expansion, X31: Teaching Content Standardization, X32: Teaching Content Applicability, X33: Teaching Content Integrity, X34: Control of the Teaching Process, X41: Regulatory Capability, X42: Voice Interaction Capability, X43: Resource Management Capability, X44: Learning Support Capability. Black cells indicate that there is no need for the questionnaire to ask about the impact of the indicator on itself.

References

  1. Dwivedi, Y.K.; Kshetri, N.; Hughes, L.; Slade, E.L.; Jeyaraj, A.; Kar, A.K.; Baabdullah, A.M.; Koohang, A.; Raghavan, V.; Ahuja, M.; et al. “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges, and implications of generative conversational AI for research, practice, and policy. Int. J. Inf. Manag. 2023, 71, 102642. [Google Scholar] [CrossRef]
  2. Ghatrehsamani, S.; Jha, G.; Dutta, W.; Molaei, F.; Nazrul, F.; Fortin, M.; Bansal, S.; Debangshi, U.; Neupane, J. Artificial Intelligence Tools and Techniques to Combat Herbicide Resistant Weeds—A Review. Sustainability 2023, 15, 1843. [Google Scholar] [CrossRef]
  3. Lu, V.N.; Wirtz, J.; Kunz, W.H.; Paluch, S.; Gruber, T.; Martins, A.; Patterson, P.G. Service robots, customers, and service employees: What can we learn from the academic literature and where are the gaps? J. Serv. Theory Pract. 2020, 30, 361–391. [Google Scholar] [CrossRef]
  4. Hwang, G.J.; Xie, H.; Wah, B.W.; Gašević, D. Vision, challenges, roles and research issues of Artificial Intelligence in Education. Comput. Educ. Artif. Intell. 2020, 1, 100001. [Google Scholar] [CrossRef]
  5. Hsieh, Y.Z.; Lin, S.S.; Luo, Y.C.; Jeng, Y.L.; Tan, S.W.; Chen, C.R.; Chiang, P.Y. ARCS-assisted teaching robots based on anticipatory computing and emotional big data for improving sustainable learning efficiency and motivation. Sustainability 2020, 12, 5605. [Google Scholar] [CrossRef]
  6. Chen, Y.L.; Hsu, C.C.; Lin, C.Y.; Hsu, H.H. Robot-assisted language learning: Integrating artificial intelligence and virtual reality into English tour guide practice. Educ. Sci. 2022, 12, 437. [Google Scholar] [CrossRef]
  7. Touretzky, D.S.; Gardner-McCune, C. Calypso for Cozmo: Robotic AI for everyone. In Proceedings of the 49th ACM Technical Symposium on Computer Science Education, Baltimore, MD, USA, 21–24 February 2018; p. 1110. [Google Scholar]
  8. Chassignol, M.; Khoroshavin, A.; Klimova, A.; Bilyatdinova, A. Artificial Intelligence trends in education: A narrative overview. Procedia Comput. Sci. 2018, 136, 16–24. [Google Scholar] [CrossRef]
  9. Khan, M.A.; Khojah, M.; Vivek. Artificial intelligence and big data: The advent of new pedagogy in the adaptive e-learning system in the higher educational institutions of Saudi Arabia. Educ. Res. Int. 2022, 2022, 1263555. [Google Scholar] [CrossRef]
  10. Woo, H.; LeTendre, G.K.; Pham-Shouse, T.; Xiong, Y. The use of social robots in classrooms: A review of field-based studies. Educ. Res. Rev. 2021, 33, 100388. [Google Scholar] [CrossRef]
  11. Papadopoulos, I.; Lazzarino, R.; Miah, S.; Weaver, T.; Thomas, B.; Koulouglioti, C. A systematic review of the literature regarding socially assistive robots in pre-tertiary education. Comput. Educ. 2020, 155, 103924. [Google Scholar] [CrossRef]
  12. Huang, S. Design and Development of Educational Robot Teaching Resources Using Artificial Intelligence Technology. Int. J. Emerg. Technol. Learn. 2021, 15, 116–129. [Google Scholar] [CrossRef]
  13. Kühnlenz, K.; Sosnowski, S.; Buss, M. Impact of animal-like features on emotion expression of robot head eddie. Adv. Robot. 2010, 24, 1239–1255. [Google Scholar] [CrossRef]
  14. Shaner, D.L.; Beckie, H.J. The future for weed control and technology. Pest Manag. Sci. 2014, 70, 1329–1339. [Google Scholar] [CrossRef]
  15. Levitt, T. The globalization of markets. McKinsey Q. 1983, 2, 69–81. [Google Scholar]
  16. Hiller, J.; Lipson, H. Automatic design and manufacture of soft robots. IEEE Trans. Robot. 2011, 28, 457–466. [Google Scholar] [CrossRef]
  17. Kessler, G. Technology and the future of language teaching. Foreign Lang. Ann. 2018, 51, 205–218. [Google Scholar] [CrossRef]
  18. Tzafestas, C.S.; Palaiologou, N.; Alifragis, M. Virtual and remote robotic laboratory: Comparative experimental evaluation. IEEE Trans. Educ. 2006, 49, 360–369. [Google Scholar] [CrossRef]
  19. Lin, Y.; Qu, Q.; Lin, Y.; He, J.; Zhang, Q.; Wang, C.; Jiang, Z.; Guo, F.; Jia, J. Customizing robot-assisted passive neurorehabilitation exercise based on teaching training mechanism. BioMed Res. Int. 2021, 2021, 9972560. [Google Scholar] [CrossRef] [PubMed]
  20. Hsu, C.C.; Liou, J.J.; Chuang, Y.C. Integrating DANP and modified grey relation theory for the selection of an outsourcing provider. Expert Syst. Appl. 2013, 40, 2297–2304. [Google Scholar] [CrossRef]
  21. Huang, C.N.; Liou, J.J.; Chuang, Y.C. A method for exploring the interdependencies and importance of critical infrastructures. Knowl.-Based Syst. 2014, 55, 66–74. [Google Scholar] [CrossRef]
  22. Hung, Y.H.; Huang, T.L.; Hsieh, J.C.; Tsuei, H.J.; Cheng, C.C.; Tzeng, G.H. Online reputation management for improving marketing by using a hybrid MCDM model. Knowl.-Based Syst. 2012, 35, 87–93. [Google Scholar] [CrossRef]
  23. Crompton, H.; Gregory, K.; Burke, D. Humanoid robots supporting children’s learning in an early childhood setting. Br. J. Educ. Technol. 2018, 49, 911–927. [Google Scholar] [CrossRef]
  24. Zhexenova, Z.; Amirova, A.; Abdikarimova, M.; Kudaibergenov, K.; Baimakhan, N.; Tleubayev, B.; Asselborn, T.; Johal, W.; Dillenbourg, P.; CohenMiller, A.; et al. A comparison of social robot to tablet and teacher in a new script learning context. Front. Robot. AI 2020, 7, 99. [Google Scholar] [CrossRef] [PubMed]
  25. Hashimoto, T.; Kato, N.; Kobayashi, H. Development of educational system with the android robot SAYA and evaluation. Int. J. Adv. Robot. Syst. 2011, 8, 28. [Google Scholar] [CrossRef]
  26. Billard, A.; Kragic, D. Trends and challenges in robot manipulation. Science 2019, 364, eaat8414. [Google Scholar] [CrossRef] [PubMed]
  27. Cheng, Y.W.; Sun, P.C.; Chen, N.S. The essential applications of educational robot: Requirement analysis from the perspectives of experts, researchers, and instructors. Comput. Educ. 2018, 126, 399–416. [Google Scholar] [CrossRef]
  28. Dias, M.B.; Mills-Tettey, G.A.; Nanayakkara, T. Robotics, education, and sustainable development. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 4248–4253. [Google Scholar]
  29. Alam, A. Employing adaptive learning and intelligent tutoring robots for virtual classrooms and smart campuses: Reforming education in the age of artificial intelligence. In Advanced Computing and Intelligent Technologies: Proceedings of ICACIT 2022; Springer: Berlin/Heidelberg, Germany, 2022; pp. 395–406. [Google Scholar]
  30. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954. [Google Scholar] [CrossRef]
  31. Nugent, G.; Barker, B.; Grandgenett, N.; Adamchuk, V.I. Impact of robotics and geospatial technology interventions on youth STEM learning and attitudes. J. Res. Technol. Educ. 2010, 42, 391–408. [Google Scholar] [CrossRef]
  32. Chen, L.; Chen, P.; Lin, Z. Artificial intelligence in education: A review. IEEE Access. 2020, 8, 75264–75278. [Google Scholar] [CrossRef]
  33. Pan, Y. Heading toward artificial intelligence 2.0. Engineering 2016, 2, 409–413. [Google Scholar] [CrossRef]
  34. Peng, S.D. On Robotics Education (above). E-Educ. Res. 2002, 6, 3–7. [Google Scholar] [CrossRef]
  35. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71. [Google Scholar]
  36. Mitnik, R.; Recabarren, M.; Nussbaum, M.; Soto, A. Collaborative robotic instruction: A graph teaching experience. Comput. Educ. 2009, 53, 330–342. [Google Scholar] [CrossRef]
  37. Scaradozzi, D.; Screpanti, L.; Cesaretti, L. Towards a definition of educational robotics: A classification of tools, experiences and assessments. In Smart Learning with Educational Robotics: Using Robots to Scaffold Learning Outcomes; Springer: Berlin/Heidelberg, Germany, 2019; pp. 63–92. [Google Scholar]
  38. Cooney, M.; Leister, W. Using the engagement profile to design an engaging robotic teaching assistant for students. Robotics 2019, 8, 21. [Google Scholar] [CrossRef]
  39. Robaczewski, A.; Bouchard, J.; Bouchard, K.; Gaboury, S. Socially assistive robots: The specific case of the NAO. Int. J. Soc. Robot. 2021, 13, 795–831. [Google Scholar] [CrossRef]
  40. Verner, I.M.; Polishuk, A.; Krayner, N. Science class with RoboThespian: Using a robot teacher to make science fun and engage students. IEEE Robot. Autom. Mag. 2016, 23, 74–80. [Google Scholar] [CrossRef]
  41. Costa, S.; Brunete, A.; Bae, B.C.; Mavridis, N. Emotional storytelling using virtual and robotic agents. Int. J. Hum. Robot. 2018, 15, 1850006. [Google Scholar] [CrossRef]
  42. Garrison, D.R. Critical thinking and adult education: A conceptual model for developing critical thinking in adult learners. Int. J. Lifelong Educ. 1991, 10, 287–303. [Google Scholar] [CrossRef]
  43. Sharkey, A.J. Should we welcome robot teachers? Ethics Inf. Technol. 2016, 18, 283–297. [Google Scholar] [CrossRef]
  44. Smakman, M.; Berket, J.; Konijn, E.A. The impact of social robots in education: Moral considerations of dutch educational policymakers. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication, Naples, Italy, 31 August–4 September 2020; pp. 647–652. [Google Scholar]
  45. Murdoch-Eaton, D.; Whittle, S. Generic skills in medical education: Developing the tools for successful lifelong learning. Med. Educ. 2012, 46, 120–128. [Google Scholar] [CrossRef]
  46. Drigas, A.; Papanastasiou, G.; Skianis, C. The School of the Future: The Role of Digital Technologies, Metacognition and Emotional Intelligence. Int. J. Emerg. Technol. Learn. 2023, 18, 65. [Google Scholar] [CrossRef]
  47. Ahmed, H.; La, H.M. Education-robotics symbiosis: An evaluation of challenges and proposed recommendations. In Proceedings of the 2019 IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, USA, 16 March 2019; pp. 222–229. [Google Scholar]
  48. Koulouriotis, D.E.; Ketipi, M.K. Robot evaluation and selection Part A: An integrated review and annotated taxonomy. Int. J. Adv. Manuf. Technol. 2014, 71, 1371–1394. [Google Scholar] [CrossRef]
  49. Matheson, E.; Minto, R.; Zampieri, E.G.; Faccio, M.; Rosati, G. Human–robot collaboration in manufacturing applications: A review. Robotics 2019, 8, 100. [Google Scholar] [CrossRef]
  50. Büyüközkan, G.; Çifçi, G. A novel hybrid MCDM approach based on fuzzy DEMATEL, fuzzy ANP, and fuzzy TOPSIS to evaluate green suppliers. Expert Syst. Appl. 2012, 39, 3000–3011. [Google Scholar] [CrossRef]
  51. Sakthivel, G.; Ilangkumaran, M.; Gaikwad, A. A hybrid multi-criteria decision modeling approach for the best biodiesel blend selection based on ANP-TOPSIS analysis. Ain Shams Eng. J. 2015, 6, 239–256. [Google Scholar] [CrossRef]
  52. Liu, H.C.; Ren, M.L.; Wu, J.; Lin, Q.L. An interval 2-tuple linguistic MCDM method for robot evaluation and selection. Int. J. Prod. Res. 2014, 52, 2867–2880. [Google Scholar] [CrossRef]
  53. Parameshwaran, R.; Kumar, S.P.; Saravanakumar, K. An integrated fuzzy MCDM based approach for robot selection considering objective and subjective criteria. Appl. Soft Comput. 2015, 26, 31–41. [Google Scholar] [CrossRef]
  54. Sen, D.K.; Datta, S.; Mahapatra, S.S. Extension of PROMETHEE for robot selection decision making: Simultaneous exploration of objective data and subjective (fuzzy) data. Benchmarking 2016, 23, 983–1014. [Google Scholar] [CrossRef]
  55. Saaty, T. The analytic hierarchy process (AHP) for decision making. Kobe Jpn. 1980, 1, 69. [Google Scholar]
  56. Bhattacharya, A.; Sarkar, B.; Mukherjee, S.K. Integrating AHP with QFD for robot selection under requirement perspective. Int. J. Prod. Res. 2005, 43, 3671–3685. [Google Scholar] [CrossRef]
  57. Goh, C.H. Analytic hierarchy process for robot selection. J. Manuf. Syst. 1997, 16, 381–386. [Google Scholar] [CrossRef]
  58. Geng, W.L.; Hu, Y.S. Performance evaluation of robot design based on AHP. Int. J. Database Theory Appl. 2013, 6, 79–88. [Google Scholar]
  59. Wang, C.N.; Nguyen, N.A.T.; Dang, T.T. Offshore wind power station (OWPS) site selection using a two-stage MCDM-based spherical fuzzy set approach. Sci. Rep. 2022, 12, 4260. [Google Scholar] [CrossRef]
  60. Kapoor, V.; Tak, S.S. Fuzzy application to the analytic hierarchy process for robot selection. Fuzzy Optim. Decis. Mak. 2005, 4, 209–234. [Google Scholar] [CrossRef]
  61. Wu, H.Y.; Chen, J.K.; Chen, I.S.; Zhuo, H.H. Ranking universities based on performance evaluation by a hybrid MCDM model. Measurement 2012, 45, 856–880. [Google Scholar] [CrossRef]
  62. Chang, C.W.; Lee, J.H.; Chao, P.Y.; Wang, C.Y.; Chen, G.D. Exploring the possibility of using humanoid robots as instructional tools for teaching a second language in primary school. J. Educ. Technol. Soc. 2010, 13, 13–24. [Google Scholar]
  63. Tsiakas, K.; Karkaletsis, V.; Makedon, F. A taxonomy in robot-assisted training: Current trends, needs, and challenges. In Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, Corfu, Greece, 26–29 June 2018; pp. 208–213. [Google Scholar]
  64. Khaksar, S.M.S.; Khosla, R.; Chu, M.T.; Shahmehr, F.S. Service innovation using social robot to reduce social vulnerability among older people in residential care facilities. Technol. Forecast. Soc. Change 2016, 113, 438–453. [Google Scholar] [CrossRef]
  65. Merlet, J.P. Interval analysis and reliability in robotics. Int. J. Reliab. Saf. 2009, 3, 104–130. [Google Scholar] [CrossRef]
  66. Yoshino, K.; Zhang, S. Construction of assistive teaching robot in programming class. In Proceedings of the 2018 7th International Congress on Advanced Applied Informatics, Yonago, Japan, 8–13 July 2018; pp. 215–220. [Google Scholar]
  67. Yu, M.; Zhou, R.; Cai, Z.; Tan, C.W.; Wang, H. Unravelling the relationship between response time and user experience in mobile applications. Internet Res. 2020, 30, 1353–1382. [Google Scholar] [CrossRef]
  68. Park, S.J.; Han, J.H.; Kang, B.H.; Shin, K.C. Assistive teaching robot, ROBOSEM, in English class and practical issues for its diffusion. In Proceedings of the Advanced Robotics and Its Social Impacts, Menlo Park, CA, USA, 2–4 October 2011; pp. 8–11. [Google Scholar]
  69. Ryu, H.J.; Song, M.J.; Choi, J.G.; Kim, M.S. Visualization of assistive teaching robot’s image based on child’s mental model. Arch. Des. Res. 2007, 20, 177–188. [Google Scholar]
  70. Yang, D.; Oh, E.S.; Wang, Y. Hybrid physical education teaching and curriculum design based on a voice interactive artificial intelligence educational robot. Sustainability 2020, 12, 8000. [Google Scholar] [CrossRef]
  71. Huijnen, C.A.; Lexis, M.A.; Jansens, R.; de Witte, L.P. How to implement robots in interventions for children with autism? A co-creation study involving people with autism, parents, and professionals. J. Autism Dev. Disord. 2017, 47, 3079–3096. [Google Scholar] [CrossRef] [PubMed]
  72. Hsia, C.H.; Lai, C.F.; Su, Y.S. Impact of using ARCS model and problem-based learning on human interaction with robot and motivation. Libr. Hi Tech 2022, 40, 963–975. [Google Scholar] [CrossRef]
  73. Sonderegger, S. Enhancing learning processes by integrating social robots with learning management systems. In Proceedings of the 2022 31st IEEE International Conference on Robot and Human Interactive Communication, Napoli, Italy, 29 August–2 September 2022; pp. 365–370. [Google Scholar]
  74. Wu, W.C.V.; Wang, R.J.; Chen, N.S. Instructional design using an in-house built teaching assistant robot to enhance elementary school English-as-a-foreign-language learning. Interact. Learn. Environ. 2015, 23, 696–714. [Google Scholar] [CrossRef]
  75. Louie, W.Y.G.; Nejat, G. A social robot learning to facilitate an assistive group-based activity from non-expert caregivers. Int. J. Soc. Robot. 2020, 12, 1159–1176. [Google Scholar] [CrossRef]
  76. Hong, Z.W.; Huang, Y.M.; Hsu, M.; Shen, W.W. Authoring robot-assisted instructional materials for improving learning performance and motivation in EFL classrooms. J. Educ. Technol. Soc. 2016, 19, 337–349. [Google Scholar]
  77. Cross, E.S.; Hortensius, R.; Wykowska, A. From social brains to social robots: Applying neurocognitive insights to human-robot interaction. Philos. Trans. R. Soc. B 2019, 374, 20180024. [Google Scholar] [CrossRef]
  78. Lauridsen, K.; Christensen, P.; Kongsø, H.E. Assessment of the reliability of robotic systems for use in radiation environments. Reliab. Eng. Syst. Saf. 1996, 53, 265–276. [Google Scholar] [CrossRef]
  79. Jung, J.G.; Choi, J.H.; Han, J.H. Analysis on children’s response depending on teaching assistant robots’ styles. J. Korean Assoc. Inf. Educ. 2007, 11, 195–203. [Google Scholar]
  80. Hsu, C.H.; Wang, F.K.; Tzeng, G.H. The best vendor selection for conducting the recycled material based on a hybrid MCDM model combining DANP with VIKOR. Resour. Conserv. Recycl. 2012, 66, 95–111. [Google Scholar] [CrossRef]
  81. Govindan, K.; Kannan, D.; Shankar, M. Evaluation of green manufacturing practices using a hybrid MCDM model combining DANP with PROMETHEE. Int. J. Prod. Res. 2015, 53, 6344–6371. [Google Scholar] [CrossRef]
  82. Hung, Y.H.; Chou, S.C.T.; Tzeng, G.H. Knowledge management adoption and assessment for SMEs by a novel MCDM approach. Decis. Support Syst. 2011, 51, 270–291. [Google Scholar] [CrossRef]
  83. Tzeng, G.H.; Chiang, C.H.; Li, C.W. Evaluating intertwined effects in e-learning programs: A novel hybrid MCDM model based on factor analysis and DEMATEL. Expert Syst. Appl. 2007, 32, 1028–1044. [Google Scholar] [CrossRef]
  84. Ordoobadi, S.M. Application of ANP methodology in evaluation of advanced technologies. J. Manuf. Technol. Manag. 2012, 23, 229–252. [Google Scholar] [CrossRef]
  85. Shao, Q.G.; Liou, J.J.; Weng, S.S.; Chuang, Y.C. Improving the green building evaluation system in China based on the DANP method. Sustainability 2018, 10, 1173. [Google Scholar] [CrossRef]
  86. James, G.M.; Sugar, C.A. Clustering for sparsely sampled functional data. J. Am. Stat. Assoc. 2003, 98, 397–408. [Google Scholar] [CrossRef]
  87. DiPasquale, D.; Wheaton, W.C. The markets for real estate assets and space: A conceptual framework. Real Estate Econ. 1992, 20, 181–198. [Google Scholar] [CrossRef]
  88. Yang, C.L.; Chuang, S.P.; Huang, R.H. Manufacturing evaluation system based on AHP/ANP approach for wafer fabricating industry. Expert Syst. Appl. 2009, 36, 11369–11377. [Google Scholar] [CrossRef]
  89. Chung, S.H.; Lee, A.H.; Pearn, W.L. Product mix optimization for semiconductor manufacturing based on AHP and ANP analysis. J. Adv. Manuf. Technol. 2005, 25, 1144–1156. [Google Scholar] [CrossRef]
Figure 1. Quadrant distribution of functional evaluation index elements.
Figure 2. Network of systematic impact and the four aspects’ relationship diagram.
Table 1. Literature sources for the assistive teaching robots’ functional evaluation index system.

| Dimensions | Indicators | Literature Sources |
|---|---|---|
| System Structure (X1) | System Reliability (X11) | [67,68,78] |
| | System Security (X12) | [69,78] |
| | System Flexibility (X13) | [66,68,69] |
| | Functional Diversity (X14) | [6,65,69] |
| | Operational Convenience (X15) | [66,67,68] |
| Appearance interface (X2) | Esthetics of the Exterior Structure (X21) | [12,68,79] |
| | Clarity of Interface Presentation (X22) | [69,70] |
| | Simplicity of Interface Interaction (X23) | [12,63,70] |
| | Degree of Equipment Expansion (X24) | [19,63] |
| Teaching function (X3) | Teaching Content Standardization (X31) | [66,71,72] |
| | Teaching Content Applicability (X32) | [71,73,74] |
| | Teaching Content Integrity (X33) | [66,74] |
| | Control of the Teaching Process (X34) | [71,72,75] |
| Auxiliary Support (X4) | Regulatory Capability (X41) | [68,75] |
| | Voice Interaction Capability (X42) | [66,68,77] |
| | Resource Management Capability (X43) | [76,77] |
| | Learning Support Capability (X44) | [63,77,78] |
Table 2. Total influence matrix.

| Z | X11 | X12 | X13 | X14 | X15 | X21 | X22 | X23 | X24 | X31 | X32 | X33 | X34 | X41 | X42 | X43 | X44 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| X11 | 0.20 | 0.28 | 0.31 | 0.31 | 0.35 | 0.17 | 0.18 | 0.24 | 0.31 | 0.30 | 0.30 | 0.30 | 0.35 | 0.34 | 0.28 | 0.31 | 0.34 |
| X12 | 0.24 | 0.16 | 0.22 | 0.26 | 0.28 | 0.13 | 0.14 | 0.20 | 0.26 | 0.25 | 0.26 | 0.26 | 0.30 | 0.30 | 0.23 | 0.26 | 0.27 |
| X13 | 0.22 | 0.21 | 0.18 | 0.30 | 0.32 | 0.15 | 0.15 | 0.23 | 0.27 | 0.26 | 0.26 | 0.25 | 0.30 | 0.29 | 0.23 | 0.26 | 0.29 |
| X14 | 0.26 | 0.24 | 0.29 | 0.26 | 0.35 | 0.17 | 0.19 | 0.27 | 0.31 | 0.29 | 0.31 | 0.30 | 0.35 | 0.33 | 0.27 | 0.31 | 0.34 |
| X15 | 0.22 | 0.21 | 0.25 | 0.28 | 0.24 | 0.15 | 0.16 | 0.24 | 0.25 | 0.24 | 0.26 | 0.25 | 0.31 | 0.28 | 0.23 | 0.27 | 0.29 |
| X21 | 0.12 | 0.11 | 0.14 | 0.17 | 0.22 | 0.08 | 0.16 | 0.19 | 0.16 | 0.17 | 0.17 | 0.17 | 0.19 | 0.17 | 0.13 | 0.16 | 0.17 |
| X22 | 0.16 | 0.15 | 0.17 | 0.22 | 0.26 | 0.19 | 0.11 | 0.23 | 0.20 | 0.21 | 0.23 | 0.22 | 0.24 | 0.22 | 0.18 | 0.21 | 0.23 |
| X23 | 0.21 | 0.19 | 0.25 | 0.28 | 0.32 | 0.19 | 0.19 | 0.18 | 0.23 | 0.25 | 0.27 | 0.25 | 0.31 | 0.26 | 0.22 | 0.25 | 0.28 |
| X24 | 0.27 | 0.27 | 0.29 | 0.35 | 0.36 | 0.17 | 0.17 | 0.26 | 0.24 | 0.27 | 0.30 | 0.29 | 0.33 | 0.32 | 0.28 | 0.30 | 0.34 |
| X31 | 0.15 | 0.16 | 0.16 | 0.21 | 0.23 | 0.12 | 0.13 | 0.18 | 0.18 | 0.17 | 0.25 | 0.24 | 0.26 | 0.23 | 0.18 | 0.24 | 0.25 |
| X32 | 0.14 | 0.14 | 0.16 | 0.20 | 0.23 | 0.10 | 0.11 | 0.18 | 0.18 | 0.23 | 0.18 | 0.24 | 0.26 | 0.24 | 0.18 | 0.24 | 0.27 |
| X33 | 0.16 | 0.15 | 0.18 | 0.22 | 0.25 | 0.11 | 0.13 | 0.19 | 0.21 | 0.25 | 0.26 | 0.19 | 0.27 | 0.25 | 0.19 | 0.26 | 0.28 |
| X34 | 0.16 | 0.16 | 0.19 | 0.22 | 0.26 | 0.11 | 0.13 | 0.19 | 0.20 | 0.25 | 0.26 | 0.26 | 0.22 | 0.28 | 0.21 | 0.26 | 0.29 |
| X41 | 0.19 | 0.21 | 0.20 | 0.26 | 0.27 | 0.12 | 0.13 | 0.20 | 0.22 | 0.26 | 0.27 | 0.26 | 0.30 | 0.22 | 0.23 | 0.26 | 0.29 |
| X42 | 0.20 | 0.19 | 0.22 | 0.28 | 0.31 | 0.12 | 0.14 | 0.22 | 0.25 | 0.24 | 0.26 | 0.25 | 0.29 | 0.28 | 0.18 | 0.25 | 0.28 |
| X43 | 0.22 | 0.22 | 0.24 | 0.29 | 0.30 | 0.13 | 0.15 | 0.22 | 0.25 | 0.26 | 0.27 | 0.27 | 0.28 | 0.28 | 0.22 | 0.22 | 0.30 |
| X44 | 0.20 | 0.20 | 0.22 | 0.27 | 0.28 | 0.13 | 0.14 | 0.20 | 0.24 | 0.24 | 0.26 | 0.25 | 0.27 | 0.27 | 0.22 | 0.26 | 0.22 |
Table 3. Influence matrix by dimension.

| Z_D | X1 | X2 | X3 | X4 |
|---|---|---|---|---|
| X1 | 0.26 | 0.21 | 0.28 | 0.29 |
| X2 | 0.23 | 0.18 | 0.24 | 0.23 |
| X3 | 0.19 | 0.15 | 0.24 | 0.24 |
| X4 | 0.24 | 0.18 | 0.26 | 0.25 |
Table 4. Comprehensive influence matrix.

| INRM | X11 | X12 | X13 | X14 | X15 | X21 | X22 | X23 | X24 | X31 | X32 | X33 | X34 | X41 | X42 | X43 | X44 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| r | 4.83 | 4.04 | 4.15 | 4.81 | 4.14 | 2.68 | 3.42 | 4.14 | 4.81 | 3.31 | 3.27 | 3.54 | 3.67 | 3.88 | 3.96 | 4.14 | 3.86 |
| c | 3.35 | 3.27 | 3.61 | 4.36 | 4.82 | 2.33 | 2.49 | 3.62 | 3.97 | 4.12 | 4.37 | 4.24 | 4.82 | 4.54 | 3.67 | 4.31 | 4.74 |
| r + c | 8.17 | 7.31 | 7.77 | 9.17 | 8.96 | 5.01 | 5.91 | 7.76 | 8.77 | 7.44 | 7.63 | 7.78 | 8.49 | 8.42 | 7.63 | 8.46 | 8.59 |
| r − c | 1.48 | 0.77 | 0.54 | 0.45 | −0.68 | 0.34 | 0.92 | 0.52 | 0.84 | −0.81 | −1.10 | −0.70 | −1.16 | −0.66 | 0.29 | −0.17 | −0.88 |
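For readability, the quantities in Table 4 can be restated using the standard DEMATEL definitions (this restatement is an added clarification): with T = [t_ij] denoting the total influence matrix of Table 2, the influence degree, influenced degree, centrality, and cause degree of criterion i are

```latex
r_i = \sum_{j=1}^{17} t_{ij}, \qquad
c_i = \sum_{j=1}^{17} t_{ji}, \qquad
r_i + c_i \;(\text{centrality}), \qquad
r_i - c_i \;(\text{cause degree}).
```

For example, summing the displayed X11 row of Table 2 yields a value close to the reported r = 4.83; the small discrepancy arises because the table entries are rounded to two decimals.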
Table 5. Influence index and ranking of each dimension.

| Dimensions/Indicators | Influence Degree/Rank | Influenced Degree/Rank | Centrality/Rank | Cause Degree/Rank |
|---|---|---|---|---|
| System Structure (X1) | 1.034 (1) | 0.907 (3) | 1.941 (1) | 0.128 (2) |
| Appearance interface (X2) | 0.884 (2) | 0.724 (4) | 1.607 (4) | 0.160 (1) |
| Teaching function (X3) | 0.815 (3) | 1.026 (1) | 1.841 (3) | −0.211 (3) |
| Auxiliary Support (X4) | 0.930 (4) | 1.007 (2) | 1.938 (2) | −0.077 (4) |
| System Reliability (X11) | 4.826 (1) | 3.345 (14) | 8.171 (8) | 1.481 (1) |
| System Security (X12) | 4.042 (8) | 3.27 (15) | 7.312 (15) | 0.772 (4) |
| System Flexibility (X13) | 4.152 (4) | 3.615 (13) | 7.767 (10) | 0.537 (5) |
| Functional Diversity (X14) | 4.811 (2) | 4.361 (6) | 9.172 (1) | 0.449 (7) |
| Operational Convenience (X15) | 4.138 (7) | 4.821 (2) | 8.959 (2) | −0.683 (12) |
| Esthetics of the Exterior Structure (X21) | 2.675 (17) | 2.331 (17) | 5.006 (17) | 0.345 (8) |
| Clarity of Interface Presentation (X22) | 3.418 (14) | 2.493 (16) | 5.91 (16) | 0.925 (2) |
| Simplicity of Interface Interaction (X23) | 4.142 (6) | 3.621 (12) | 7.764 (11) | 0.521 (6) |
| Degree of Equipment Expansion (X24) | 4.807 (3) | 3.966 (10) | 8.773 (3) | 0.841 (3) |
| Teaching Content Standardization (X31) | 3.314 (15) | 4.123 (9) | 7.437 (14) | −0.809 (14) |
| Teaching Content Applicability (X32) | 3.266 (16) | 4.367 (5) | 7.632 (12) | −1.101 (16) |
| Teaching Content Integrity (X33) | 3.541 (13) | 4.243 (8) | 7.785 (9) | −0.702 (13) |
| Control of the Teaching Process (X34) | 3.667 (12) | 4.825 (1) | 8.492 (5) | −1.158 (17) |
| Regulatory Capability (X41) | 3.88 (10) | 4.544 (4) | 8.424 (7) | −0.665 (11) |
| Voice Interaction Capability (X42) | 3.96 (9) | 3.666 (11) | 7.625 (13) | 0.294 (9) |
| Resource Management Capability (X43) | 4.143 (5) | 4.313 (7) | 8.456 (6) | −0.17 (10) |
| Learning Support Capability (X44) | 3.857 (11) | 4.735 (3) | 8.592 (4) | −0.879 (15) |
Table 6. Four-quadrant index factor relationship chart.

| Quadrants | Title | Factors (Sorted by Centrality) | Features |
|---|---|---|---|
| 1 | Driving Factor | X11, X12, X13, X14, X23, X24, X42 | Cause degree (r − c) is positive; centrality (r + c) is above 7 |
| 2 | Voluntariness | X21, X22 | Cause degree is positive; centrality is below 7 |
| 3 | Independent | / | Cause degree is negative; centrality is below 7 |
| 4 | Core Problem | X15, X31, X32, X33, X34, X41, X43, X44 | Cause degree is negative; centrality is above 7 |
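The quadrant assignment in Table 6 amounts to a simple rule on the indicator-level results of Table 5: the sign of the cause degree (r − c) separates cause factors from effect factors, and a centrality (r + c) threshold of 7 separates strong factors from weak ones. The short sketch below applies that rule; the dictionary of (centrality, cause degree) pairs is abbreviated and taken from Table 5 for illustration only.

```python
def quadrant(centrality: float, cause_degree: float, threshold: float = 7.0) -> str:
    """Classify a factor according to the rule stated in Table 6."""
    if cause_degree >= 0:
        return "Driving Factor" if centrality >= threshold else "Voluntariness"
    return "Core Problem" if centrality >= threshold else "Independent"

# (centrality, cause degree) pairs for a few indicators, taken from Table 5.
factors = {"X11": (8.171, 1.481), "X22": (5.910, 0.925), "X34": (8.492, -1.158)}
for name, (rc, r_minus_c) in factors.items():
    print(name, quadrant(rc, r_minus_c))
# Expected: X11 -> Driving Factor, X22 -> Voluntariness, X34 -> Core Problem
```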
Table 7. Unweighted supermatrix.

| Z | X11 | X12 | X13 | X14 | X15 | X21 | X22 | X23 | X24 | X31 | X32 | X33 | X34 | X41 | X42 | X43 | X44 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| X11 | 0.14 | 0.20 | 0.19 | 0.22 | 0.25 | 0.19 | 0.20 | 0.27 | 0.34 | 0.24 | 0.24 | 0.24 | 0.28 | 0.27 | 0.22 | 0.24 | 0.27 |
| X12 | 0.21 | 0.14 | 0.19 | 0.22 | 0.24 | 0.18 | 0.19 | 0.27 | 0.36 | 0.23 | 0.25 | 0.24 | 0.28 | 0.28 | 0.22 | 0.25 | 0.26 |
| X13 | 0.18 | 0.17 | 0.15 | 0.24 | 0.26 | 0.19 | 0.19 | 0.29 | 0.33 | 0.24 | 0.24 | 0.23 | 0.28 | 0.27 | 0.22 | 0.24 | 0.27 |
| X14 | 0.19 | 0.17 | 0.20 | 0.18 | 0.25 | 0.18 | 0.20 | 0.29 | 0.33 | 0.23 | 0.25 | 0.24 | 0.28 | 0.26 | 0.21 | 0.25 | 0.27 |
| X15 | 0.19 | 0.17 | 0.21 | 0.23 | 0.20 | 0.19 | 0.20 | 0.30 | 0.31 | 0.23 | 0.25 | 0.24 | 0.29 | 0.26 | 0.22 | 0.25 | 0.27 |
| X21 | 0.16 | 0.15 | 0.18 | 0.22 | 0.28 | 0.14 | 0.27 | 0.32 | 0.27 | 0.24 | 0.25 | 0.24 | 0.27 | 0.27 | 0.21 | 0.25 | 0.27 |
| X22 | 0.17 | 0.16 | 0.17 | 0.23 | 0.27 | 0.26 | 0.15 | 0.32 | 0.28 | 0.24 | 0.25 | 0.24 | 0.27 | 0.26 | 0.21 | 0.25 | 0.28 |
| X23 | 0.17 | 0.15 | 0.20 | 0.23 | 0.26 | 0.24 | 0.24 | 0.23 | 0.29 | 0.23 | 0.25 | 0.23 | 0.29 | 0.26 | 0.21 | 0.25 | 0.28 |
| X24 | 0.17 | 0.17 | 0.19 | 0.23 | 0.23 | 0.20 | 0.21 | 0.31 | 0.28 | 0.22 | 0.25 | 0.25 | 0.28 | 0.26 | 0.23 | 0.24 | 0.27 |
| X31 | 0.17 | 0.18 | 0.17 | 0.23 | 0.25 | 0.19 | 0.21 | 0.30 | 0.30 | 0.18 | 0.27 | 0.27 | 0.28 | 0.25 | 0.20 | 0.26 | 0.28 |
| X32 | 0.17 | 0.16 | 0.18 | 0.23 | 0.26 | 0.18 | 0.19 | 0.31 | 0.32 | 0.25 | 0.20 | 0.27 | 0.29 | 0.26 | 0.20 | 0.26 | 0.29 |
| X33 | 0.17 | 0.16 | 0.18 | 0.23 | 0.26 | 0.18 | 0.20 | 0.30 | 0.32 | 0.25 | 0.27 | 0.19 | 0.28 | 0.25 | 0.20 | 0.26 | 0.28 |
| X34 | 0.16 | 0.16 | 0.19 | 0.22 | 0.26 | 0.17 | 0.20 | 0.30 | 0.32 | 0.25 | 0.27 | 0.26 | 0.22 | 0.27 | 0.20 | 0.25 | 0.28 |
| X41 | 0.17 | 0.19 | 0.18 | 0.23 | 0.24 | 0.17 | 0.19 | 0.30 | 0.34 | 0.24 | 0.25 | 0.24 | 0.27 | 0.22 | 0.23 | 0.26 | 0.29 |
| X42 | 0.17 | 0.16 | 0.19 | 0.23 | 0.26 | 0.17 | 0.19 | 0.30 | 0.34 | 0.24 | 0.25 | 0.24 | 0.28 | 0.28 | 0.18 | 0.25 | 0.29 |
| X43 | 0.17 | 0.17 | 0.19 | 0.23 | 0.23 | 0.17 | 0.20 | 0.29 | 0.34 | 0.24 | 0.25 | 0.25 | 0.26 | 0.28 | 0.22 | 0.21 | 0.30 |
| X44 | 0.17 | 0.17 | 0.19 | 0.23 | 0.24 | 0.18 | 0.20 | 0.28 | 0.34 | 0.24 | 0.25 | 0.25 | 0.27 | 0.28 | 0.23 | 0.27 | 0.23 |
Table 8. Polarized supermatrix.

| Z | X11 | X12 | X13 | X14 | X15 | X21 | X22 | X23 | X24 | X31 | X32 | X33 | X34 | X41 | X42 | X43 | X44 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| X11 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
| X12 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
| X13 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 |
| X14 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| X15 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| X21 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
| X22 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
| X23 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| X24 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| X31 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
| X32 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
| X33 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
| X34 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 |
| X41 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
| X42 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| X43 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
| X44 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 |
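In the DANP procedure, a limit (here termed "polarized") supermatrix such as Table 8 is typically obtained by raising the weighted supermatrix to successive powers until every column converges to the same stable vector; rounded to two decimals, those stable entries correspond to the global weights reported in Tables 9 and 10. The sketch below illustrates this limiting step on a small hypothetical column-stochastic matrix; it does not use the 17 × 17 matrix of this study.

```python
import numpy as np

def limit_supermatrix(W: np.ndarray, tol: float = 1e-9, max_iter: int = 10_000) -> np.ndarray:
    """Raise a column-stochastic weighted supermatrix to its limit (W^k for large k)."""
    current = W.copy()
    for _ in range(max_iter):
        nxt = current @ W
        if np.max(np.abs(nxt - current)) < tol:
            return nxt
        current = nxt
    return current

# Hypothetical 3 x 3 weighted supermatrix (each column sums to 1).
W = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.2, 0.4],
              [0.3, 0.3, 0.3]])
print(np.round(limit_supermatrix(W), 3))  # all columns converge to the same weight vector
```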
Table 9. Functional evaluation impact factor weights.

| Factor | Weight | Rank |
|---|---|---|
| X34 | 0.077 | 1 |
| X44 | 0.076 | 2 |
| X41 | 0.072 | 3 |
| X32 | 0.070 | 4 |
| X43 | 0.069 | 5 |
| X33 | 0.068 | 6 |
| X31 | 0.066 | 7 |
| X24 | 0.063 | 8 |
| X15 | 0.061 | 9 |
| X42 | 0.058 | 10 |
| X23 | 0.058 | 11 |
| X14 | 0.056 | 12 |
| X13 | 0.046 | 13 |
| X11 | 0.042 | 14 |
| X12 | 0.041 | 15 |
| X22 | 0.039 | 16 |
| X21 | 0.036 | 17 |
Table 10. Overall weighting table for functional evaluation impact factors.

| Dimensions | Local Weight | Ranking | Indicators | Local Weight | Ranking | Global Weight |
|---|---|---|---|---|---|---|
| X1 | 0.246 | 3 | X11 | 0.172 | 14 | 0.042 |
| | | | X12 | 0.168 | 15 | 0.041 |
| | | | X13 | 0.186 | 13 | 0.046 |
| | | | X14 | 0.226 | 12 | 0.056 |
| | | | X15 | 0.248 | 9 | 0.061 |
| X2 | 0.196 | 4 | X21 | 0.186 | 17 | 0.036 |
| | | | X22 | 0.201 | 16 | 0.039 |
| | | | X23 | 0.294 | 11 | 0.058 |
| | | | X24 | 0.320 | 8 | 0.063 |
| X3 | 0.281 | 1 | X31 | 0.235 | 7 | 0.066 |
| | | | X32 | 0.249 | 4 | 0.070 |
| | | | X33 | 0.243 | 6 | 0.068 |
| | | | X34 | 0.273 | 1 | 0.077 |
| X4 | 0.276 | 2 | X41 | 0.262 | 3 | 0.072 |
| | | | X42 | 0.211 | 10 | 0.058 |
| | | | X43 | 0.251 | 5 | 0.069 |
| | | | X44 | 0.276 | 2 | 0.076 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
