Article

Characterizing Undergraduate Students’ Systems-Thinking Skills through Agent-Based Modeling Simulation

by Aparajita Jaiswal 1 and Tugba Karabiyik 2,*
1 Center for Intercultural Learning, Mentorship, Assessment and Research, Purdue University, Young Hall, Room 120, 155 S. Grant Street, West Lafayette, IN 47906, USA
2 Purdue Systems Collaboratory, College of Engineering, Purdue University, Grissom Hall, 315 Grant Street, West Lafayette, IN 47906, USA
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(19), 12817; https://doi.org/10.3390/su141912817
Submission received: 7 September 2022 / Revised: 22 September 2022 / Accepted: 3 October 2022 / Published: 8 October 2022

Abstract: Systems thinking is an essential skill for the future workforce. This study focuses on understanding students’ systems-thinking process via an agent-based model simulation and aims to help students improve their systems-thinking skills. We used a systems-thinking skills development framework to investigate and characterize students’ work on an agent-based simulation assignment in an undergraduate-level systems-methods course at a university in the American Midwest. We identified and characterized patterns in students’ systems-thinking processes based on four criteria: thinking, decision making, action, and interpretation. We classified students into three categories based on their systems-thinking abilities and qualitatively identified the least and most prominent patterns the students exhibited.

1. Introduction

Rapid technological advancement and the growing human need for a better world have led to demand for effective systems thinkers [1]. Systems thinking (ST) is a critical cognitive skill for the future workforce: a holistic way of exploring the factors and interactions that could contribute to a possible outcome [2]. Although there are various definitions and perspectives on systems thinking, they all view it as a skill for solving complex problems [3,4,5,6,7,8,9,10,11,12]. Across the globe, systems-thinking skills are considered critical for the 21st century [13]. As people confront complex systems, their problems, and extreme uncertainty, systems-thinking skills have become increasingly important in light of the accelerating pace of change, technology, and information [14,15]. Therefore, developing systems-thinking skills in K-12 and higher education settings is essential. Several researchers have previously focused on the early development of these skills in K-12 settings [16,17,18,19,20].
The development of systems-thinking skills, starting in elementary school, is essential [16]. For example, Assaraf et al. addressed the question of whether elementary school students working on a hydrological system can deal with complex systems [16]. In that study, students engaged in lab simulations and experiments and worked with components and processes of the water cycle in an outdoor learning environment. The study found that most of the students made significant progress in their ability to analyze the components and processes of Earth’s hydrological system. Other research showed that high school students who completed a yearlong systems-based learning program developed systematic mental models and retained the learned material, with learning patterns that tended to remain unchanged over time, even after six years [18].
In addition, research has been conducted on assessing and enhancing systems-thinking skills in higher education using various techniques [21,22,23,24]. However, very limited research has characterized the systems-thinking abilities of learners in an undergraduate-level engineering classroom. This study aims to characterize the systems-thinking skills of such learners by engaging them in an agent-based modeling simulation project. Agent-based modeling and simulation (ABMS) is a computational modeling approach for defining complex systems, their behavior, and their interactions [25]. One of the most popular software systems used in education and research to develop a deeper understanding of agent-based models is NetLogo (version 6.2.2, Evanston, IL, USA) [26,27,28]. NetLogo is a multi-agent modeling environment in which users can build new models or simulate existing models from its Models Library.
To be specific, the intent of this study is to characterize the systems-thinking abilities of junior- and senior-level undergraduate students, using agent-based modeling and simulation via NetLogo software in a systems-methods class. The project was developed on the basis of a systems-thinking skills development framework proposed by Keating et al. [14]. The framework classifies systems-thinking skills into four levels. The first level is the thinking process itself. It is followed by a decision-making level. The third level is proposing or taking action for the system. This leads to the fourth level, which is an interpretation of the outcomes produced by the system. The theoretical framework is explained in detail in the theoretical framework section of this article. The objective of the systems-thinking project is to help students develop and demonstrate their systems-thinking skills.
The main contributions of this study are identifying and characterizing systems-thinking skills via agent-based modeling and simulation in an undergraduate-level systems-methods course. In order to characterize the systems-thinking abilities of the learners, our guiding research question (RQ) for the study was as follows:
RQ: What are the characteristics of students’ systems-thinking approach while working on an agent-based modeling assignment?
This paper is further structured as follows: Section 2 reviews the related background in systems thinking and agent-based modeling and simulation. Section 3 describes the theoretical framework that we used. Section 4 outlines the research methodology. Section 5 describes the results of the study and provides recommendations. Section 6 discusses the results based on other existing studies, followed by a conclusion and a statement of the limitations of the study in Section 7.

2. Background

2.1. Systems Thinking (ST) and ST Skills

A system is a group of interacting, interrelated, and interdependent components that form a complex and unified whole. In 1969, Karl Ludwig von Bertalanffy developed general system theory (GST) in an effort to find unification in science [29]. He opened a new perspective on systems, shifting from traditional linear thinking, which focused on exploring the parts of a system, to seeing the whole [30,31].
The systems-thinking concept was proposed by Jay Forrester in 1956 [32]. Systems thinking is a holistic approach that looks at the big picture of a system in which the parts are interrelated. It is used in many areas, including economics [33], environmental sciences [34], medical sciences [35], and educational sciences [16,17,36]. A systems-thinking framework is considered critical for learning in tomorrow’s world and for preparing people to effectively shape their futures [34]. Systems thinking uses a variety of computer simulations, along with diagrams and graphs, to model, simulate, and predict system behavior. Two standard models for optimizing the understanding of system behavior are system-dynamics models and agent-based models. Arndt [36] used system-dynamics models to enhance systems thinking in higher education and found that integrated learning environments incorporating such models had positive learning effects. In this study, we focus on the use of agent-based modeling and simulation for the development of systems-thinking skills.
Effective systems thinkers understand the interrelationships among different parts of a system to avoid linear cause-and-effect thinking that is focused on obvious diagnostics. The goal of a systems thinker is to apply a holistic view as an approach to problem-solving in light of the big picture. The habits of a systems thinker include the following:
  • understanding the overall picture of the problem;
  • observing how elements change internally within the system and observing any trends and patterns in the system;
  • recognizing the system’s structure that generates its behavior;
  • identifying the pattern of complex cause-and-effect relationships;
  • making meaningful connections within and between systems;
  • changing perspectives to increase understanding;
  • testing assumptions;
  • considering an issue thoroughly and resisting the urge to come to a quick conclusion;
  • recognizing how mental models affect current and future system structures;
  • using the knowledge of system structures to identify possible leverage actions;
  • dealing with short-term, long-term, and unintended consequences of actions;
  • paying attention to accumulations and their rates of change;
  • observing the impact of time and delays when exploring cause-and-effect relationships; and
  • checking results and changing actions if needed, i.e., “successive approximation” [37].
It is essential to develop and enhance systems-thinking skills using learning experiences and computational models that harness these skills. In this study, we aim to operationalize undergraduate students’ systems-thinking skills via agent-based modeling and simulation in a systems-methods class. Thus, in the next section we discuss agent-based modeling and simulation to provide readers with background about these approaches.

2.2. Agent-Based Modeling and Simulation

Agent-based modeling and simulation (ABMS) is a computational modeling approach for simulating the actions and interactions of agents, whether intelligent individuals or groups and organizations [38,39]. Agent-based models allow users to simulate the interactions of multiple agents to predict the emergence of complex system behavior. One popular software system used for modeling and simulation purposes is NetLogo [40,41]. Four types of agents exist in NetLogo: turtles, patches, links, and observers. Turtles are active agents that can move within the environment; they can perform programmed functions during their movements and interact with other turtles and agents. Patches are the square cells of the grid over which turtles move; links connect two turtles; and the observer oversees the simulated world and acts as the interface between the turtles and the researchers. ABMS has also been used to study system behaviors and systems thinking [38,39,42,43,44,45,46,47]. For instance, Abbott and Hadzikadic explored the interactions among fly populations, climate, and the environment [45]. They presented a systems-thinking approach to the problem, described it as a system of systems, and then implemented an agent-based model to explore this complex adaptive system. Holman et al. applied agent-based modeling in the ergonomics discipline using a systems-thinking approach [46]. Because the role of ergonomics is to understand the composition and behavior of systems in order to optimize human wellbeing and overall system performance, computational modeling approaches are crucial for examining the complex and dynamic nature of such systems [48].
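The division of labor among NetLogo’s agent types can be illustrated with a minimal sketch. The following Python toy model is an illustrative stand-in only (it is not NetLogo code and not any model used in the course): turtle-like agents random-walk over a toroidal grid of patches, and an observer-style driver function tallies which patches have been visited:

```python
import random

class Turtle:
    """A minimal 'turtle' agent that random-walks on a toroidal patch grid."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, width, height, rng):
        dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        # Wrap around the edges, mimicking NetLogo's default world topology.
        self.x = (self.x + dx) % width
        self.y = (self.y + dy) % height

def run_model(num_turtles=10, width=20, height=20, ticks=50, seed=42):
    """Observer-style driver: run the model and count distinct patches visited."""
    rng = random.Random(seed)
    turtles = [Turtle(rng.randrange(width), rng.randrange(height))
               for _ in range(num_turtles)]
    visited = {(t.x, t.y) for t in turtles}   # patches touched so far
    for _ in range(ticks):                    # one loop pass = one "tick"
        for t in turtles:
            t.step(width, height, rng)
            visited.add((t.x, t.y))
    return len(visited)
```

Seeding the random generator makes a run reproducible, which is the same property that lets students rerun a NetLogo simulation and compare outcomes across parameter settings.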
Systems-thinking skills can be developed and enhanced using computational modeling, such as agent-based modeling and simulation. In this study, we used an agent-based modeling assignment in a junior-level undergraduate systems-methods course to characterize students’ systems-thinking skills. In addition, using qualitative analysis, we defined the characteristics of expert and novice systems thinkers.

3. Theoretical Framework

This study characterizes systems-thinking skills at four levels: thinking, decision making, action, and interpretation. We used the systems-thinking skills development framework [14] for this study. The framework, shown in Figure 1, suggests that thinking is the first step when exploring a systems-thinking problem; thinking allows the systems thinker to understand the system consciously. The next step is to make informed decisions: rational decision making expands the alternatives available to the systems thinker, allowing a more informed set of decisions in the expanded decision space. The third level is action, which allows the systems thinker to apply the decision to solve a problem and evaluate its impact. Once the decision is translated into action, it calls for the last step, interpretation. Interpretation allows the systems thinker to evaluate the outcome of the thinking, decision, and action taken to solve the problem.
It is important to note that all of these skills are applied in sequence and follow a cyclical path. Since the intent of our study was to characterize learners based on their systems-thinking skills, we used the systems-thinking skills development framework to assess students’ informed levels of thinking, decision-making ability, proposed actions based on the decisions, and interpretation of the actions or explanation of the findings to share perspectives. In this study, the framework also served as the basis for designing the project.

4. Methods

The study followed a descriptive research design [49], which allows researchers to understand the phenomena under consideration. In this study, the design used both quantitative and qualitative approaches to analyze the data. Quantitative data analysis helped to explore the problem, and qualitative data analysis uncovered hidden patterns in the data. To answer the research question, the written reflections of 45 students were scored using rubrics. The data were first analyzed using quantitative approaches, such as clustering, descriptive statistics, and inferential statistics. In the next step, the data were qualitatively analyzed to identify the learning patterns of expert and novice systems thinkers. Finally, the results of the quantitative and qualitative analyses were integrated in the discussion and conclusion sections of this paper.
This Methods section is divided into four broad subsections. The first is about the learning design of the course; the second describes the context and the participants; in the third section, we describe our data collection process; and in the fourth section we detail the steps we used to conduct the quantitative and qualitative data analysis.

4.1. Learning Design

The learning environment of the systems methods course was grounded in a project-based learning approach, which is a student-centered pedagogy that requires students to work on a project for an extended period of time to explore and respond to a complex problem [50,51]. Students first explored and thought about the problem to be solved. Then, they brainstormed about the ideas that were to be decided, using their prior knowledge to guide them in the process. After that, they took action and carried out an investigation to find an optimized solution for the problem. Finally, they presented the findings and interpreted the solution to the problem [14,52]. The course required students to work individually and as a team on several projects. The primary learning outcome was the development of systems-thinking skills using systems methods for analyzing and solving real-world problems while working in a project-based learning environment. In working on the projects, students applied their conceptual knowledge to the model’s requirements to predict the system’s behavior, using computational tools [41,53]. Students first described and understood systems methods from engineering, biological, physical, and social scientific perspectives [45]. They explained the fundamental concepts of systems methods and models, with applications to real-world problems. Then, they developed proficiency in applying computational modeling to represent the behavior of complex systems with real-life applications. Finally, they applied concepts and tools of systems-methods research to develop comprehensive policies/solutions/strategies for the complex system.

4.2. Context and Participants

This study focused on an undergraduate-level systems-methods course offered in the spring semester of 2022, with a total of 45 students. Most of the students were in their senior year of university and were pursuing an engineering degree. The class’s demographic information (gender and academic level) is presented in Table 1. In addition, prior to conducting the study, approval from the university’s Institutional Review Board was obtained. Pseudonyms were used for the students to protect the confidentiality of their responses.
Table 1 provides the gender and academic level demographic information. The class had 20% female and 80% male students enrolled for the spring semester of 2022. The age of the students ranged from 19 to 22 years. Approximately 89% of the students were in their senior year. Thirty-four of the students majored in aeronautics and astronautics engineering; 7 majored in industrial engineering; 1 majored in mechanical engineering; 1 majored in electrical engineering; 1 majored in public health; and 1 majored in exploratory sciences.
The class was conducted face-to-face throughout the spring semester of 2022. In one specific project in the course, students worked on agent-based modeling and simulation, using NetLogo to explore what a model is trying to show or explain. The main objective of this project was to use computer simulation to investigate various models explaining the behavior of complex systems. In the end, the students were able to discuss policies, solutions, and strategies for solving complex problems and to suggest solutions using systems-science tools and systems thinking. In addition, they investigated the agents’ interactions and the overall behavior of the systems shown in the models. For this specific assignment, the students were asked to navigate through the Models Library in NetLogo, read the descriptions of four models (Rabbits Grass Weeds, Ants, Simple Birth Rates, and Thermostat), run a few simulations on the models they found appropriate, and, ultimately, choose one model for the assignment. They were then asked to describe the model of their choice. They were expected to point out the model’s input parameters and to explain the parameters’ roles in the model.
Later, the students were asked to identify a “static” and a “dynamic” question that they would like to address for their system. “Static” refers to analyzing the final state of the system; this question involved assessing the relationship between one input and one system output. “Dynamic” refers to analyzing the system’s state at every step of each simulation. In both cases, the students designed their behavior-space experiments for running simulations to address the system’s static and dynamic questions. Finally, they generated data through the behavior-space experiments and created the plots needed to draw conclusions from their experiments.
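A “static” behavior-space experiment of this kind amounts to a parameter sweep: vary one input, repeat the run several times, and record the system’s final state. The sketch below uses a toy logistic-growth model with a little run-to-run noise as a hypothetical stand-in for a NetLogo model run; the function and parameter names are illustrative only, not the students’ actual models:

```python
import random
import statistics

def simulate_population(growth_rate, rng, steps=30, initial=10.0, capacity=500.0):
    """Toy logistic-growth stand-in for one simulation run of a model."""
    pop = initial
    for _ in range(steps):
        noisy_rate = growth_rate * rng.uniform(0.9, 1.1)  # run-to-run variation
        pop += noisy_rate * pop * (1 - pop / capacity)
    return pop

def static_sweep(rates, repetitions=10, seed=0):
    """Static question: average final population vs. one input parameter."""
    rng = random.Random(seed)
    return {r: statistics.mean(simulate_population(r, rng)
                               for _ in range(repetitions))
            for r in rates}
```

Averaging over repetitions at each parameter value mirrors what students did in their experiments: reducing the variation of individual runs so that the input-output relationship becomes visible in a plot.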

4.3. Data Collection

The data were collected during the fourteenth week of the semester. The students were asked to respond to questions related to the NetLogo assignment. The project’s questions were designed on the basis of the systems-thinking skills development model of Keating et al. [14]. Table 2 provides the questions that were asked as part of the project. The responses to these questions, in the form of learning artifacts, served as the data for the study. Data were collected from 45 students, each of whom responded to the six questions in their assignments; therefore, a total of 270 responses were analyzed for the study.

4.4. Data Analysis Method

Data analysis was conducted in two steps. First, quantitative data were analyzed, followed by a qualitative analysis. A rubric was created, based on the systems-thinking framework for the quantitative analysis. The rubric consisted of four levels and four criteria. The rubric represented four systems-thinking criteria: thinking, decision making, action, and interpretation. These four criteria were derived from Keating’s [14] framework. Based on the framework, we decided to use four levels for systems-thinking learners. Level one was the absence of systems thinking; at this level, learners do not demonstrate systems thinking for the four mentioned criteria. Level two represented novice systems thinkers, learners who demonstrated a basic understanding of systems-thinking concepts for the four mentioned criteria. Level three represented competent systems thinkers, learners who had intermediate knowledge of systems-thinking concepts but lacked expertise. The fourth level represented expert systems thinkers, learners who had an advanced systems-thinking capability. Student responses to the corresponding criteria were read and scored on the basis of the rubrics in Table 3.
Descriptive statistics were then calculated for the four criteria, and the scores were interpreted on the basis shown in Table 4.
Moreover, the quantitative data obtained from the scoring of the reflections acted as an input for clustering analysis. The data were clustered, using a hierarchical clustering algorithm. Hierarchical clustering is a commonly used educational data-mining method that allows groups to be created on the basis of similarities [54]. Specifically, Ward’s method of hierarchical clustering was used to conduct the clustering analysis, as Ward’s method was used in prior studies to cluster smaller samples and was also used in exploratory studies [55,56]. After segregating the students into clusters, inferential statistics (using a Kruskal–Wallis H test with post hoc pair-wise comparisons via a Mann–Whitney U test) were conducted to identify whether there was a significant difference among the clusters. Because the data were ordinal in nature, we used the Kruskal–Wallis H test and the Mann–Whitney U test for the pair-wise comparison [57].
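The quantitative pipeline described above can be sketched as follows. The rubric scores here are synthetic (the group sizes echo the cluster sizes reported later, but the score values are fabricated for illustration), and the code is a minimal sketch of the method, not the study’s actual analysis script:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import kruskal, mannwhitneyu

# Synthetic rubric scores: rows = students, columns = thinking, decision
# making, action, interpretation (1-4 scale). Values are illustrative only.
rng = np.random.default_rng(0)
scores = np.vstack([
    rng.integers(3, 5, size=(18, 4)),   # higher scorers (3s and 4s)
    rng.integers(2, 4, size=(20, 4)),   # moderate scorers (2s and 3s)
    rng.integers(1, 2, size=(7, 4)),    # low scorers (all 1s)
])

# Ward's method of hierarchical clustering, cut into three clusters.
labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")

# Kruskal-Wallis H test on one criterion (column 0, "thinking") across clusters.
groups = [scores[labels == k, 0] for k in np.unique(labels)]
h_stat, p_value = kruskal(*groups)

# Post hoc pair-wise Mann-Whitney U test between two of the clusters.
u_stat, u_p = mannwhitneyu(groups[0], groups[1])
```

Nonparametric tests are used here for the same reason the study gives: rubric scores are ordinal, so rank-based comparisons are more appropriate than t tests or ANOVA.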

4.5. Qualitative Analysis

The data were qualitatively analyzed in the next phase using inductive thematic analysis. The inductive thematic analysis for this study followed the steps delineated by Braun and Clarke [58]: (1) becoming familiar with the data; (2) creating initial codes; (3) searching for themes; (4) evaluating the themes; (5) describing and naming the themes; and (6) writing the report. The thematic analysis aimed to identify the learning patterns or behaviors of the students with high systems-thinking abilities, compared with students who demonstrated low levels of systems thinking. The comparison allowed us to understand the learning patterns of students in the two extreme groups.

5. Results

5.1. Quantitative Analysis

The results section is divided into two broad subsections: quantitative results and qualitative results. The quantitative results subsection is further divided into three categories: overall descriptive statistics, the results of the clustering analysis, and the results of the Kruskal–Wallis H test and post hoc Mann–Whitney U test. The qualitative results section is divided into two categories: learning characteristics of expert systems thinkers and learning characteristics of novice systems thinkers.

Overall Descriptive Statistics

Table 5 provides the mean and the spread for the four criteria. Based on the results, we concluded that, overall, the students demonstrated a high level of thinking ability and moderate levels of decision making, action, and interpretation abilities.

5.2. Results of Clustering

Descriptive Statistics, Cluster-Wise

The cluster analysis allowed us to take a deeper look into the systems-thinking ability of the students for the four constructs. Table 6 shows that the students in Cluster 1 demonstrated high levels of thinking, decision making, action, and interpretation abilities; they were referred to as expert systems thinkers. Students in Cluster 2 demonstrated a high level of thinking and moderate levels of decision making, action, and interpretation abilities; they were considered competent systems thinkers. Students in Cluster 3 demonstrated low levels of ability for all four constructs; they were referred to as novice systems thinkers.

5.3. Kruskal–Wallis H Test Results

The Kruskal–Wallis H test was used to identify whether there was a significant difference in the thinking, decision making, action, and interpretation abilities of students in the three clusters. The subsequent subsection provides the results of the Kruskal–Wallis H-Test and a post hoc Mann–Whitney U test for the four criteria of systems thinking.

5.3.1. Thinking Ability

(a) Results of the Kruskal–Wallis H test: The Kruskal–Wallis H test was conducted to compare the thinking ability of Cluster 1, Cluster 2, and Cluster 3 students. There was a significant difference in the thinking abilities of the three clusters at the p < 0.05 level [H(2) = 16.31, p < 0.001]. Students in Cluster 3 (Mdn = 1) demonstrated a lower thinking ability than students in Cluster 1 (Mdn = 3) and Cluster 2 (Mdn = 2).
(b) Results of the post hoc Mann–Whitney U test: The Mann–Whitney U test was used to compare the groups. The results indicated that the thinking ability of the Cluster 3 students was significantly lower than that of the Cluster 1 students (U(NC3 = 7, NC1 = 18) = 21.37, p < 0.01) and the Cluster 2 students (U(NC3 = 7, NC2 = 20) = 14.35, p < 0.01). Moreover, there was no significant difference in the thinking abilities of the students in Cluster 1 and Cluster 2 (U(NC1 = 18, NC2 = 20) = 7.02, p = 0.07).

5.3.2. Decision-Making Ability

(a) Results of the Kruskal–Wallis H test: The Kruskal–Wallis H test was conducted to compare the decision-making ability of Cluster 1, Cluster 2, and Cluster 3 students. There was a significant difference in the decision-making abilities of the three clusters at the p < 0.05 level [H(2) = 23.90, p < 0.001]. Students in Cluster 3 (Mdn = 1) demonstrated a lower decision-making ability than students in Cluster 1 (Mdn = 3) and Cluster 2 (Mdn = 2).
(b) Results of the post hoc Mann–Whitney U test: The Mann–Whitney U test was used to compare all the groups. The results indicated that the decision-making ability of the Cluster 3 students was significantly lower than that of the Cluster 1 students (U(NC3 = 7, NC1 = 18) = 25.61, p < 0.01) and the Cluster 2 students (U(NC3 = 7, NC2 = 20) = 14.07, p < 0.01). Moreover, the students in Cluster 1 demonstrated a significantly higher decision-making ability than the Cluster 2 students (U(NC1 = 18, NC2 = 20) = 11.53, p < 0.01).

5.3.3. Action-Taking Ability

(a) Results of the Kruskal–Wallis H test: The Kruskal–Wallis H test was conducted to compare the action-taking ability of Cluster 1, Cluster 2, and Cluster 3 students. There was a significant difference in the action-taking abilities of the three clusters at the p < 0.05 level [H(2) = 35.25, p < 0.001]. Students in Cluster 3 (Mdn = 1) demonstrated a lower action-taking ability than students in Cluster 1 (Mdn = 3) and Cluster 2 (Mdn = 1.5).
(b) Results of the post hoc Mann–Whitney U test: The Mann–Whitney U test was used to compare all the groups. The results indicated that the action-taking ability of the Cluster 3 students was significantly lower than that of the Cluster 1 students (U(NC3 = 7, NC1 = 18) = 30.05, p < 0.01) and the Cluster 2 students (U(NC3 = 7, NC2 = 20) = 18.86, p < 0.01). Moreover, the students in Cluster 1 demonstrated a significantly higher action-taking ability than the Cluster 2 students (U(NC1 = 18, NC2 = 20) = 11.2, p = 0.04).

5.3.4. Interpretation Ability

(a) Results of the Kruskal–Wallis H test: The Kruskal–Wallis H test was conducted to compare the interpretation ability of Cluster 1, Cluster 2, and Cluster 3 students. There was a significant difference in the interpretation abilities of the three clusters at the p < 0.05 level [H(2) = 37.70, p < 0.001]. Students in Cluster 3 (Mdn = 1) demonstrated a lower interpretation ability than students in Cluster 1 (Mdn = 3) and Cluster 2 (Mdn = 1).
(b) Results of the post hoc Mann–Whitney U test: The Mann–Whitney U test was used to compare all the groups. The results indicated that the interpretation ability of the Cluster 3 students was significantly lower than that of the Cluster 1 students (U(NC3 = 7, NC1 = 18) = 30.50, p < 0.01) and the Cluster 2 students (U(NC3 = 7, NC2 = 20) = 10.80, p < 0.01). Moreover, the students in Cluster 1 demonstrated a significantly higher interpretation ability than the students in Cluster 2 (U(NC1 = 18, NC2 = 20) = 19.7, p < 0.01).

5.4. Qualitative Analysis

5.4.1. Characterizing Expert Systems Thinkers

The assignments of students from Cluster 1 were thematically analyzed to identify the characteristics of expert systems thinkers. The themes presented in the following paragraphs represent the patterns demonstrated by the students in Cluster 1.
Pattern 1, detailed and clear explanation of the model: The expert systems thinkers (Cluster 1) were detail-oriented. They provided in-depth responses to the questions, with examples and figures. They were clear, and they logically defended their choice of model. For example, Jose provided a clear explanation of the selected model: “The model being investigated for this assignment is the thermostat model. A thermostat is a feedback control device that, when fed with the input of room temperature, responds to the temperature to maintain the room temperature at the desired level by turning the heater on and off at appropriate times.” Jose also created a figure and explained the working of the model, for example: “As shown in Figure 1, the red turtles represent heat, and the yellow border represents the room being heated. This yellow border is semi-permeable, allowing some of the heat to escape the room. The heat disappears when reached the edge of the display. The heat, represented by white, is located at the center of the room and the thermometer, represented by green, measures the approximate temperature of the room.”
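The feedback loop Jose describes (a heater toggled on and off to hold a room near a desired temperature, with some heat escaping through the walls) can be sketched as a simple bang-bang controller. The toy simulation below is illustrative only; the parameter names and constants (heater output, ambient temperature, insulation fraction) are assumptions for the sketch, not values taken from NetLogo’s Thermostat model:

```python
def run_thermostat(desired=70.0, start=60.0, insulation=0.9,
                   ambient=50.0, heater_output=3.0, ticks=100):
    """Bang-bang thermostat: heater on below the setpoint, off at or above it."""
    temps = []
    temp = start
    for _ in range(ticks):
        if temp < desired:                # thermostat decision: heater on
            temp += heater_output         # heater adds heat to the room
        # Some heat leaks through the semi-permeable walls toward ambient.
        temp -= (1.0 - insulation) * (temp - ambient)
        temps.append(temp)
    return temps
```

Running the sketch shows the behavior Jose's explanation predicts: the temperature climbs toward the setpoint and then oscillates in a narrow band around it as the heater switches on and off.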
Pattern 2, provided a rationale for choosing the variables/out-of-the-box thinking: The expert systems thinkers selected the variables for their model and provided a rationale for how the selected variables impacted the model. For example, Nori chose the variables for her model and described how changing the variables impacted the model: “Birth Threshold: Sets the energy level at which the rabbits produce. By increasing this, each rabbit will have to consume more food to reproduce than before and the vice versa is true if you lower this input.” Some students also demonstrated out-of-the-box thinking: they did not just select the input variables but went a step further, identifying and discussing the output variables. For example, Jose mentioned current temperature as an output variable but added an extra note on how it can act as both an input and an output variable: “current-temperature: this is the temperature the thermostat achieves by controlling the heater. Note: current-temperature acts as both an input and an output. It is an input for the thermostat to respond, and it is also the output that the thermostat gives by turning the heater on and off.”
Pattern 3, use of prior knowledge: The students demonstrated the ability to use their prior knowledge while working on the action and interpretation ability questions. For example, Jen referred back to the knowledge she gained in a freshman course and used that knowledge while proposing the action for the model: “Based on my knowledge from freshmen courses and the idea of this model, the maximum temperature should be low, the minimum temperature should be high, the range should be small, the mean should be close to the desired temperature, and the median should be close to the desired temperature, the mode should be close to the desired temperature, and the standard deviation should be low. These conditions will determine the best insulation percentage.”
Pattern 4, problem-solving ability: Expert systems thinkers demonstrated problem-solving ability: they were able to identify the model, decide on parameters, propose actions, and interpret the results. They took a logical approach to solving the problems and used graphs to demonstrate and interpret the findings. For example, Jose created two graphs to represent the static and dynamic models and to explain his model. Figure 2 demonstrates the static model, and Jose provided an explanation for that model: “Based upon the data shown in the graph above it is clear that for each weed growth rate the data can vary by up 50. That being said by utilizing 10 different samples at each growth rate it is possible to decrease the overall variation because the average of the 10 population values can be taken and a linear fit line can be used to demonstrate the relationship.”
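The experiment design Jose describes (averaging ten noisy runs per parameter value, then fitting a line) can be sketched as follows. The “model” here is a stand-in (a linear trend plus uniform noise with a spread of about 50, matching the variation he reports), not the actual weeds model; every name and constant is an assumption for illustration.

```python
import random

def run_model(growth_rate, rng):
    """Stand-in for one simulation run: linear trend plus noise."""
    return 20.0 * growth_rate + rng.uniform(-25.0, 25.0)

def sweep(rates, reps=10, seed=1):
    """Average `reps` noisy runs at each parameter value (Jose's strategy
    for reducing the run-to-run variation before fitting a line)."""
    rng = random.Random(seed)
    return [sum(run_model(r, rng) for _ in range(reps)) / reps for r in rates]

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

rates = [1, 2, 3, 4, 5]
slope, intercept = fit_line(rates, sweep(rates))
```

Averaging repeated runs shrinks the noise on each point by roughly the square root of the number of repetitions, which is why the fitted slope recovers the underlying trend despite the large per-run variation.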

5.4.2. Characterizing the Novice Systems Thinkers

The student assignments from Cluster 3 were analyzed using thematic analysis; the patterns observed are discussed in the following paragraphs. In general, the assignments of the novice systems thinkers did not demonstrate in-depth thinking, decision-making, action, or interpretation abilities. They were brief and lacked metacognition.
Pattern 1, unclear explanation of the model: Novice systems thinkers provided an unclear explanation and a very brief description of the model. They did select a model but found it difficult to provide a rational explanation of why the model was appropriate for the assignment. For example, Kristy simply copied Figure 3 and provided a one-line explanation of the figure: “Varying birth rates of red and blue populations, with a roughly constant total value.” Kristy did not attempt to explain why that particular model was selected. The students did select some variables for the model but failed to describe how those variables impacted the selected model. For example, Kristy chose the following: “Blue fertility rate. The rate of birth of the blue population, red fertility rate. The rate of birth of the red population.” Kristy mentioned that the two colors, red and blue, represented the fertility rates of the two populations, but she did not mention how those parameters impacted the model.
Pattern 2, questions left unanswered: Novice systems thinkers did not respond to many questions in the assignment or provided incomplete responses. For example, Susy did not answer the questions related to action and interpretation.
Overall, the results of the qualitative analysis helped us understand the differences between the learning patterns of expert and novice systems thinkers.

6. Discussion

The study was conducted in the context of the systems-methods course, and the quantitative results allowed us to assign the students to three categories: expert systems thinkers, competent systems thinkers, and novice systems thinkers. The analysis of the quantitative results, based on the systems-thinking skills development framework, revealed that 40% of the students were expert systems thinkers, 44% were competent systems thinkers, and just 16% were novice systems thinkers. Drawing on their theoretical framework, Keating et al. [14] described four levels of systems-thinking skills: thinking, decision making, action, and interpretation; a systems thinker who masters all four is a skilled systems thinker. Based on our results, the expert systems thinkers fell into this category of skilled systems thinkers. Our results also add to the body of literature by identifying two additional categories: competent systems thinkers and novice systems thinkers. Based on our analysis, competent systems thinkers had a high level of thinking ability and moderate levels of decision-making, action, and interpretation abilities, while novice systems thinkers demonstrated low levels of all four abilities.
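The three-way categorization above can be expressed as a small helper. This is our illustrative reconstruction, not code from the study; only the mean-score cut-offs are taken from Table 4, while the function and its names are assumptions.

```python
def categorize(scores):
    """Map a student's four rubric scores (thinking, decision making, action,
    interpretation; each 0-3) to a systems-thinker category via their mean,
    using the cut-offs reported in Table 4: >= 2.25 expert, 1.51-2.24
    competent, <= 1.5 novice."""
    mean = sum(scores) / len(scores)
    if mean >= 2.25:
        return "expert"
    if mean > 1.5:
        return "competent"
    return "novice"

label = categorize([3, 3, 2, 2])   # mean of 2.5 falls in the expert band
```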
We also wanted to investigate the difference between the learning patterns of students with high systems-thinking abilities (the expert systems thinkers) and students with low systems-thinking abilities (the novice systems thinkers). The expert systems thinkers demonstrated critical-thinking skills and a rational thought process, and reflected on past experiences while working on problems, whereas the novice systems thinkers were poor reflectors and lacked the ability to communicate their findings. Expert systems thinkers were able to view the problem holistically and to establish connections among the various variables in the model [3]. Murawski [59] discussed the role of critical-thinking skills in the classroom and beyond, comparing the problem-solving abilities of critical and non-critical thinkers. That study found that critical thinkers are effective problem solvers: they follow a systematic approach and spend time thinking and deciding before proposing a solution. Non-critical thinkers are ineffective problem solvers: they approach the problem haphazardly and, lacking background knowledge, spend less time thinking it through. We observed similar behaviors when comparing the learning patterns of expert and novice systems thinkers. The expert systems thinkers demonstrated a logical and sequential method of approaching the problems, while the novice systems thinkers worked on the problems they found easy and left the others unanswered.
It is worth noting that expert systems thinkers used their prior knowledge in solving the problems. This aligns well with the study by Jaiswal et al. [60], which demonstrated that active reflectors used metacognition (prior knowledge) to plan, evaluate, and infer the results of experiments, whereas inactive reflectors did not demonstrate metacognition and were very brief in explaining their results. McKim and McKendree [61] studied the relationship between metacognition, problem solving, and systems thinking. Their results demonstrated that systems thinking was a significant predictor of student problem-solving behaviors, and metacognition was a significant predictor of systems-thinking abilities. They therefore proposed identifying ways to help students develop metacognition and systems-thinking abilities, as those skills could help in solving complex problems in the future. In our study, we found that expert and competent reflectors demonstrated high-to-moderate systems-thinking skills and metacognition; it is noteworthy that the course helped the students develop systems-thinking skills, as 84% of the students were either expert or competent thinkers.

Implications for Teaching and Learning

The results of this study suggest that helping students develop systems-thinking skills will also help them develop metacognitive, reflective, and problem-solving skills, which are crucial for engineering students [62]. The study emphasized that engineering problems must be designed in a way that requires students to think and decide before proposing actions. Moreover, engaging students in critical reflection based on predefined prompts will help them think critically and make rational decisions. Because interpretation is the highest level of systems-thinking skill, a conscious effort from instructors is required to help students develop their interpretation skills. Instructors must focus on engaging students in structured problem solving, as that will allow them to develop critical thinking and creativity [59]. In addition, the use of computational models and simulation tools in systems thinking creates a natural environment for students to think, decide, act, and interpret results [46]. We strongly recommend that instructors consider including computational models and simulations in their classes to improve students’ systems-thinking skills. Moreover, informing students about the four stages of the systems-thinking framework might increase their awareness as informed systems thinkers and eventually build habits of systems thinking.

7. Conclusions, Limitations, and Future Work

This study focused on identifying and characterizing patterns in students’ systems-thinking processes through challenges involving a computational model and simulation. We classified the students into three groups, expert, competent, and novice systems thinkers, using the systems-thinking skills development framework categories of thinking, decision making, action, and interpretation. The results demonstrated that a structured and theoretically grounded curriculum can help students develop systems-thinking skills. The study also revealed that helping students develop systems-thinking abilities will assist them in becoming critical thinkers and problem solvers. Helping students inculcate systems-thinking abilities will also contribute to an engineering workforce of rational thinkers able to solve complex problems in the future.
This study has the following limitations. We used a rubric to evaluate student assignments, and no survey was used to collect the data. In addition, only one of the assignments in the course was used to analyze the systems-thinking skills that students developed over a semester of working in a project-based learning environment. Conducting interviews could provide deeper insights into how students developed their thinking, decision-making, action, and interpretation skills. In future work, we plan to explore students’ systems-thinking skill development using a survey, followed by in-depth interviews to collect a richer set of data. The intent of the survey will be to characterize the systems-thinking approach of engineering students; follow-up interviews will probe their thought processes while they solve a problem through systems thinking.

Author Contributions

Conceptualization, T.K. and A.J.; methodology, T.K. and A.J.; software, T.K. and A.J.; validation, T.K. and A.J.; formal analysis, T.K. and A.J.; investigation, T.K. and A.J.; resources, T.K.; data curation, A.J.; writing—original draft preparation, T.K. and A.J.; writing—review and editing, T.K. and A.J.; visualization, A.J.; supervision, T.K. and A.J.; project administration, T.K. and A.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Purdue University (protocol code: IRB- 2022-254 and date of approval: 3 September 2022).

Informed Consent Statement

Informed consent was waived on the basis of an exempted Institutional Review Board approval, as the study was conducted in established or commonly accepted educational settings that involved regular educational practices.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available to protect the privacy and confidentiality of the respondents.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Gilissen, M.G.; Knippels, M.-C.P.; van Joolingen, W.R. Bringing systems thinking into the classroom. Int. J. Sci. Educ. 2020, 42, 1253–1280.
2. Morganelli, D.M. What is Systems Thinking? Available online: https://www.snhu.edu/about-us/newsroom/business/what-is-systems-thinking (accessed on 28 July 2022).
3. Arnold, R.D.; Wade, J.P. A definition of systems thinking: A systems approach. Procedia Comput. Sci. 2015, 44, 669–678.
4. Hossain, N.U.I.; Dayarathna, V.L.; Nagahi, M.; Jaradat, R. Systems thinking: A review and bibliometric analysis. Systems 2020, 8, 23.
5. Kenett, R.S.; Swarz, R.S.; Zonnenshain, A. Systems Engineering in the Fourth Industrial Revolution: Big Data, Novel Technologies, and Modern Systems Engineering; John Wiley & Sons: Hoboken, NJ, USA, 2019.
6. Shaked, H.; Schechter, C.; Ganon-Shilon, S.; Goldratt, M. Systems Thinking for School Leaders; Springer: Berlin/Heidelberg, Germany, 2017.
7. Checkland, P. Systems Thinking, Systems Practice; John Wiley & Sons: Hoboken, NJ, USA, 1981.
8. Flood, R.L.; Carson, E. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science; Springer Science & Business Media: Berlin/Heidelberg, Germany, 1993.
9. Von Bertalanffy, L. The history and status of general systems theory. Acad. Manag. J. 1972, 15, 407–426.
10. Stroh, D.P. Systems Thinking for Social Change: A Practical Guide to Solving Complex Problems, Avoiding Unintended Consequences, and Achieving Lasting Results; Chelsea Green Publishing: Hartford, VT, USA, 2015.
11. Ackoff, R.L. Ackoff’s Best: His Classic Writings on Management; John Wiley & Sons: Hoboken, NJ, USA, 1999.
12. Haines, S.G. The Manager’s Pocket Guide to Systems Thinking & Learning; Human Resource Development: Amherst, MA, USA, 1998.
13. Meadows, D.H.; Wright, D. Thinking in Systems: A Primer; Chelsea Green: White River Junction, VT, USA, 2008.
14. Keating, C.B.; Katina, P.F.; Jaradat, R.; Bradley, J.M.; Hodge, R. Systems Thinking: A Critical Skill for Systems Engineers. INCOSE Int. Symp. 2021, 31, 522–536.
15. Colvile, R. The Great Acceleration: How the World is Getting Faster, Faster; Bloomsbury Publishing USA: New York, NY, USA, 2016.
16. Assaraf, O.B.-Z.; Orion, N. System thinking skills at the elementary school level. J. Res. Sci. Teach. 2010, 47, 540–563.
17. Assaraf, O.B.-Z.; Orion, N. Development of system thinking skills in the context of earth system education. J. Res. Sci. Teach. 2005, 42, 518–560.
18. Ben-Zvi-Assaraf, O.; Orion, N. Four case studies, six years later: Developing system thinking skills in junior high school and sustaining them over time. J. Res. Sci. Teach. 2010, 47, 1253–1280.
19. Evagorou, M.; Korfiatis, K.; Nicolaou, C.; Constantinou, C. An investigation of the potential of interactive simulations for developing system thinking skills in elementary school: A case study with fifth-graders and sixth-graders. Int. J. Sci. Educ. 2009, 31, 655–674.
20. Lee, H.-N.; Kwon, Y.-J.; Oh, H.-J.; Lee, H.-D. Development and application of the educational program to increase high school students’ systems thinking skills-Focus on global warming. J. Korean Earth Sci. Soc. 2011, 32, 784–797.
21. Huang, S.; Muci-Kuchler, K.H.; Bedillion, M.D.; Ellingsen, M.D.; Degen, C.M. Systems thinking skills of undergraduate engineering students. In Proceedings of the 2015 IEEE Frontiers in Education Conference (FIE), El Paso, TX, USA, 21–24 October 2015; pp. 1–5.
22. Connell, K.Y.H.; Remington, S.M.; Armstrong, C.M. Assessing systems thinking skills in two undergraduate sustainability courses: A comparison of teaching strategies. J. Sustain. Educ. 2012, 3.
23. Hung, W. Enhancing systems-thinking skills with modelling. Br. J. Educ. Technol. 2008, 39, 1099–1120.
24. Ison, R. Applying systems thinking to higher education. Syst. Res. Behav. Sci. 1999, 16, 107–112.
25. Wilensky, U. NetLogo (and NetLogo User Manual); Center for Connected Learning and Computer-Based Modeling, Northwestern University, 1999. Available online: http://ccl.northwestern.edu/netlogo/ (accessed on 29 July 2022).
26. Tong, X.; Nikolic, I.; Dijkhuizen, B.; van den Hoven, M. Behaviour change in post-consumer recycling: Applying agent-based modelling in social experiment. J. Clean. Prod. 2018, 187, 1006–1013.
27. Tisue, S.; Wilensky, U. NetLogo: A simple environment for modeling complexity. Int. Conf. Complex Syst. 2004, 21, 16–21.
28. Tisue, S.; Wilensky, U. NetLogo: Design and implementation of a multi-agent modeling environment. Proc. Agent 2004, 2004, 7–9.
29. Zhang, B.H.; Ahmed, S.A.M. Systems Thinking—Ludwig Von Bertalanffy, Peter Senge, and Donella Meadows. In Science Education in Theory and Practice: An Introductory Guide to Learning Theory; Akpan, B., Kennedy, T.J., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 419–436.
30. Von Bertalanffy, L. General Theory of Systems; George Braziller: New York, NY, USA, 1969.
31. Von Bertalanffy, L. General systems theory and psychiatry—An overview. Gen. Syst. Theory Psychiatry 1969, 32, 33–46.
32. Forrester, J.W. System dynamics, systems thinking, and soft OR. Syst. Dyn. Rev. 1994, 10, 245–256.
33. Irijanto, T.T.; Zaidi, M.A.S.; Ismail, A.G.; Arshad, N.C. Al Ghazali’s thoughts of economic growth theory, a contribution with system thinking. Sci. J. PPI-UKM 2015, 2, 233–240.
34. Zoller, U. Environmental education and the university: The ‘problem solving-decision making act’ within a critical system-thinking framework. High. Educ. Eur. 1990, 15, 5–14.
35. Aboumatar, H.J.; Thompson, D.; Wu, A.; Dawson, P.; Colbert, J.; Marsteller, J.; Kent, P.; Lubomski, L.H.; Paine, L.; Pronovost, P. Development and evaluation of a 3-day patient safety curriculum to advance knowledge, self-efficacy and system thinking among medical students. BMJ Qual. Saf. 2012, 21, 416–422.
36. Arndt, H. Enhancing system thinking in education using system dynamics. Simulation 2006, 82, 795–806.
37. Habits of a Systems Thinker. STEMAZing Systems Thinking, 8 July 2017. Available online: https://stemazing.org/habits-of-a-systems-thinker/ (accessed on 5 August 2022).
38. Macal, C.M.; North, M.J. Tutorial on agent-based modeling and simulation. In Proceedings of the Winter Simulation Conference, Orlando, FL, USA, 4 December 2005; p. 14.
39. Macal, C.M.; North, M.J. Agent-based modeling and simulation. In Proceedings of the 2009 Winter Simulation Conference (WSC), Austin, TX, USA, 13–16 December 2009; pp. 86–98.
40. NetLogo Home Page. Available online: https://ccl.northwestern.edu/netlogo/ (accessed on 29 July 2022).
41. Wilensky, U.; Rand, W. An Introduction to Agent-Based Modeling: Modeling Natural, Social, and Engineered Complex Systems with NetLogo; MIT Press: Cambridge, MA, USA, 2015.
42. Parunak, H.V.D.; Savit, R.; Riolo, R.L. Agent-based modeling vs. equation-based modeling: A case study and users’ guide. In International Workshop on Multi-Agent Systems and Agent-Based Simulation; Springer: Berlin/Heidelberg, Germany, 1998; pp. 10–25.
43. Bandini, S.; Manzoni, S.; Vizzari, G. Agent based modeling and simulation: An informatics perspective. J. Artif. Soc. Soc. Simul. 2009, 12, 4.
44. Macal, C.; North, M. Introductory tutorial: Agent-based modeling and simulation. In Proceedings of the Winter Simulation Conference, Savannah, GA, USA, 7–10 December 2014; pp. 6–20.
45. Abbott, R.; Hadžikadić, M. Complex adaptive systems, systems thinking, and agent-based modeling. In Advanced Technologies, Systems, and Applications; Springer: Berlin/Heidelberg, Germany, 2017; pp. 1–8.
46. Holman, M.; Walker, G.; Lansdown, T.; Hulme, A. Radical systems thinking and the future role of computational modelling in ergonomics: An exploration of agent-based modelling. Ergonomics 2020, 63, 1057–1074.
47. Hulme, A.; Mclean, S.; Salmon, P.M.; Thompson, J.; Lane, B.R.; Nielsen, R.O. Computational methods to model complex systems in sports injury research: Agent-based modelling (ABM) and systems dynamics (SD) modelling. Br. J. Sports Med. 2019, 53, 1507–1510.
48. Read, G.J.M.; Salmon, P.M.; Thompson, J.; McClure, R.J. Simulating the behaviour of complex systems: Computational modelling in ergonomics. Ergonomics 2020, 63, 931–937.
49. Dulock, H.L. Research Design: Descriptive Research. J. Pediatr. Oncol. Nurs. 1993, 10, 154–157.
50. Jaiswal, A.; Karabiyik, T.; Thomas, P.; Magana, A.J. Characterizing Team Orientations and Academic Performance in Cooperative Project-Based Learning Environments. Educ. Sci. 2021, 11, 520.
51. Krajcik, J.S. Supporting science learning in context: Project-based learning. In Portable Technologies; Springer: Berlin/Heidelberg, Germany, 2001; pp. 7–28.
52. Magana, A.J.; Karabiyik, T.; Thomas, P.; Jaiswal, A.; Perera, V.; Dworkin, J. Teamwork facilitation and conflict resolution training in a HyFlex course during the COVID-19 pandemic. J. Eng. Educ. 2022, 111, 446–473.
53. Innoslate—Systems Engineering and Requirements Management Software. Available online: https://www.innoslate.com/ (accessed on 19 September 2022).
54. Psaromiligkos, Y.; Orfanidou, M.; Kytagias, C.; Zafiri, E. Mining log data for the analysis of learners’ behaviour in web-based learning management systems. Oper. Res. 2011, 11, 187–200.
55. Magana, A.J.; Jaiswal, A.; Madamanchi, A.; Parker, L.C.; Gundlach, E.; Ward, M.D. Characterizing the psychosocial effects of participating in a year-long residential research-oriented learning community. Curr. Psychol. 2021, 1–18.
56. Antonenko, P.D.; Toy, S.; Niederhauser, D.S. Using cluster analysis for data mining in educational technology research. Educ. Technol. Res. Dev. 2012, 60, 383–398.
57. McKight, P.E.; Najab, J. Kruskal–Wallis test. Corsini Encycl. Psychol. 2010, 1.
58. Clarke, V.; Braun, V.; Hayfield, N. Thematic analysis. In Qualitative Psychology: A Practical Guide to Research Methods; 2015; pp. 222–248.
59. Murawski, L.M. Critical Thinking in the Classroom… and Beyond. J. Learn. High. Educ. 2014, 10, 25–30.
60. Jaiswal, A.; Lyon, J.A.; Zhang, Y.; Magana, A.J. Supporting student reflective practices through modelling-based learning assignments. Eur. J. Eng. Educ. 2021, 1–20.
61. McKim, A.; McKendree, R. Metacognition, systems thinking, and problem-solving ability in school-based agriculture, food, and natural resources education. Adv. Agric. Dev. 2020, 1, 38–47.
62. Lawanto, O. Students’ metacognition during an engineering design project. Perform. Improv. Q. 2010, 23, 117–136.
Figure 1. Systems-thinking skills [14].
Figure 2. Static system example from expert reflectors’ assignment.
Figure 3. Example from a novice reflector’s assignment.
Table 1. Gender and academic level.

Gender: Male 36, Female 9.
Academic level: First year 0, Second year 2, Third year 3, Fourth year 40.
Table 2. Systems-thinking construct.

Thinking: Run a few simulations on those that you find interesting, and ultimately choose the model (only one) you are going to use for this assignment. Provide a brief description and one representative figure or image about the model you have chosen. If you are using a figure or image from somewhere else, please cite your sources accordingly.
Decision making: What are the (input) parameters for this model? List them and provide an explanation of their role in the model.
Action: (a) Identify a “static” and a “dynamic” question you would like to address for this system. The question must involve assessing the relationship between one input parameter and one output of the system. (b) Explain the question you are addressing through your static question and the key characteristics of your Behavior Space design to do so. Clearly state which input parameter and which output you are including (5–10 sentences).
Interpretation: Analyze the data produced by the Behavior Space. Include one plot (clearly label the X and Y axes and include a detailed figure caption). Include the conclusions obtained from your experiment.
Table 3. Rubrics for scoring the student assignment. Levels: absence of systems thinking (0), basic (1), intermediate (2), excellent (3).

Thinking: (0) Student lacks informed thinking. (1) Student demonstrates a basic level of thinking. (2) Student demonstrates an intermediate level of thinking but lacks expertise. (3) Student demonstrates an informed level of thinking.
Decision: (0) Student lacks decision-making abilities. (1) Student demonstrates basic decision-making abilities. (2) Student demonstrates an intermediate level of decision-making ability. (3) Student discusses expanded possibilities for different decisions.
Action: (0) Student does not take or propose any actions. (1) Student proposes basic actions based on their decision. (2) Student demonstrates an intermediate level of ability, proposing some actions based on their decision. (3) Student proposes actions based on the decisions.
Interpretation: (0) Student lacks the ability to interpret the findings or share perspectives. (1) Student demonstrates a basic ability to interpret the actions or explain the findings or share perspectives. (2) Student demonstrates an intermediate level of ability to interpret the actions or explain the findings or share perspectives. (3) Student explains and demonstrates an understanding of the actions taken and shows the ability to explain the findings or share perspectives.
Table 4. Interpretation of mean values.

High level (expert systems thinkers): 2.25 and above
Moderate level (competent systems thinkers): 1.51–2.24
Low level (novice systems thinkers): 0–1.5
Table 5. Overall mean and standard deviation for all students.

Thinking: M = 2.33, SD = 0.74
Decision making: M = 2.22, SD = 0.74
Action: M = 1.90, SD = 0.97
Interpretation: M = 1.96, SD = 1.01
Table 6. Cluster-wise mean and standard deviation.

Thinking: Cluster 1 (Expert) M = 2.72, SD = 0.46; Cluster 2 (Competent) M = 2.35, SD = 0.59; Cluster 3 (Novice) M = 1.29, SD = 0.76
Decision making: Cluster 1 M = 2.72, SD = 0.57; Cluster 2 M = 2.10, SD = 0.45; Cluster 3 M = 1.14, SD = 0.38
Action: Cluster 1 M = 2.83, SD = 0.38; Cluster 2 M = 1.51, SD = 0.62; Cluster 3 M = 0.64, SD = 0.51
Interpretation: Cluster 1 M = 2.95, SD = 0.16; Cluster 2 M = 1.53, SD = 0.67; Cluster 3 M = 0.64, SD = 0.51
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
