Development and Implementation of an Augmented Reality Thunderstorm Simulation for General Aviation Weather Theory Training

ABSTRACT
In 2021, there were 1,157 general aviation (GA) accidents, 210 of them fatal, making GA the deadliest civil aviation category. Research shows that these accidents are partially caused by ineffective weather theory training. Current classroom weather training relies on 2D materials that students often find difficult to map onto a real 3D environment. To address these issues, Augmented Reality (AR) was utilized to provide immersive 3D content while running on commodity devices. However, mobile devices have limitations in rendering, camera tracking, and screen size, which make mobile AR especially challenging for complex visualization of weather phenomena. This paper presents research on how to address the technical challenges of developing and implementing a complex thunderstorm visualization in a marker-based mobile AR application. The development of the system and a technological evaluation of the application's rendering and tracking performance across different devices are presented.


INTRODUCTION
General Aviation (GA) refers to all civil aviation operations that do not transport passengers or goods for commercial or ''for hire'' purposes [1]. GA had the most flight hours in civil aviation and is the deadliest segment in civil aviation (regulated by Title 14 Code of Federal Regulations Parts 121, 135, and 91) [2]. In 2021 alone, GA had more than 21.9 million flight hours, compared to 15.9 million flight hours for Part 121 air carriers (regularly scheduled air carriers such as U.S.-based airlines, regional air carriers, and cargo operators). For every 100,000 GA flight hours there were 5.26 accidents, 0.95 of them fatal, compared to the Part 121 air carriers' accident rate of 0.49 with no fatal accidents [2]. Among those accidents, pilot decision making was a decisive factor.
A pilot's decisions are critical in dangerous weather conditions like thunderstorms. During 1996-2014, 71% of thunderstorm-related accidents resulted in fatalities, significantly higher than non-thunderstorm-related accidents (23%) [4]. The high percentage of thunderstorm-related accidents highlights the need to improve pilots' decision-making skills when facing adverse weather conditions [5]. Efforts to help student pilots make correct decisions when facing severe weather conditions include required training programs on weather theory knowledge [6]. Weather theory knowledge provides pilots with a background in weather principles, how they affect flight safety, and how to make proper decisions [7]. Without sufficient knowledge of weather theory, pilots might not be able to make safe decisions when facing adverse weather conditions [8,9].
Student or novice pilots can learn weather theory from a Federal Aviation Administration (FAA) certificated pilot school, either in person or at home [6]. Different training programs focus on different aspects of weather, which leads to inconsistency in pilots' knowledge levels [10,11]. In addition to this inconsistency, the educational materials provided to student pilots are exclusively in 2D formats such as textbooks, images, and videos [12]. These 2D materials cannot effectively communicate all the complexities of weather phenomena and require students to create their own 3D mental models, leading to misunderstanding of important concepts [13,14]. Though flight simulators and other expensive visualization equipment exist to simulate weather phenomena, most of them use cockpit views and focus on practical training instead of aeronautical knowledge instruction. Traditional materials also lack hypothetical scenarios that may be encountered in flight and clear instruction on different topics in aviation weather [9]. To overcome these issues, three criteria were proposed for effective weather theory training materials: (1) allow students who do not have expensive equipment to easily visualize complex weather phenomena with hypothetical scenarios, (2) provide easily accessible equipment to teach 3D weather concepts, and (3) complement traditional classroom and 2D flight instruction materials. Due to the wide range of GA training programs, ensuring the availability of 3D visualization equipment to students poses a considerable challenge. As a result, it is important for students to access 3D training content through their existing devices.
Advanced flight simulators that visualize complex 3D weather phenomena are available but fail to meet the proposed criteria. Most high-fidelity training simulators are aircraft specific and dedicated to commercial and military pilots [15][16][17]. In the past 10-15 years, extended reality (XR) has been used in aviation training to further reduce training costs and broaden training system availability [18,19]. For the purposes of this work, XR will be used as an umbrella term for technologies that immerse users into different degrees of virtual environments (VE), including Virtual Reality (VR), AR, and Mixed Reality (MR) [20]. VR immerses users into a completely virtual environment [21], typically using a Head-Mounted Display (HMD) (also referred to as a Head Worn Display) or a CAVE™ system. For the purposes of this paper, the term HMD will be used when referring to a display worn on a user's head. AR presents 3D virtual content overlaid onto real-world environments [22]. AR allows a user to view virtual content and 2D material using either mobile devices, such as smartphones and tablets (see Figure 1), or AR HMDs such as the Microsoft HoloLens (see Figure 2) [23][24][25]. HMD-based VR and AR training have limitations such as requiring an external computer to power the device or a limited area of use due to tracking constraints [19]. A critical decision in this work was to determine a suitable platform for the 3D training content, and several potential platforms were considered. First, a VR system can be an HMD powered by a mobile device, by standalone computing in the headset, or by a tether to a more powerful external computer. Any VR HMD provides a fully immersive experience and can remove a user's situational awareness of their physical environment. So, a student in a VR HMD may not be able to use a physical textbook while in a VE, requiring the simulation to include all teaching materials (i.e., no textbook). Alternatively, using any kind of VR HMD with traditional teaching materials requires the user to repeatedly and awkwardly put on and take off the system. A standalone HMD, or one powered by an external computer, can provide sufficient rendering power, but this power comes at an additional cost of $300-$3,000 per user. A phone-based VR HMD reduces overall system cost but has significantly decreased rendering capabilities. AR, in contrast, can present virtual content using a device that students most likely already own (e.g., a smartphone) and meets the three criteria to improve weather theory training listed above. AR on a mobile device was a favorable choice for this work due to several advantages. First, AR does not require every screen pixel to show virtual content, so real-time rendering on a mobile device can be accomplished. Additionally, AR allows for seamless integration with traditional materials, making it easier to work with existing content. Furthermore, AR can deliver reliable tracking and spatial navigation by utilizing a simple printed image marker. Lastly, AR is widely used in many fields where training personnel on complex spatial relationships is necessary, such as learning 3D geometry [26]. After considering all these issues from the literature search, AR on a mobile device using a marker-based algorithm was chosen for this project. In this work, AR on a mobile device is referred to as mobile AR.
While marker-based mobile AR technology was chosen for this work, a mobile device does not have the same graphics capability as an HMD or a high-fidelity flight simulator. These hardware limitations have reduced many XR-based training materials to simple content such as videos, static 3D models, and 2D images. In addition, most implementations do not have complex interaction to further enhance training [27][28][29][30][31]. Mobile devices can only render a limited number of polygons due to their processing power and battery capacity [32], and due to the relatively small screen size, weather models might not fully fit on the screen alongside a corresponding user interface (UI). In addition, touch-screen interfaces limit interactions in 3D VEs: a user cannot input depth information because most touch screens only provide 2D position information. Finally, AR on mobile devices is further limited to one-handed use, as the other hand is usually occupied holding the device [33,34].

OBJECTIVE
To use AR in weather theory training, a new solution is needed to render complex 3D weather phenomena with limited processing power, provide an effective UI, and compensate for non-ideal tracking conditions. The research presented in this paper addresses the technical challenges of using mobile AR in weather theory training for GA student pilots. The goal is to achieve high-quality rendering and real-time, stable tracking of thunderstorm educational modules on a mobile device to ensure an immersive and effective user experience. This application serves as an easily accessible 3D training tool for initial student pilots, assisting their comprehension of abstract thunderstorm concepts from 2D materials.
To achieve this objective, a particle system was created to form a thunderstorm cell with a life-like cloud appearance. Specialized shaders, viewing perspectives, and an image target were developed to improve weather theory training, lower the rendering requirements for a mobile device, and maintain stable tracking of virtual content. The work also included a technological evaluation of the application's rendering performance and tracking accuracy across all compatible devices.
Although this paper focuses on the technical achievements of this work, it is worth noting that a series of user studies were also conducted to assess the usability and educational benefits of the system. While not the focus of this paper, they are briefly summarized here (for full details see Meister et al. [35,36]). Evaluations determined that students improved their factual and visual knowledge in a statistically significant manner, with high levels of motivation. A preliminary evaluation of the application was conducted with three subject matter experts in weather and aviation, and three students, to assess whether the AR thunderstorm visualization could communicate weather theory and whether the interfaces were usable for learning and task completion (for full details, see Meister et al. [35]). Students' knowledge increased after using the visualization to explore the dynamics of the thunderstorm. Subject matter experts felt the learning experiences appropriately communicated thunderstorm theory in ways that supported instruction. The AR interfaces were rated as usable for learning interactions and produced low levels of workload. In a follow-up study, 18 student pilots (17 male, 1 female), or pilots with fewer than 250 total flight hours, were asked to complete the AR weather-related learning activities on a smartphone (for full details, see Meister [36]). The learning outcomes were measured by pre- and post-tests concerning factual and visual knowledge and by participants' completion time. There was a statistically significant increase in student factual knowledge (from 71% to 91%), even though students already had a high level of incoming knowledge. Visual knowledge also increased in a statistically significant manner, from 55% to 90%. To evaluate user experience with the AR application, post-trial surveys were given to evaluate the application's usability, user motivation, and overall experience. Participants reported positive learning experiences with high motivation, reasonable task load and completion time, and excellent usability, as rated by the System Usability Scale. Together, these studies demonstrate that the AR thunderstorm model and associated learning scenarios resulted in an effective application that enhanced students' learning outcomes and taught aviation weather in a way that met the expectations of aviation and weather instructors.

BACKGROUND
A literature review elucidated current research efforts on weather theory training for GA pilots and mobile AR in training and education.

Weather Training for GA Pilots
Pilots must acquire weather theory knowledge and pass an FAA-approved assessment [37]. However, training methods that provide weather theory knowledge are inconsistent, leaving pilots with a level of education that is not sufficient for safe decision making in the cockpit [5,6,38].
In a study performed by the FAA on how training affects GA pilots' ability to make in-flight decisions, 57 GA pilots were put into a low-visibility visual flight rules scenario with an approaching thunderstorm traversing their flight path [38]. The pilots' behaviors were categorized as: (1) ''tactical'': did not maintain a safe distance (20 nautical miles) from the thunderstorm, and (2) ''strategic'': maintained a safe distance from the thunderstorm. Half of each group received additional training on how to safely avoid thunderstorms, and then all pilots were asked to go through the scenario again. The study showed that 66% of trained tactical pilots changed their unsafe behaviors in one training iteration to maintain a safe distance from the thunderstorm. The average distance trained tactical pilots maintained when avoiding the thunderstorm increased from 10.2 nautical miles (SD = 4.0) to 31.3 nautical miles (SD = 18.2). Pilots who did not receive training did not show a statistically significant difference in their behavior. This study indicates the crucial impact of weather theory knowledge on pilots' decision-making.
Assessments of weather training for GA pilots do not provide objective metrics that can be used to improve weather theory training methods. In one study, 95 weather knowledge questions were developed to evaluate GA pilots' weather knowledge and their interpretation of weather phenomena [5]. These questions were presented to 204 GA and student pilots. The mean score of less than 60% for all participants highlights the need for improved weather training for current GA pilots and student pilots.
Research has also found inconsistency in current aviation training programs. Guinn et al. [11] studied 22 Aviation Accreditation Board International (AABI) accredited professional flight baccalaureate degree programs. Results revealed that most of the programs focused on the interpretation of weather reports, and only 60% of them mentioned teaching flight hazards. Even when flight hazards were listed in course descriptions, it was unknown whether instruction included a theoretical understanding of weather hazards or simply focused on interpreting weather reports.

High Fidelity Flight Simulators versus Extended Reality Simulators
Aviation simulators have a long history of effective use in pilot training. At the beginning of the 20th century, ground simulators were developed to simulate movement, banking, and other flight motions [42]. Early tools also include computer-generated images presented on a spherical screen to create an immersive flight simulation [39]. This early virtual reality system had many limitations, such as large physical size and high operational costs. Hexapod platforms were later incorporated into flight simulators for civilian aircraft and have been in use for over 50 years to simulate flight motion [40,41]. In recent years, the Dynamic Flight Simulator (DFS) was developed to train military pilots fully with minimal to no supplemental training in a real aircraft [16]. The DFS simulated motion and gravity changes during conditions that GA pilots would rarely encounter.
High-fidelity flight simulators are expensive to operate and rarely accessible to GA pilots. Thus, XR technologies have been used over the last 10-15 years to provide lower-cost training platforms. In 2011, the Scalab Virtual Reality Simulator was developed using a VR HMD and data gloves for helicopter flight training [18]. Scalab reduced the cost of flight simulation but focused on helicopter flight operations in the cockpit, as opposed to weather theory knowledge, and was very complex for student pilots or instructors to download, compile, and run. In 2019, the U.S. Air Force created the Pilot Training Next (PTN) model to assess and engage in initial pilot training [14]. While assessing PTN, researchers encountered issues such as VR training devices not being widely available to participants.
Another limitation of VR is the difficulty an instructor has maintaining situational awareness. VR HMDs only support one person, so an instructor's ability to monitor a trainee's progress and provide useful feedback is severely limited. For example, the Virtual Instructor Pilot Exercise Referee (VIPER®) is another program to assist flight training [43]. To assess the effectiveness of this tool, a study was conducted on 52 student naval aviators (SNAs) training with VIPER®, 64 SNAs training with other VR conditions with guidance, 3,014 SNAs training with VR conditions without guidance, and 836 SNAs training with non-VR conditions. Results showed that students working in the guidance-free VR conditions had statistically significant improved grades compared to those using the non-VR condition. In addition, students using the guidance VR condition had statistically significant improved grades compared to their peers using the guidance-free VR conditions. Lastly, students using VIPER® had statistically significant improved grades compared to those using the guidance VR conditions.

Marker-based Mobile Augmented Reality
Marker-based mobile AR refers to using a mobile device as both the computational and visualization device to present computer-generated, non-physical information in a live, real-world environment, with a physical marker as a reference point in the physical world [44][45][46]. When using a marker-based mobile AR application, users point the device's camera at a pre-defined image target, and the virtual content is placed relative to the target's position in the virtual environment.
Studies in education have shown that relying on students' mental models to learn complex 3D phenomena is not as effective as using 3D visual models [47][48][49]. AR enables 3D objects and animations to be presented visually and can also aid students in developing deeper spatial understanding, motivation, and long-term memory retention of learned skills [50][51][52][53]. In addition, AR on a mobile device is more accessible, lower in cost, and portable. Although marker-based mobile AR applications have been successfully developed and implemented in various training areas with positive outcomes, many challenges still exist, such as rendering resources, tracking performance, and screen space.
Numerous studies have developed content for marker-based mobile AR with the aim of facilitating training and education. However, a common limitation observed in these studies is the insufficient complexity of the content. pARabola is a mobile AR system that utilizes markers to facilitate learning quadratic equations [29]. pARabola enables users to input various quadratic functions, with the particle system updating in real time to reflect the changes made. In its evaluation, some participants commented on the size of text information and buttons, highlighting the significance of interface design in AR applications. In addition, AR projects such as MagicBook, created by Kucuk, Kapakin, and Gotas [30], as well as one for teaching molecular structures [49], improved students' achievement and reduced cognitive load, but only used 3D video animation, static 3D models, images, and sounds.
The literature review highlighted several technical challenges related to marker-based mobile AR in training and education. One major challenge is the limited complexity of content due to rendering power constraints. Rendering power on a mobile device is limited by its battery capacity. A high-end desktop Graphics Processing Unit (GPU) consumes 500 watts of power, whereas the total battery capacity of a high-end mobile device such as an iPad Pro is only 28.65 watt-hours [54,55]. As a result, GPUs on mobile devices consume less than 10 watts of power, resulting in lower rendering capacity compared to desktop GPUs [54]. For mobile AR specifically, since tracking algorithms run constantly and consume significant power, even less is available for rendering [56]. Various approaches have been taken to reduce mobile applications' rendering overhead, such as remote rendering, image rendering, pre-rendering, and optimization of graphics code [57][58][59]. FlashBack is an example of software that uses pre-rendering to reduce overhead in mobile VR [59]. However, FlashBack was implemented on an HP Pavilion Mini, which is not suitable for phone-based AR applications [60].
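A back-of-envelope calculation makes the scale of this power gap concrete. The sketch below uses only the figures cited above (500 W desktop GPU, 28.65 Wh iPad Pro battery, 10 W mobile GPU ceiling) and asks how long the battery would last if it fed the GPU alone, ignoring all other components:

```python
# Rough power-budget illustration: why mobile GPUs must draw far less than
# desktop GPUs. Values are the figures cited in the text; runtimes are the
# idealized battery lifetime if the GPU were the only consumer.

battery_wh = 28.65        # iPad Pro battery capacity (watt-hours)
desktop_gpu_w = 500.0     # high-end desktop GPU draw (watts)
mobile_gpu_w = 10.0       # upper bound for a mobile GPU (watts)

hours_at_desktop_draw = battery_wh / desktop_gpu_w  # under 4 minutes
hours_at_mobile_draw = battery_wh / mobile_gpu_w    # close to 3 hours

print(f"desktop draw: {hours_at_desktop_draw * 60:.1f} min of battery")
print(f"mobile draw:  {hours_at_mobile_draw:.2f} h of battery")
```

A desktop-class draw would exhaust the battery in minutes, which is why mobile rendering budgets, and therefore content complexity, are so constrained.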
Additionally, tracking issues pose a significant challenge in marker-based mobile AR. Tracking is often done using feature points of a target image or model in the camera's field of view to match pre-defined schemes and connect 2D locations in the video captured by the camera with 3D locations in virtual space [61]. Stable tracking requires the camera to pick up a sufficient number of feature points on an image target. Failure to do so results in virtual content shaking or not being rendered in its intended location. In low-light conditions, the Charge-Coupled Device (CCD) on a mobile device can introduce noise into the representation of an image target, further complicating tracking [56]. To improve tracking accuracy and robustness, an image target should contain a high number of feature points and significant color contrast.
The UI of a mobile application is another critical element for an effective AR experience. Since users observe and interact with virtual content through a device's screen, efficiently utilizing screen space is critical in mobile AR UI design. For example, in a research study on user experience with a mobile AR application, the UI was mentioned by 25 out of 90 respondents [62].
Overall, the literature review revealed inconsistent, and at times insufficient, weather theory training for pilots. The review demonstrated that marker-based mobile AR offers capabilities that can not only improve learning outcomes but also make training more accessible to pilots of all abilities. However, the literature also showed that current marker-based mobile AR projects were limited to simple geometry with little or no interaction due to technical challenges in rendering power and tracking performance.

METHODOLOGY
A thunderstorm model with learning activities and scenarios was developed using Unity and implemented in a mobile AR application for both iOS and Android devices [63]. Experienced flight instructors provided expertise throughout the development process, including on the thunderstorm characteristics and corresponding scenario activities [35]. This feedback was crucial to creating educational materials that would be effective for GA student pilots. The following sections discuss the development and implementation process of creating the thunderstorm training content in a marker-based mobile AR application and how the technical challenges identified earlier were overcome.

Model Creation
The model simulated a single 60-minute thunderstorm cell cycle moving through the defined stages of development, maturation, and dissipation.The model contained different hazards and weather information, including temperature, icing, wind, and precipitation.

VOLUMETRIC APPEARANCE OF THUNDERSTORM CLOUDS

Thunderstorms change dynamically, with clouds forming and dissipating throughout their lifecycle. To simulate such activity in a mobile application, where computational resources are limited, a particle system was used. Unity offers a particle system Application Programming Interface (API) that allows small images to be emitted to simulate fuzzy visual effects [60]. Particle systems are a common way to create ''volumetric'' visual effects with reduced rendering overhead [32,60]. Volumetric refers to visual effects that have movement on the surface of and within an object, such as fire, smoke, and clouds. A real thunderstorm transforms through its lifecycle stages (i.e., developing to dissipating) with a classic anvil shape (i.e., nonuniform in the direction of environmental winds). Controlling each particle within the model to achieve such a visual effect increases processing time and complexity and is not ideal for mobile AR. To address this issue, the thunderstorm was formed in segments, with each segment containing one particle system to form a part of the overall anvil cloud shape through its lifecycle (see Figure 3). Particles are emitted at the beginning of the simulation with random orientations and sizes. The particles' properties (e.g., transparency and color) are controlled by a custom-developed graphics shader. A graphics shader is computer code that outputs the correct levels of opacity, shading, and color, and even geometry movement (i.e., vertex procedures), during the rendering of a 3D scene. The developed shader takes the height of different locations on the cloud, the presented weather information, and the current simulation time. Based on these factors, the shader assigns a specific color and transparency to each location on the cloud, enabling a gradual distribution. By combining this shader with the segment approach, the thunderstorm can achieve a nonuniform shape while still maintaining overall color and transparency changes during the cycle. However, no new particles were created, nor were existing particles in the cell modified, at run-time, keeping rendering resources low. The shader code and the segment approach significantly reduced the time to create unique cloud formations (e.g., by duplicating cloud segments or the entire cloud cell) and ran in real-time on a mobile device.
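The height- and time-driven color/transparency logic described above can be sketched as follows. This is an illustrative Python restatement, not the project's actual shader (which runs as GPU code inside Unity); the stage boundaries, the grey ramp, and the function names are assumptions made for the sketch:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b, with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

def cloud_color_alpha(height01, sim_time_min, cycle_min=60.0):
    """Illustrative per-location shading for the thunderstorm cell.

    height01: normalized height within the cloud (0 = base, 1 = anvil top).
    sim_time_min: current simulation time within the 60-minute cell cycle.
    Returns an (r, g, b) tuple and an alpha value.
    """
    t = sim_time_min / cycle_min
    # Fade the cell in during development and out during dissipation;
    # the 25%/75% stage boundaries are assumed, not the paper's values.
    if t < 0.25:                                  # developing stage
        alpha = lerp(0.0, 1.0, t / 0.25)
    elif t < 0.75:                                # mature stage
        alpha = 1.0
    else:                                         # dissipating stage
        alpha = lerp(1.0, 0.0, (t - 0.75) / 0.25)
    # Darker grey near the cloud base, brighter toward the anvil top.
    grey = lerp(0.45, 0.95, height01)
    return (grey, grey, grey), alpha
```

Because the mapping depends only on height and time, the same shader can drive every particle in every segment without any per-particle updates at run-time, which matches the low-overhead approach described above.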

INDICATION OF DYNAMIC WEATHER INFORMATION
Based on the Pilot's Handbook of Aeronautical Knowledge [12], wind, icing, and temperature conditions in a thunderstorm play a critical role in flight safety and are crucial for pilots to understand. Representing these various components of information required easy-to-understand visual cues that were distinguishable when viewed alone or all together.
Color is a powerful tool to display numeric information and was used in the model. The custom graphics shader developed allows gradual color changes throughout the cloud. This shader was used to present cloud density (see Figure 4), icing conditions (see Figure 5), and temperature information (see Figure 6) throughout the cloud. Because the colors for cloud density, icing, and temperature would overlap with each other, icing has labels on the cloud as secondary markings, and each piece of information can be viewed separately. It is important to note that the choice of colors was arbitrary and has no special meaning in aviation education. Vibrant colors with high contrast to one another were used, with the approval of expert flight instructors, to allow easy distinction of the concepts they wanted conveyed to students.
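The kind of scalar-to-color ramp such a shader applies can be sketched as below. Since the paper notes the actual colors were arbitrary, the blue-to-white-to-red gradient and the temperature bounds here are purely illustrative assumptions:

```python
def temp_to_rgb(temp_c, t_min=-60.0, t_max=20.0):
    """Map a temperature (deg C) onto an assumed blue->white->red gradient.

    t_min/t_max are illustrative bounds for the range of temperatures
    found within a thunderstorm cell, not values from the application.
    """
    t = (temp_c - t_min) / (t_max - t_min)
    t = max(0.0, min(1.0, t))      # clamp out-of-range temperatures
    if t < 0.5:                    # cold half: blue -> white
        u = t / 0.5
        return (u, u, 1.0)
    u = (t - 0.5) / 0.5            # warm half: white -> red
    return (1.0, 1.0 - u, 1.0 - u)
```

Swapping the breakpoints or endpoint colors yields the other overlays (density, icing) from the same gradient machinery, which is why one shader could serve all three visualizations.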
However, not all weather information could be presented with color. Unlike the gradually changing data inside the thunderstorm cloud, some corresponding weather phenomena required a more realistic appearance, and particle systems were again used to produce them. For precipitation, each particle was represented as a raindrop or hail particle, with a gravity force attached to make it descend from the cloud. Rain and hail were differentiated with unique textures and densities (see Figure 7). In addition, a microburst, a characteristic of a thunderstorm critical to pilot training, had to be developed. A microburst contains localized downdraft air, a dust ring, rolling clouds, and an area of intense rain called a ''virga'' [64]. To simulate microbursts, particles were set to a continuous cycle of movement and dissipation to represent the accompanying rolling clouds, virga, and dust ring (see Figure 8). A variety of materials were implemented to attain a realistic appearance for these phenomena. Since each particle system has its own animation, it was controlled through scripting to ensure synchronization with the overall thunderstorm simulation, which was under the user's control. Finally, variation in wind direction and speed is one of the most crucial components for pilots to understand when making real-time flight decisions. Unlike other information in and around a thunderstorm, which may be contained to a specific region, wind constantly changes in and around the entire cloud. For example, when a microburst occurs, the wind moves toward the ground and then circles back to the thunderstorm after hitting the ground (see Figure 8). Additionally, since wind movement cannot be represented with a lifelike appearance, a simplified 3D arrow geometry was utilized. The model presented in this paper used Unity's physics engine to compute the movement of these 3D arrows according to a predefined wind pattern advised by expert flight instructors [65]. However, a limitation of Unity's physics engine was the absence of a timeline-like animation playback feature. To address this limitation, a workaround was implemented whereby the arrow movement remained constant throughout the simulation, and the activation of the arrows was adjusted to indicate the progression of the wind. With this approach, the wind movement driven by the Unity physics engine could be integrated and controlled within the overall thunderstorm simulation.
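The activation workaround can be sketched as follows. This is illustrative Python rather than the project's Unity C#, and the class and field names are assumptions: each arrow loops through a fixed, physics-driven motion, while a simple scheduler toggles which arrows are visible at the current simulation time.

```python
class WindArrow:
    """A wind indicator whose motion is fixed; only its visibility changes."""

    def __init__(self, start_min, end_min):
        self.start_min = start_min   # simulation minute the arrow appears
        self.end_min = end_min       # simulation minute the arrow disappears
        self.active = False          # whether the arrow is currently shown

def update_arrows(arrows, sim_time_min):
    """Activate only the arrows whose window covers the current sim time."""
    for arrow in arrows:
        arrow.active = arrow.start_min <= sim_time_min < arrow.end_min

# Example: arrows for the developing, mature, and dissipating wind patterns
# (the time windows here are illustrative, not the application's values).
arrows = [WindArrow(0, 20), WindArrow(20, 45), WindArrow(40, 60)]
update_arrows(arrows, 30)
```

Because no arrow's trajectory ever changes, the physics engine can run continuously, and scrubbing the simulation timeline only re-evaluates the cheap activation check.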

Implementation of Different Viewing Perspectives
A mature thunderstorm cell can be up to 16 km wide, whereas some activities inside it, such as a microburst, happen in a relatively small area, usually less than 4 km in diameter [64,66]. To allow users to view the thunderstorm and all accompanying information, two viewing modes were created with different distances from the virtual camera. The first viewing mode is cloud-centric, which allows users to view the model statically as the ground moves underneath (see Figure 9). This effectively has the user's viewpoint moving at the same speed as the thunderstorm, so the cloud can be set to a larger scale. The second mode is terrain-centric, where the user's viewpoint is fixed and the cloud can be viewed moving across the terrain (see Figure 10). The cloud is further from the virtual camera, effectively giving the user an all-encompassing view to observe the movement of the cloud while still being able to see additional elements that may interact with the cloud model.
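The two modes amount to a change of reference frame: the relative motion between cloud and terrain is identical, and only the object that translates on screen differs. A one-dimensional sketch (illustrative names and values, not the application's code):

```python
def positions(mode, storm_speed, t, cloud0=0.0, terrain0=0.0):
    """Return (cloud_pos, terrain_pos) at time t for a viewing mode.

    mode: "cloud-centric" (viewpoint travels with the storm, terrain
    scrolls underneath) or "terrain-centric" (viewpoint fixed, cloud
    moves across the terrain). Positions are 1-D for illustration.
    """
    if mode == "cloud-centric":
        # Cloud stays put on screen; terrain slides backward beneath it.
        return cloud0, terrain0 - storm_speed * t
    if mode == "terrain-centric":
        # Terrain stays put; cloud translates across it.
        return cloud0 + storm_speed * t, terrain0
    raise ValueError(f"unknown viewing mode: {mode}")
```

In both modes the cloud-terrain separation grows at the same rate, so switching modes changes only which object the virtual camera follows and, therefore, how large the cloud can appear on screen.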

Implementation of Learning and Scenario-based Activities
2D training materials do not effectively show hypothetical scenarios that pilots might encounter in real-life operations or visualizations of accident scenarios [9]. Title 14 Code of Federal Regulations 61.105(b) also emphasizes that pilots need the ability to recognize critical weather situations and make appropriate decisions [6]. However, the developed model could only contain a limited amount of 3D content to keep computational resources optimal for a mobile AR application. To address this issue, four activities were developed using the thunderstorm model in different ways, with continual input from experienced flight instructors. The four activities include two learning activities: (1) takeoff under a microburst and (2) thunderstorm avoidance; and two scenario-based activities: (1) a takeoff scenario and (2) an approach scenario. Each activity is supplemented with questions, instructions, or different flight paths to show how the thunderstorm model and accompanying weather phenomena affect pilot decision-making.
The thunderstorm model contains a detailed simulation of a microburst during its three stages: (1) formation, (2) impact, and (3) dissipation. Attempting to take off under a microburst is very dangerous and can easily cause an aircraft to lose control at low altitude and crash. An activity was specifically designed to emphasize a microburst's deadly impact on an aircraft during takeoff. This activity contains an intended and an actual flight path to demonstrate the dramatic effect a microburst can have during takeoff (Figure 11). To maintain low rendering overhead, simple 3D geometries were implemented to present the situation, supplemented with textual descriptions as well as airspeed to emphasize the danger a microburst poses to an aircraft. The flight paths were designed based on expert flight instructors' input.
Three learning and scenario-based activities were developed with the thunderstorm model to emphasize the FAA's regulation on staying 20 nautical miles away from a thunderstorm [3]. The takeoff scenario (see Figure 12) and approach scenario (see Figure 13) simulated a real-world accident in which an aircraft tried to approach, or take off from, an airport while a thunderstorm was nearby. The movement of the thunderstorm was based on the accident reports and flight instructors' expertise. Upon reaching 20 nautical miles from the thunderstorm, the scenario animation pauses and asks students to decide whether to continue or divert to an alternate airport. Different consequences are shown based on the decisions students make. In the takeoff scenario, the thunderstorm was formed by multiple cells with lightning and precipitation. Developing each thunderstorm cell individually would increase the computational resources needed, so a custom shader was developed to scale the cloud to different sizes without changing the particle systems or segments.
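The scenario's decision point amounts to a simple distance trigger on the 20-nautical-mile ring. The following is a minimal, hypothetical sketch of that logic; the function and variable names are assumptions, not the application's actual code.

```python
# Illustrative sketch of the 20-nautical-mile decision trigger used in the
# takeoff/approach scenarios. Names and structure are assumptions.
import math

AVOIDANCE_DISTANCE_NM = 20.0  # stay 20 nm from a thunderstorm

def distance_nm(a, b):
    """Straight-line distance between two (x, y) points given in nautical miles."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def should_pause_for_decision(aircraft_pos, storm_pos):
    """Pause the scenario animation once the aircraft reaches the 20 nm ring,
    so the student can choose to continue or divert."""
    return distance_nm(aircraft_pos, storm_pos) <= AVOIDANCE_DISTANCE_NM
```

In the application, reaching this trigger would suspend the animation and present the continue/divert prompt, with different consequences animated for each choice.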

Implementation of Marker-based Mobile AR
To integrate the thunderstorm model into an easy-to-use mobile AR application, a user interface and a high-quality image target were needed for a complete experience.
Interfacing with an application in mobile AR presents unique challenges, from viewing content to interaction. The interface for the thunderstorm model contains only essential interactions and information, such as a play/pause button, a play bar, a drop-down for playback speed, and a label for the stage of the cloud (see Figure 14). A menu toggle button hides all sub-buttons that toggle different features on/off, saving viewing space when they are not needed. All informative labels were also set to appear only as needed, and the text was kept as concise as possible. Additionally, a grid was implemented around the terrain to provide users with an accurate way to gauge the actual sizes of different thunderstorm characteristics (see Figure 15). Another technical challenge identified was tracking performance [33,34]. Tracking quality is heavily reliant on the image target, the device's camera position, and its ability to capture a clear image. In addition to the different viewing perspectives, which ensure a user will not move away from the image target while viewing the AR content, an image target with a sufficient number of feature points, high contrast, and limited repetitive patterns was designed to maximize tracking stability. As previously mentioned, the selection of mobile AR was driven by the goal of providing a cost-effective solution for visualizing a 3D thunderstorm simulation. As a result, an image marker was chosen as a suitable method to enable tracking for the AR application. Though newer HMDs offer advanced tracking capabilities, their cost constrains their usage.
The application is compatible with both iOS and Android across smartphones and tablets. To preserve rendering power when running on different mobile devices, the target frame rate of the application was set to the device's screen refresh rate: a device's screen refresh rate is the highest frame rate any application running on it can achieve, and a higher frame rate requires more rendering power.
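In most game engines this cap is a one-line setting; the sketch below only illustrates the clamping logic in Python. The function name is hypothetical and not part of the application.

```python
# Sketch of capping the application's target frame rate at the device's
# screen refresh rate. Rendering above the refresh rate cannot be displayed
# and only wastes battery, so the requested rate is clamped to it.
def choose_target_fps(requested_fps, screen_refresh_hz):
    """Return the effective target frame rate for the device."""
    return min(requested_fps, screen_refresh_hz)
```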

EVALUATION

Technical Evaluation
The presented thunderstorm model provides a realistic and immersive visualization, and its integration into a mobile AR application allows it to work alongside traditional materials and be easily accessed by students. The mobile AR application was designed to run on both smartphones and tablets, so the performance of the application across these device classes was assessed. The application was run on three testing devices: (1) a OnePlus 8 Pro with Android 13 [67] (referred to as Android in the following section), (2) an iPhone 13 with iOS 16 [68] (referred to as iPhone in the following section), and (3) a 2nd-generation iPad Pro with iOS 16 [69] (referred to as iPad Pro in the following section). The Android device's screen refresh rate can be set to either 120 Hz or 60 Hz, the iPhone's only to 60 Hz, and the iPad's to either 120 Hz or 60 Hz. To control variables, the screen refresh rate for all devices, and the target frame rate of the mobile AR application, was set to 60 Hz.
Rendering performance was measured by the application's frame rate. A 60-minute thunderstorm cell cycle was simulated at 180x speed in terrain-centric mode with all features turned on (i.e., precipitation, icing, temperature, wind arrows, labels, and grid) (see Figure 16). The device was held 4 ft from the image target. During the animation, the frame rate in frames per second (FPS) was recorded every 10 seconds, with ten trials conducted for each device. Table I shows the average frame rate for each trial and device. The Android device came close to the 60 FPS target with an average frame rate of 58.83 FPS, while the iPhone and the iPad Pro both met the target with an average frame rate of 60.00 FPS. These results confirm that the application provided a sufficient frame rate across all devices and operating systems when viewing the thunderstorm moving across the terrain in front of the image target.
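The per-trial and per-device averages reported in Table I follow from this sampling scheme: one FPS reading every 10 seconds, averaged per trial, then averaged across the ten trials. A minimal sketch of that bookkeeping (function names are assumptions for illustration):

```python
# Minimal sketch of the FPS bookkeeping behind Table I: sample the frame
# rate every 10 s during the animation, average per trial, then average
# the ten trial means per device. Names are illustrative only.
def trial_average(fps_samples):
    """Mean of the FPS samples recorded during one trial."""
    return sum(fps_samples) / len(fps_samples)

def device_average(trial_averages):
    """Mean across the ten per-trial averages for one device."""
    return sum(trial_averages) / len(trial_averages)
```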
While viewing the thunderstorm animation, a user will typically move around the image target to view the model from various angles. In this case, FPS may not be the main factor affecting user experience, as tracking quality plays a more important role: the AR model needs to maintain its relative position in the field of view while the mobile device is moved around. To assess the application's tracking quality, a second evaluation was performed. The same devices were held and moved around the image target through four observation points, each 4 ft away (see Figure 17). Lighting was controlled by evaluating at the same time of day in a controlled lab setting with limited windows and consistent ceiling-mounted artificial LED lighting. The model's animation was stopped at the mature stage and viewed in cloud-centric mode (see Figure 18). Upon entering each observation point, tracking quality was recorded based on the model's behavior using the following measures: (1) Accurate Tracking, where the model was statically attached to the image target; (2) Imperfect Tracking, where the model was still attached to the image target but was not perfectly static or aligned; and (3) Loose Tracking, where the model was not attached to the image target and the user needed to point the camera close to the image target to re-calibrate. The assessment of tracking accuracy is qualitative, as it is obvious when inaccurate tracking has occurred. Since the purpose of this system is education, not a physically accurate weather model, a qualitative assessment was deemed appropriate for tracking. As with the first evaluation, ten trials were conducted for each device. The model achieved Accurate Tracking all 40 times for each of the Android, iPhone, and iPad Pro devices (120 times total). The full results are shown in Table II. It was also important to assess rendering performance as a user moved around, so FPS was again recorded during the tracking trials. The average frame rate for each device at each observation point is shown in Table III. Looking at rendering performance during the tracking experiments, a noticeable drop in FPS was observed when the camera entered the thunderstorm model (i.e., point 3 of the testing locations). An in-depth look at this viewpoint revealed that limitations of occlusion culling were the cause. Occlusion culling is a computer graphics technique that skips rendering an object, or vertices thereof, that is not seen by the camera or is completely occluded by other objects [70]. At point 3, because of the offset between the thunderstorm cloud and the image target, the user's viewpoint is inside the thunderstorm. All the particles making up the thunderstorm cell were within the virtual camera's view defining the user's viewpoint and were rendered, even though they might be behind other particles. This caused high rendering overhead and taxed the tested mobile devices. The app still ran at a sufficient FPS to allow a user to see the thunderstorm from inside without exceeding the devices' limitations, but a noticeable visual difference was observed. Future work on this project will refine the rendering algorithm to address this issue.
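The FPS drop at point 3 can be illustrated with a toy visibility check: once the camera sits inside the particle cloud, nearly every particle passes the view-volume test and none is trivially occluded, so almost all must be drawn. The sketch below is a deliberately simplified stand-in for a real frustum and occlusion test; the function name and spherical view volume are assumptions for illustration only.

```python
# Toy illustration of why occlusion culling breaks down at point 3: with
# the camera inside the cloud, almost every particle lies inside the view
# volume and none is fully hidden, so nearly all must be rendered.
# A spherical view volume stands in for a real camera frustum.
def visible_particle_count(particles, camera_pos, view_radius):
    """Count particles within a spherical view volume around the camera
    (no particle-on-particle occlusion is considered)."""
    cx, cy, cz = camera_pos
    return sum(
        1 for (x, y, z) in particles
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= view_radius ** 2
    )
```

From outside the cloud, most particles fall outside the view volume or behind nearer ones; from inside, the count approaches the full particle total, which is the overhead observed at point 3.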

DISCUSSION
To help student pilots map 2D training materials into a 3D context, a thunderstorm simulation was developed and integrated into a mobile AR application. This allows students to experience 3D learning content on widely available devices.
In the literature review, various technical challenges were identified in relevant research, including limitations in rendering power, tracking performance, and interaction difficulties. The development and implementation of the thunderstorm simulation overcame these challenges by using a custom particle system and multiple purpose-built graphics shaders to reduce rendering overhead. Additionally, different viewing perspectives, a well-designed image target, and a simple yet useful interface were developed to ensure stable tracking and improve user interaction. The results of the technical evaluations showed that the application delivered stable and sufficient rendering and tracking performance across different mobile devices and operating systems. Additionally, prior studies also indicated that the application has the potential to enhance students' learning outcomes with highly usable and relevant content [35,36].
The evaluation results showed that a low-cost device can provide a complex visual simulation for education despite having lower processing and power capabilities than other simulation platforms. Table IV presents a detailed comparison between different simulation training platforms for GA. A mobile device is relatively cheap and offers a similar level of resolution to other platforms. On the other hand, the processing hardware on a mobile device is not as powerful. Compared to a standalone HMD, a mobile device's processor is not dedicated to XR. Compared to desktop HMDs and high-fidelity flight simulators, a mobile device has a less powerful processor because of its limited power supply. A direct comparison of power consumption between a mobile device and a wall-powered device (e.g., a desktop HMD or a high-fidelity flight simulator) is challenging because the sources of power differ: a mobile device relies on its own battery and is characterized by battery capacity, whereas a wall-powered device draws from an outlet and is characterized by power consumed per hour. However, the battery capacity of a mobile device is significantly smaller than an hour of power consumed by a desktop HMD or high-fidelity simulator. As a result, a mobile device has lower power consumption and processing capability than a desktop HMD or high-fidelity flight simulator. To highlight the graphical differences between the thunderstorm implementation and other existing mobile AR training or educational software, a comparison was made between FenAR [31] (left in Figure 19), which utilizes 3D models and animations in mobile AR for teaching physics, and the thunderstorm simulation developed in this research (right in Figure 19). The comparison shows that the thunderstorm content exhibits better visual quality and more complex visual effects. Combined with the evaluation results, the thunderstorm simulation provided a stable 60 FPS with stable AR tracking on low-cost, less powerful devices.

CONCLUSION
According to reports, over 29% of GA accidents are weather-related. One potential cause of those accidents is a lack of fundamental weather theory knowledge among pilots [78]. Traditional GA training on weather is delivered mainly through images, text, or video, which do not provide an immersive and compelling training environment. Mobile AR is an easily accessible visualization tool that can work in conjunction with traditional materials, making it a strong candidate to assist GA weather theory training. However, mobile AR is limited by its rendering and processing power, unstable tracking, and small touch screens. The mobile AR application described in this paper provides a detailed solution that offers an immersive environment at a lower cost than high-fidelity equipment and with easier access than a VR or AR environment requiring an HMD. The model described in this paper represents thunderstorms using a volumetric and resource-efficient approach and provides various ways of delivering necessary weather information that overcome the limitations of mobile AR technology. A technical evaluation was performed on the thunderstorm mobile AR application; the results showed excellent rendering performance and tracking quality across different devices. User evaluations conducted on the thunderstorm model also reiterate that this application can enhance students' learning outcomes.

Future Work
The evaluation found an FPS drop when moving into the thunderstorm cloud, and future work is planned to minimize this frame drop. To increase the amount of AR content and expand content type, size, and space, other tracking references such as area targets could be implemented to use the surrounding environment as registration points [79]. This could enable collaborative functionality between teacher and student to further assist pilots' learning.

FUNDING
The work described in this paper was funded by the PEGASAS Center of the Federal Aviation Administration Air Transportation Center of Excellence for General Aviation Research, Cooperative Agreement 12-C-GA-ISU.

DISCLAIMER
Statements and opinions expressed in this text do not necessarily reflect the position or the policy of the United States Government, and no official endorsement should be inferred.

Figure 3. Cloud model developed by particle system.

Figure 4. Thunderstorm cloud with different densities at different stages.

Frame rate result of render performance evaluation in frames per second (FPS).

Result of tracking performance evaluation.

Result of render performance during tracking performance evaluation (FPS).

Comparison between different simulation platforms.