Design and Assessment of a Virtual Underwater Multisensory Effects Reproducing Simulation System

With current developments in virtual reality technology and advances in the marine industry, various virtual underwater simulation systems are being developed that allow users to indirectly experience the marine environment through a variety of virtual contents. In general, underwater simulation systems based on virtual reality are developed using three-dimensional graphics technology alone. They are therefore limited in their ability to give users a sense of immersion and presence. In addition, the range of senses through which users can experience various underwater environmental phenomena within such simulations remains insufficient. This paper proposes an immersive multisensory effect reproduction system that provides users with an improved sense of underwater reality. The results of comparative user evaluation tests regarding the sense of reality conveyed by our proposed system indicate that its sensory presence effect is superior to that of conventional systems.


Introduction
Virtual reality simulation technologies enable users to indirectly experience sites and actions that are difficult or impossible to experience directly in real life [1,2]. Recent developments in the marine industry and increases in the population engaging in marine activities have led to increased attention being paid to the development of underwater simulations, both to forecast underwater events and to prevent underwater accidents through mock safety exercises. In addition, various simulation studies are being conducted to enable users to experience underwater environments without actually entering the ocean.
The primary objective of existing research in this field is to visually express data obtained from simulations of underwater conditions using virtual reality technology, and to convey visual presence using 3D graphics technology and various display methods. The proliferation of high-performance hardware and the development of realistic computer graphics, immersive display equipment, and artificial intelligence (AI) technology have produced impressive research results. However, research on expanding the range of senses through which users can experience various underwater environmental phenomena within such simulations is still insufficient.
To improve virtual reality systems, enhanced methods are needed in aspects such as sensory immersion, sensory fidelity, and multimodal interaction [3,4]. In particular, to improve sensory immersion, data should be realistically simulated and then delivered to the senses throughout the virtual environment. Further, users should be able to immerse themselves completely and experience the environment as if it were real through audiovisual and other sensory mechanisms in mock test environments [5-7]. This paper proposes a multisensory effect reproduction system that provides users with the various sensory effects of actual underwater environments, to facilitate more realistic virtual underwater experiences. We verify the efficacy of the proposed system by evaluating its presence and usability in a test with participants.

Decision Factors for Presence in Virtual Reality Systems.
The basic elements required for virtual reality, as classified by Zeltzer [3], are presence (or immersion), which signifies the degree to which users are immersed in the virtual environment and feel as if it were reality; interaction, in which exchanges between the virtual environment and users are handled in real time; and autonomy, in which objects in the virtual environment act autonomously. In contrast to conventional computer systems, virtual reality should provide a presence that makes users feel as if they were actually inside the 3D virtual environment. To increase user immersion, output devices such as head-mounted displays (HMDs) and input devices such as data gloves are used. One major characteristic of virtual reality is that users should be able to manipulate the virtual environment and obtain relevant feedback; another is that objects within the virtual environment should move properly and react autonomously [4,8].
Kim [4] suggested that sensory fidelity, interaction, and psychological elements determine presence in virtual reality systems. The results of a study by Fontaine [6] indicated that "the approach widening the range of sensory elements creates bigger presence," which suggests a close relation between presence and the expansion of the senses. As a primary contributing factor in providing a realistic virtual experience, presence in virtual reality systems can be reinforced by various influences. To improve presence in terms of sensory fidelity, therefore, various senses should be provided even when the user is cut off from the real world. Figure 1 shows the relationship among the main elements influencing presence in virtual reality.

Virtual Reality-Based Underwater Simulation.
The purpose of virtual underwater simulations is to enable people to experience actual underwater conditions in a mock environment and to understand their characteristics through the simulation of real underwater conditions. This facilitates easier understanding of the main factors of underwater environments and the environmental structure of the ocean [9].
Many studies aimed at enabling users to experience environments that are difficult to experience directly in real life, such as underwater environments, space, and time, are being conducted [10,11].
Studies that realistically simulate autonomous natural objects in virtual environments, focusing on autonomy among the elements of virtual reality and presenting them visually to users, are being actively undertaken. Among these, "Artificial Fishes" [12,13] aims to increase the presence of virtual underwater simulations by realistically expressing objects' movements and behaviors in virtual underwater environments using AI. The "Virtual Oceanarium" [9] and "VirtualDive" [14] projects involve autonomous objects in virtual underwater environments that can react to users' actions; this allows users to observe situations similar to real underwater ecologies and explore them by making contact with virtual fish via simple actions or special interaction devices. The "Virtual Aquarium" [11] project provides sensory effects that stimulate users' sense of touch using data gloves and increases the visual realism of the underwater environment itself through photorealistic 3D texturing technology. Although these underwater simulation studies aim to increase the realism of the simulation by increasing the autonomy of the virtual underwater environment and providing simple sensory feedback, in terms of sensory perception they are still limited to audiovisual effects. It is therefore necessary to complement sensory fidelity by expanding the range of sensory effects.

Multisensory Effect Elements in Underwater Environments
The objective of virtual underwater simulations is to let users indirectly experience, in simulated virtual conditions or in underwater environments similar to real ones, an underwater world that is difficult to experience in the real world. Our objective is to provide users with an opportunity to experience, through various senses, the realistic physical phenomena that occur in the interaction between users and underwater environments. To this end, we first analyzed real underwater environments and then determined the sensory elements that users can experience in them. The main elements that we determined can be sensed by users in real underwater environments are listed in Table 1.
In order to provide users with presence or immersion in virtual underwater simulation systems, a connection between virtual reality devices and the physical sensory effects occurring in the underwater environment is essential [4]. Thus, numerical modeling is needed for input data that is reliable and can express physical phenomena via virtual reality devices.
In underwater environments, the physical phenomena change regularly with the depth of the water, which means that humans also sense different stimuli according to the depth.

Table 1: Physical sense effect elements that can be experienced by users in underwater environments.

Sense effect element | Explanation
Buoyancy | A sense of buoyancy is felt, such as floating (positive buoyancy), sinking (negative buoyancy), or stopping (neutral buoyancy), depending on the control of buoyancy and weight.
Water pressure | As water pressure increases with depth, pressure is felt on body parts and on the air in organs such as the ears and lungs.
Water temperature | As the depth of the water increases, the water temperature decreases and warmth is reduced, resulting in a feeling of coldness.
Resistance | Movements underwater are met with a feeling of resistance and produce turbulence.
Thus, we need to calculate values for these physical phenomena at specific water depths in real time and deliver the resulting sensory information to participants' sensory organs in the real world. In our study, we simulated in real time those elements, among the physical sensory elements analyzed above, that can be calculated numerically. The numerical calculation procedure used with the physical phenomena information of underwater environments is depicted in Figure 2.
Among the physical phenomena of the underwater environment, buoyancy, water pressure, and water temperature can be measured or calculated as functions of water depth. As shown in Figure 2, the water temperature and salinity data are actual measured values, while the water pressure can be calculated directly from the water depth. The buoyancy used for the input interaction between users and the virtual underwater environment is influenced by water temperature, pressure, and density, and can be calculated from each user's volume and weight.
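As a concrete illustration (a minimal sketch, not the authors' implementation), the depth-dependent quantities described above follow the standard hydrostatic relations: absolute pressure P = P0 + ρgh, and net buoyant force F = ρVg − mg, whose sign distinguishes positive, negative, and neutral buoyancy. The constant seawater density and the function names below are illustrative assumptions.

```python
# Illustrative hydrostatic calculations for the depth-dependent
# physical phenomena (water pressure and buoyancy) in Table 1.
RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density (assumed constant)
G = 9.80665             # m/s^2, standard gravity
P_ATM = 101_325.0       # Pa, pressure at the surface

def water_pressure(depth_m: float) -> float:
    """Absolute pressure (Pa) at a given water depth: P = P0 + rho*g*h."""
    return P_ATM + RHO_SEAWATER * G * depth_m

def net_buoyancy(volume_m3: float, mass_kg: float) -> float:
    """Net upward force (N) on a submerged user: rho*V*g - m*g.
    Positive = floating, negative = sinking, near zero = neutral."""
    return RHO_SEAWATER * volume_m3 * G - mass_kg * G
```

At 10 m depth this yields roughly one additional atmosphere of pressure, matching the familiar diving rule of thumb.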

Multisensory Effect Reproduction System
We designed a multisensory effect reproduction system for virtual underwater simulation to provide users with the sensory effects of the various physical phenomena occurring in virtual underwater environments. The proposed system comprises four main parts, each with its own role. Figure 3 depicts the composition of the system and its information flow. The "immersive mask module" provides users with audiovisual information on the virtual underwater environment and measures users' respiration volume in real time. The "sensory effects and interaction module" provides a sensory effect reproduction mechanism through which users can experience the physical phenomena of the virtual underwater environment; it also extracts users' motion information and volume data. The "RV/VR adaptation engine" calculates the buoyancy and the numerical values of the sensory elements occurring in the virtual underwater environment and translates the sensory effect data into device commands. "Contents" is the sensory effect metadata defining the sensory effects and the virtual underwater environment, which is rendered in 3D in real time.

Immersive Mask Module.
The "immersive mask module" supports the audiovisual elements of the virtual underwater simulation and measures each user's respiration volume. It is provided as an interface similar to the devices currently used in real underwater environments. Using each user's respiratory patterns, such as the amount of inhalation and exhalation, it is possible to control buoyancy delicately and to express the audiovisual effects that occur during breathing. By providing not only an environment that is visually similar to the real one but also underwater auditory effects in the form of vibration, audiovisual immersion similar to that of real underwater environments is realistically simulated. The devices comprising the immersive mask module include an eMagin Z800 3DVisor display, a respiratory sensor consisting of an air-flow head and a Honeywell ASCX30DN pressure sensor, a three-axis (X, Y, and Z) head tracker, and a surface sound speaker. Figure 4 illustrates the flow of interaction between each device of the immersive mask module and the virtual underwater environment implemented in 3D.

Sensory Effects and Interaction Module.
The "sensory effects and interaction module" is divided into the sensory effect devices and the user motion interaction processing module. The sensory effect devices include a rotary motor that produces physical rotary movements according to users' arm motions, a vibrating device that generates vibration when turbulent resistance collides with structures, a fan that lowers the perceived temperature using wind, and an air bag that presses on users according to the degree of water pressure. The user motion interaction processing module includes a Kinect sensor [15] to recognize users' arm motions and control direction, and a BC jacket that controls users' buoyancy. These components are integrated into a chair, and the motion signals of each device are controlled through the "sensory device controller," implemented using a TinyPLC TPC9X-based integral PLC control board (TSB-14R). Figure 5 shows the composition of the sensory effect and interaction devices.

Real to Virtual (RV)/Virtual to Real (VR) Adaptation Engine Module.
The "RV/VR adaptation engine module" calculates numerical values in real time according to changes in the water depth of the virtual underwater environment, as depicted in Figure 2, and provides the buoyancy system that calculates buoyancy from users' respiration and volume data. This allows the water depth to be determined by each user's buoyancy and the 3D underwater environment to be refreshed in real time. The module also generates sensory effects for changes in buoyancy, water pressure, water temperature, and resistance as the water depth of the 3D underwater environment changes. Here, the sensory effect information of the virtual underwater environment is redefined based on the Sensory Effect Metadata (SEM) structure defined in the MPEG-V standard [16] for sensory effects. These data are formed as standardized XML instances, changed into device commands by a sensory effects manager, and then reproduced as sensory effects by devices in the real world.
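The real-time loop described above, in which respiration-driven buoyancy determines the water depth each frame, can be sketched as follows. This is a hedged illustration only: the constants, the crude linear drag term, and all function and parameter names are assumptions, not the engine's actual code.

```python
# Sketch of the per-frame depth update: the user's current lung volume
# changes total displaced volume, net buoyancy becomes a vertical
# acceleration, and depth is integrated forward in time.
RHO = 1025.0    # kg/m^3, assumed seawater density
G = 9.80665     # m/s^2

def step_depth(depth, velocity, body_volume, lung_volume, mass,
               dt=1 / 60, drag=5.0):
    """Advance depth (m, downward positive) by one frame of dt seconds."""
    volume = body_volume + lung_volume          # inhaling adds displaced volume
    net_force = mass * G - RHO * volume * G     # downward positive
    net_force -= drag * velocity                # crude water-resistance term
    accel = net_force / mass
    velocity += accel * dt
    depth = max(0.0, depth + velocity * dt)     # clamp at the surface
    return depth, velocity
```

With these illustrative numbers, a full inhalation makes the simulated diver rise slightly each frame, while an exhalation makes the diver sink, mirroring breath-based buoyancy control in real diving.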

3D Simulation Contents and Sensory Effect Metadata.
We extended the virtual underwater world framework proposed by Kim et al. [17] with the ability to change the kind of underwater objects (fish and seaweed), water color, and brightness of light in the environment. Figure 6 shows the 3D underwater space constructed via 3D modeling and the screen implementing the 3D simulation. Users experience visual immersion in the 3D simulated contents through the HMD screen of the immersive mask module.
Based on Part 3 of the MPEG-V standard [18], which specifies sensory information, we describe and deliver the sensory effect information for the sensory elements that change according to water depth in the 3D simulation. Common to all sensory effects are the activation, intensity-range, intensity-value, and pts information. Intensity-value and pts are determined by the sensory numerical information calculated from changes in water depth, whereas the effect's start and end times are governed by the occurrence of the messages "TWindEffectOn" and "TWindEffectOff," respectively. Figure 7(a) shows the schema of the sensory effect metadata defined in MPEG-V, and Figure 7(b) shows the sensory effect XML instance redefined in the RV/VR adaptation engine.
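To make the attribute set above concrete, the snippet below serializes a wind-effect element carrying the four common fields as an XML instance. This is a simplified stand-in: the element and attribute names are illustrative and do not reproduce the exact MPEG-V SEM schema (which uses namespaced types).

```python
# Sketch of serializing one sensory effect as an XML instance with the
# common fields (activation, intensity-range, intensity-value, pts).
# Names are simplified stand-ins for the MPEG-V SEM schema.
import xml.etree.ElementTree as ET

def wind_effect_xml(intensity: float, pts: int, activate: bool = True) -> str:
    effect = ET.Element("WindEffect", {
        "activate": "true" if activate else "false",
        "intensity-range": "0 100",
        "intensity-value": f"{intensity:.1f}",
        "pts": str(pts),   # presentation timestamp for synchronization
    })
    return ET.tostring(effect, encoding="unicode")
```

A sensory effects manager would then parse such instances and translate them into device commands, as described above.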
The multisensory effect reproduction system is designed to stimulate each user's somesthetic and kinesthetic senses in accordance with changes in water depth and the user's motions, and it is linked to the 3D underwater environment simulation. It enables users to directly feel the various physical sensory effects occurring in the underwater environment, and in the virtual 3D underwater environment users can navigate using the same motions as in real underwater environments. Figure 8 shows the procedure used for multisensory effect reproduction in the proposed virtual underwater simulation system. Figure 9 depicts the complete system in its final form, integrated from the element modules: (a) the immersive mask module, (b) the multisensory effect reproduction module, (c) the user motion recognition module, (d) a control box including a PC, and (e, f) simulation-scene monitoring screens provided to users.

Test Methods and Environment.
We conducted an experiment to verify the efficacy of the proposed system, analyzing the influence of the multisensory effect reproduction environment for virtual underwater simulation on presence and usability. To this end, we selected a total of 10 participants with experience in marine activities such as scuba diving. After experiencing three virtual underwater simulation systems, each representing a different type of virtual reality system, the participants completed two questionnaires. Based on the presence and usability subjectively felt by the participants in each test environment, we conducted statistical verification using analysis of variance (ANOVA).
To measure the degree of presence felt by the participants, the Presence Questionnaire (PQ) [19] was used. To measure the usability of the system, the System Usability Scale (SUS) questionnaire [20], which is widely used to evaluate the overall usability of systems, was used. In each test environment, the participants experienced a virtual underwater simulation. Table 2 summarizes the contents and characteristics of the test environments; "Test Environment 3" is the system environment proposed in this paper.
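For reference, the SUS yields a 0-100 score from ten 5-point Likert items using Brooke's standard scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5. A minimal sketch of that scoring (the function name is ours):

```python
# Standard SUS scoring: ten 5-point Likert responses -> 0..100 score.
def sus_score(responses):
    """responses: list of ten integers in 1..5, in questionnaire order."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r   # i=0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5
```

Per-participant scores computed this way are what the per-environment SUS means in Table 4 summarize.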

Analysis and Evaluation of Test Results.
We used the IBM SPSS statistical program to analyze the PQ and SUS questionnaires. The results of the PQ survey, conducted to evaluate the presence of each test environment, are shown in Table 3. As shown in the table, our proposed system environment (Test Environment 3) scored the highest (107.40 points), followed by the immersive simulation environment (Test Environment 2) with 86.90 points and the desktop-type simulation environment (Test Environment 1) with 61.00 points. The test statistic (F = 31.117, p < 0.001) indicates that these differences were statistically significant.
The posttest (Scheffe) indicates that there were highly significant differences between Test Environment 1 (a), which provided only simple audiovisual senses, and Test Environment 3 (c), which provided a multisensory reproduction environment. This result indicates that a multimodal underwater simulation with expanded sensory elements improves users' presence over a virtual underwater simulation providing only audiovisual senses.
The results of the SUS survey, conducted to evaluate the usability of each test environment, are shown in Table 4. As shown in the table, the desktop-type simulation environment (Test Environment 1) received the highest score (72.00 points), followed by the multimodal virtual reality environment (Test Environment 3) with 57.25 points and the immersive virtual reality environment (Test Environment 2) with 54.75 points. The test statistic (F = 10.806, p < 0.001) indicates that these differences were statistically significant. The posttest (Scheffe) indicates that the immersive virtual reality environment (b) and the multimodal virtual reality system environment (c), which uses the proposed methods, were similar to each other while differing significantly from the desktop-type virtual reality environment (a). This result may have occurred because a familiar audiovisual environment such as the desktop-type virtual reality environment supports users' experiences and is used by them on a daily basis.
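The one-way ANOVA underlying the F values reported above can be reproduced with the standard F statistic, the between-group mean square over the within-group mean square. The pure-Python sketch below shows the computation; the group lists passed in are caller-supplied (the study's raw per-participant scores are not reproduced here).

```python
# One-way ANOVA F statistic, as used for the PQ and SUS comparisons
# across the three test environments.
def one_way_anova_f(groups):
    """groups: list of lists of per-participant scores, one list per condition."""
    k = len(groups)                                   # number of conditions
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # between-group sum of squares (df = k - 1)
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares (df = n_total - k)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n_total - k))
```

The resulting F is then compared against the F distribution with (k − 1, n − k) degrees of freedom to obtain the p value, which SPSS reports directly.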
Summarizing the results for presence and usability, the presence of a virtual underwater simulation can be improved by providing multimodal effects on top of audiovisual effects. This accords with the finding of Fontaine [6] that "the approach widening the range of sensory elements creates bigger presence." In particular, for a virtual underwater simulation in which the senses vary with water depth, presence can be improved through the expansion of multimodal effects, including the types of senses present in the real environment, their similarity, and interaction elements.
In terms of usability, we obtained similar results for the immersive virtual reality environment (Test Environment 2) and the multimodal virtual reality environment (Test Environment 3). Using additional interface devices, however, can reduce usability compared with the simple desktop-type virtual reality environment (Test Environment 1). This may be attributable to users' lack of experience with the sensory effect devices or to the devices' wearability, issues that should be resolved in the methods used to express sensory elements. In particular, continued research on sensory effect devices is required.

Conclusions
The ultimate objective of virtual underwater simulation using virtual reality technology is to give users an opportunity to experience virtual underwater environments realistically. To provide improved presence when experiencing an underwater simulation, enhanced methods addressing sensory immersion, sensory fidelity, and interaction are needed. In this paper, we proposed a system in which users can experience realistic simulations of the phenomena occurring in actual underwater environments, thereby overcoming the limitation to a narrow set of sensory perception elements identified as a weakness of existing virtual underwater simulation systems. To verify the efficacy of the proposed system and methods, we recruited participants and conducted an experiment on presence and usability, the primary evaluation elements of virtual reality underwater simulation systems; our proposed multimodal effect reproduction system maintained its usability while improving presence.
The virtual reality-based underwater simulation system proposed in this paper makes it possible to conduct underwater training without spatial or time limitations, for example in marine leisure sports training such as scuba diving. Its use can also be varied for specific purposes such as virtual reality games, marine environment and ecosystem education, and the treatment of deep-sea phobia. In the future, research is needed that normalizes the characteristics of sensory signals through detailed analysis of the sensory elements occurring in real underwater environments. Further studies are also needed on sensory effect interfaces that can provide users with more realistic sensory stimulation, and on increasing usability through natural wearability.