A 3D radiation image display on a simple virtual reality system created using a game development platform

The Fukushima Daiichi Nuclear Power Station (FDNPS), operated by Tokyo Electric Power Company Holdings, Inc., suffered a meltdown after the large tsunami caused by the Great East Japan Earthquake on March 11, 2011. The measurement of radiation distribution inside the FDNPS buildings is indispensable for executing appropriate decommissioning tasks in the reactor buildings. In addition, it is extremely important to accurately predict the location of radioactive contamination beforehand because the working time is limited owing to the radiation exposure of workers. In this paper, a simple virtual reality (VR) system that can display radioactive substances in virtual space has been developed to simulate real working environments. A three-dimensional (3D) photo-based model of the real working environment, including an image of the radioactive substance, was imported into the virtual space of the VR system. The developed VR system can be accessed using a smartphone and a cardboard goggle. The VR system is expected to be useful for preliminary training of workers and for recognizing radioactive hotspots in the decommissioning work environment.

Keywords: Models and simulations; Radiation monitoring

Introduction
A large amount of radioactive substances was spread over a wide area because of the Fukushima Daiichi Nuclear Power Station (FDNPS) accident that occurred on March 11, 2011. The accident was caused by a large tsunami triggered by the Great East Japan Earthquake. As a result, decommissioning tasks inside and outside the reactor buildings are underway. Radiation distribution measurements inside the FDNPS buildings are essential to execute appropriate decommissioning tasks in the reactor buildings. Specifically, mapping the radiation distribution, which indicates the distribution of radioactive substances inside the buildings, is extremely important for predicting potential risks to workers and, consequently, decreasing radiation exposure. This type of map would help workers recognize the locations and shapes of radioactive contamination in their work environments. In addition, information about the distribution of radioactivity would be effective during the formulation of detailed decontamination plans. Therefore, improving radiation measurement methods is especially important for mapping radiation distribution.
For the FDNPS buildings, three-dimensional (3D) visualization of the location of radioactive substances is necessary because, inside these buildings, radioactive contamination covers the ceiling, walls, floor, and many building structures. We have developed a compact Compton camera to measure and visualize radioactive substances scattered in the FDNPS. The details of our Compton camera are described in refs. [1,2]. This camera is based on the technology of a handheld Compton camera developed jointly by Waseda University and Hamamatsu Photonics [3,4]. The gamma-ray sensor of the Compton camera comprises two components: a scatterer and an absorber. The sensor employs a Ce-doped GAGG (Gd3Al2Ga3O12) scintillator coupled with a multi-pixel photon counter (MPPC; Hamamatsu Photonics K.K.) [5,6]. Each scintillator has a 15 × 15 pixel array, and the pixel sizes in the scatterer and absorber are 1.5 mm × 1.5 mm × 5 mm and 1.5 mm × 1.5 mm × 10 mm, respectively. The distance between the two gamma-ray sensors is 23.5 mm. The Compton camera weighs less than 1.0 kg.
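The opening angle of each event's Compton cone follows from the two energy deposits via the Compton scattering formula, cos θ = 1 − m_e c² (1/E′ − 1/E0), where E0 is the incident energy (the sum of the deposits in the scatterer and absorber) and E′ is the scattered-photon energy absorbed in the absorber. A minimal sketch of this reconstruction step (illustrative only, not the camera's actual analysis code):

```python
import math

M_E_C2 = 511.0  # electron rest energy in keV

def compton_cone_angle(e_scatter_kev, e_absorb_kev):
    """Opening angle (rad) of the Compton cone from the two energy deposits.

    e_scatter_kev: energy deposited in the scatterer
    e_absorb_kev:  scattered-photon energy, absorbed in the absorber
    Returns None if the energies are kinematically inconsistent.
    """
    e_incident = e_scatter_kev + e_absorb_kev
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_absorb_kev - 1.0 / e_incident)
    if not -1.0 <= cos_theta <= 1.0:
        return None  # reject events outside the physical range
    return math.acos(cos_theta)
```

For a 662 keV photon from 137Cs, each event with valid deposits yields one cone whose surface contains the source direction.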
JINST 13 T08011

In a previous work, 137Cs sources enclosed in cardboard boxes were measured using a Compton camera, and the 137Cs images were drawn on a 3D structural model; we thereby achieved successful 3D-location identification of radioactive substances [11]. In the previous method for radiation imaging using a Compton camera, information about the location of radioactive substances cannot be obtained in three dimensions because the Compton camera estimates only the direction of radioactive substances. Takeuchi et al. [7] demonstrated by means of a Geant4 simulation that the position of the radiation source can be determined in 3D through measurements from different viewpoints using a Compton camera [8-10]. In addition, we confirmed experimentally that the positions of radiation sources can be determined in 3D space by using the cone data of the Compton camera measured from different viewpoints [2]. However, workers must be able to visually recognize radioactive substances in a real space. In the measurement scheme involving previously developed gamma cameras, including Compton cameras, a radioactive substance is recognized by superimposing a two-dimensional (2D) radiation image on a 2D optical photograph of the experimental space. In the new method developed herein, a 3D structural model of the experimental space is constructed using photogrammetry on a computer. The 3D radiation images obtained using the Compton camera measurements described above are superimposed on the 3D structural model, which facilitates the visualization of radioactive substances in 3D. The use of a 3D structural model based on photographs allows for easy visualization of the location of radiation sources in real spaces [11].
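The multi-viewpoint localization idea can be sketched as a simple voxel voting scheme: each measured cone is back-projected into a voxel grid, and voxels consistent with cones from several viewpoints accumulate votes that peak at the source position. The toy geometry and tolerance below are illustrative assumptions, not the authors' reconstruction pipeline:

```python
import itertools
import math
from collections import Counter

def cone_backproject(voxels, apex, axis, half_angle, tol=0.1):
    """Return voxel centers lying (within tol rad) on a Compton cone.

    apex: detector position; axis: unit vector along the cone axis;
    half_angle: reconstructed scattering angle in radians.
    """
    hits = set()
    for v in voxels:
        d = tuple(v[i] - apex[i] for i in range(3))
        norm = math.sqrt(sum(c * c for c in d))
        if norm == 0.0:
            continue
        cos_a = sum(d[i] * axis[i] for i in range(3)) / norm
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if abs(angle - half_angle) < tol:
            hits.add(v)
    return hits

# Coarse voxel grid (0.5 m pitch) around the suspected source region.
grid = [(x * 0.5, y * 0.5, z * 0.5)
        for x, y, z in itertools.product(range(-4, 5), repeat=3)]

# Three 45-degree cones from different viewpoints, all passing through
# the origin, where a toy point source is assumed to sit.
cones = [((2.0, 0.0, 2.0), (0.0, 0.0, -1.0), math.pi / 4),
         ((-2.0, 0.0, 2.0), (0.0, 0.0, -1.0), math.pi / 4),
         ((0.0, 2.0, 2.0), (0.0, 0.0, -1.0), math.pi / 4)]

votes = Counter()
for apex, axis, half_angle in cones:
    for v in cone_backproject(grid, apex, axis, half_angle):
        votes[v] += 1

source_estimate = max(votes, key=votes.get)  # a voxel with the most votes
```

With more cones and a finer grid, the vote maximum narrows onto the true source position; a real implementation would also weight each cone by its angular resolution.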
Measurements inside the FDNPS buildings pose another problem: the high levels of radioactivity prevent workers from entering the buildings or staying inside for long periods. Therefore, remote measurement using a robot equipped with a radiation detector is especially important. A remote radiation imaging system using a Compton camera mounted on a multicopter drone was developed to remotely measure the distribution of radioactive contamination inside the FDNPS buildings [1]. Outdoor evaluation tests in the coastal area (Hamadori region) of Fukushima indicated that the drone system successfully observed several hotspots from the sky.
In the present study, we developed a simple virtual reality (VR) system that displays images of radioactive substances measured using the above-mentioned methods in a virtual space that reproduces the working environment. By applying such a VR system to the FDNPS decommissioning work, workers can locate radioactive hotspots in the virtual space before entering the FDNPS buildings. Inside the FDNPS buildings, it is extremely important for the workers to be aware of the locations of radioactive hotspots beforehand because the working time is limited owing to radiation exposure.

Radiation image display on a simple virtual reality system
The developed VR system comprises three key elements: a 3D model of the experimental space, images of radioactive substances, and devices for experiencing the virtual space. Initially, the 3D model and the images of radioactive substances in the experimental space were constructed in the virtual space. Figure 1(a) shows a superposition of the images of 137Cs radioactive sources acquired using the Compton camera on the 3D structural model of the experimental space obtained by photogrammetry. Details of both the 3D visualization of radioactive substances and the construction of the 3D structural model have been discussed in our previous work [11].
In total, about 200 photographs of the meeting room were taken, including the room's objects such as a table, a chair, and a cardboard box, as shown in figure 1(b). The size of the experimental space was 3.8 m × 4.8 m. The 3D structural model was constructed using the photogrammetry software VisualSFM [12,13]. The algorithm for 3D reconstruction of objects by using photogrammetry is shown in figure 2. First, feature points are extracted from each photograph; they are assigned to the edges and corners of the objects in the photograph. Then, feature-point matching is performed to estimate which points correspond to each other in a pair of photographs. Next, the camera position at which each photograph was taken is estimated using the common feature points, and simultaneously, the positions of the common feature points are determined three-dimensionally by using the principle of triangulation. Finally, point cloud densification is performed on the sparse 3D point cloud reconstructed in the preceding steps. A more detailed explanation is given in refs. [12-15].
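The triangulation step above can be illustrated with the classic midpoint method: given the same feature point seen as rays from two estimated camera positions, its 3D position is the point closest to both rays. This is a simplified stand-in for the full bundle-adjustment machinery inside VisualSFM:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Midpoint triangulation: the 3D point closest to two viewing rays.

    p1, p2: camera centers; d1, d2: viewing directions (need not be unit).
    Assumes the rays are not parallel.
    """
    w = sub(p2, p1)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    denom = a * c - b * b  # zero only for parallel rays
    # Ray parameters of the mutually closest points on each ray.
    t = (dot(w, d1) * c - dot(w, d2) * b) / denom
    s = (dot(w, d1) * b - dot(w, d2) * a) / denom
    q1 = tuple(p1[i] + t * d1[i] for i in range(3))
    q2 = tuple(p2[i] + s * d2[i] for i in range(3))
    return tuple((q1[i] + q2[i]) / 2 for i in range(3))
```

With noise-free rays the two closest points coincide and the midpoint is exact; with real feature noise the midpoint gives a least-squares-style compromise between the two rays.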
The 3D radiation images were created using the Compton camera. Two 137Cs radioactive sources enclosed in cardboard boxes were placed in the experimental space. The activities of the sources located on the floor surface and on the table were 0.95 MBq and 8.8 MBq, respectively. The radiation sources were measured from different viewpoints by using the Compton camera to obtain these 3D images. Then, the radiation images were drawn on the 3D structural model to visually recognize the radioactive sources in real space.
In a new attempt, the 3D structural model and the radiation image were reproduced using a VR system, which was developed in the Unity 5.6 environment [16]. Unity is a game development engine that incorporates an integrated development environment and supports multiple platforms for 3D and 2D video games and simulations on computers, consoles, and mobile devices running iOS and Android. Unity itself runs on PCs with Windows or macOS.
Unity has been applied not only to games but also to virtual reality technology in various fields (see section 3). We used Unity to display images of radioactive substances in a 3D structural model of the radioactive working environment.
The development was performed on a PC, and its specifications are listed below. Initially, the 3D structural model and the radiation image were imported into Unity and reproduced in virtual space. All of these 3D images were prepared as point cloud data and imported into Unity. Point cloud data represent an aggregation of points, where each point has 3D position coordinates and RGB color information. We used Point Cloud Free Viewer, a Unity plug-in that allows point cloud data to be imported and displayed in Unity [16]. In this way, 3D images can be imported without programming. Figure 3 shows the 3D stereo views created using Unity and those displayed on the VR system. Figures 3(a)-(c) show the same experimental space from different viewpoints. The viewpoint of the worker in the virtual space can be set arbitrarily. Because the height of the eye position and the field of view can be changed arbitrarily, the height of the observer can be taken into account in the VR system. By appropriately setting the height of the eye position, the positional relationship between the observers and the radioactive substances was reproduced. In addition, we can obtain the distance between the observers and the radioactive substances by incorporating actual distance information into the 3D structural model beforehand. It is conceivable to measure the dimensions of the working environment by using the laser range finder (LRF) system described later and add this distance information to the 3D structural model.
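As a rough illustration of what such point cloud data look like, the following parses a hypothetical whitespace-separated "x y z r g b" text layout into position/color pairs (Point Cloud Free Viewer reads several real formats; this particular layout is assumed only for the sketch):

```python
def parse_xyzrgb(text):
    """Parse a simple ASCII point cloud: one 'x y z r g b' record per line.

    Returns a list of ((x, y, z), (r, g, b)) tuples; blank lines and
    '#' comments are skipped. Illustrative format, not a real standard.
    """
    points = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        x, y, z, r, g, b = line.split()
        points.append(((float(x), float(y), float(z)),
                       (int(r), int(g), int(b))))
    return points

sample = """
# x y z r g b
0.0 0.0 0.0 255 0 0
1.5 2.0 0.5 0 128 255
"""
```

Each parsed point carries exactly the two pieces of information described above: a 3D position and an RGB color.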
The standing position of the observer changes, and the radioactive substances may be observed from arbitrary positions and viewpoints in virtual space, as shown in figures 3(a)-(c). In addition, information such as the dose rate or radioactivity of the radioactive substances can be added and displayed inside the virtual space, as shown in figure 3(c). It is also possible to display arbitrary information in the virtual space, such as the type and condition of each building structure. This may be useful for planning the work that must be done at the actual work site. The developed VR system may be experienced using a smartphone and a cardboard goggle (VR goggle; Hacosco Inc.). The size of the cardboard goggle is 162 mm × 130 mm × 85 mm, and the maximum size of smartphone it can hold is 160 mm × 80 mm × 10 mm. The 3D stereo view was displayed on the smartphone, as shown in figure 3(d). Users set the smartphone inside the cardboard goggle and looked into it to locate the radioactive substances in virtual space. A binocular-type lens was adopted for the goggle because such a setup achieves a more immersive feeling than a single lens. An iOS device (iPhone 6s) was used as the smartphone and installed inside the cardboard goggle. The advantage of using smartphones is that they incorporate a gyro sensor and an accelerometer, which enable head tracking in the VR system. The head-tracking mechanism was incorporated into the VR system by using a Unity plug-in called Dive (Durovis Dive Software Development Kit, Shoogee GmbH & Co. KG), which works across the iOS and Android platforms [17]. The values measured by the smartphone sensors were acquired by Unity and synchronized with the viewpoint of the observer set in virtual space. For example, if users move their head to the left or to the right, they can look around to the left and right in virtual space.
In addition, it is possible to look down at the floor. A simple VR system that can display radioactive substances in the working environment in 3D was hence developed successfully.
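The head-tracking step amounts to mapping the gyro-derived head orientation onto a viewing direction in virtual space. A minimal sketch under an assumed axis convention (not the Dive plug-in's actual API):

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Map a head orientation (yaw, pitch in degrees) to a unit view vector.

    Convention assumed here: yaw rotates about the vertical (y) axis,
    pitch tilts up/down; yaw = 0, pitch = 0 looks along +z.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x: left/right
            math.sin(pitch),                   # y: up/down
            math.cos(pitch) * math.cos(yaw))   # z: forward
```

Turning the head 90 degrees to the right swings the view vector to +x, and pitching down by 90 degrees points it at the floor, matching the behavior described above.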
In addition, we describe an application example for the visualization of radioactive substances released owing to the FDNPS accident. Radioactive substances scattered in the outdoor environment of the Hamadori region in Fukushima were visualized and reproduced by the VR system, as shown in figure 4. A 3D model of the outdoor environment was constructed via photogrammetry by using Agisoft PhotoScan [15], and images of the radioactive hotspot were superimposed on the 3D model of the outdoor environment. The procedure for performing photogrammetry is the same as that shown in figure 2. For radiation measurement, we mounted the Compton camera on a small unmanned helicopter. The helicopter system hovered above the measurement area, and the Compton camera remotely visualized the hotspot caused by 137Cs deposited on the road surface. In addition, we have conducted a similar measurement for visualizing radioactive hotspots by using a multicopter drone equipped with the Compton camera [1].
This VR system makes it easy to experience even wide areas such as outdoor environments by using only a smartphone and a cardboard goggle, without requiring special equipment. In addition, because there are many structures in the wide area inside the FDNPS buildings, we believe that a VR system that visualizes radioactive substances on a 3D structural model is useful for gaining experience of the actual work environment.

Characteristics of component technologies and future prospects
The significance of the technologies used for the development of the VR system is described in this section. In recent years, game engines such as Unity (used in this work) and Unreal Engine have been increasingly used not only in games but also in fields such as architecture, medical care, and academia [18]. This tendency is particularly strong among researchers working with VR systems. Because Unity is a popular engine for mobile devices, it is easier to achieve high performance on devices with lower specifications than with the Unreal Engine, which was developed for high-end devices. Unity is therefore appropriate for a VR system built by combining a smartphone and cardboard goggles.
Next, we discuss the reconstruction of the 3D structural model. Because photogrammetry performs reconstruction by using photographs, it is easy to recognize the condition of the environment and the types of objects present therein. In addition, no special equipment is required, because photographs taken with ordinary digital cameras can be used.
In addition to photogrammetry, we are developing a method to acquire a 3D structural model of the work environment by using an LRF system. A pulsed laser scans objects and scenes to obtain 3D point cloud data of the experimental space. In the dark environment inside the FDNPS buildings, image quality deteriorates, and the precision of photogrammetric reconstruction decreases accordingly. The LRF system can perform measurements in such dark environments because it employs laser scanning. Last year, we succeeded in drawing a 3D radiation distribution map by integrating the radiation image measured using the Compton camera with point cloud data of the inside of the turbine building of Unit 3 of the FDNPS, which were acquired using a scanning LRF [19]. A 3.5 mSv/h hotspot was detected using the Compton camera and visualized on the 3D radiation distribution map. The point cloud data reflect the dimensions and appearance of the actual environment, so the worker can grasp the position and spread of the hotspot more precisely. We plan to import the 3D radiation distribution map into the VR system in the future. In addition, applications using an LRF, such as an LRF mounted on an unmanned aerial vehicle (UAV), have already been developed for creating 3D structural models of various scenes from a moving UAV [20]. We are focusing on this data acquisition method for creating VR data.
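Integrating a radiation image with LRF point cloud data can be pictured as tagging the cloud points near a detected hotspot so they can be recolored on the map. The following brute-force proximity sketch is illustrative only, not the authors' integration pipeline:

```python
def tag_hotspot_points(points, hotspot, radius):
    """Return indices of point-cloud points within `radius` of a hotspot.

    points: list of (x, y, z) tuples from the LRF scan;
    hotspot: (x, y, z) position reconstructed by the Compton camera.
    (Illustrative tagging step; a real pipeline would use a spatial index.)
    """
    tagged = []
    for i, p in enumerate(points):
        if sum((p[k] - hotspot[k]) ** 2 for k in range(3)) <= radius ** 2:
            tagged.append(i)
    return tagged
```

The tagged indices could then be assigned a contrasting color before the combined cloud is displayed, making the hotspot's position and spread visible on the structural model.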
Next, we use smartphones for the VR system because smartphones are widely available to the majority of users. In addition, users can experience the VR system at low cost by combining a smartphone with cardboard goggles, unlike with dedicated VR headsets.
In addition to the VR system, we plan to use augmented reality (AR) devices in the future. AR adds information such as images and movies to a real-world scene by using devices such as smartphones and AR goggles. By using such technology, images of radioactive substances can be displayed in the actual working environment inside the FDNPS. Workers will be able to move and work while avoiding radioactive substances by looking at the image displayed on the AR device. In addition, the development of an interface that displays both VR and AR on a single device has been reported [21]. We believe that these technologies should also be applied to decommissioning work, including the visualization of radioactive substances, in the future.
Finally, we describe future developments based on the inputs of workers. In this system, the image of the radioactive substance can be displayed on the 3D model of the working environment inside the VR space. Workers want to know the air dose rate distribution of the working environment from the viewpoint of exposure dose control. We are therefore considering displaying not only the location and shape of radioactive hotspots but also the air dose rate in the working environment by using color contours. This would make it possible to anticipate the dose exposure over the course of the work in advance.
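Such a color contour can be sketched as a log-scale mapping from air dose rate to an RGB color, with low rates drawn blue and high rates red. The range limits below are illustrative values, not from this work:

```python
import math

def dose_rate_color(dose_msv_h, lo=0.01, hi=10.0):
    """Map an air dose rate (mSv/h) to an RGB color for contour display.

    Blue (low) to red (high) on a logarithmic scale; dose rates outside
    [lo, hi] are clamped. lo/hi are illustrative, assumed bounds.
    """
    d = min(max(dose_msv_h, lo), hi)
    f = (math.log10(d) - math.log10(lo)) / (math.log10(hi) - math.log10(lo))
    return (int(255 * f), 0, int(255 * (1 - f)))
```

Applying this mapping per point of the structural model would paint the air dose rate distribution directly onto the virtual working environment.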

Conclusions
In this work, a simple VR system that can display radioactive substances in virtual space was developed to reproduce actual working environments. The developed VR system comprises three key elements: the 3D model of the experimental space, images of radioactive substances, and devices for experiencing the virtual space. The 3D model and the images of the radioactive substances were created using photogrammetry and a Compton camera, respectively. The VR environment was experienced using a smartphone and a cardboard goggle. In this paper, radioactive substances placed in a meeting room and deposited in the outdoor environment of the Hamadori region in Fukushima prefecture were visualized in VR space. Users can observe the radioactive substances from arbitrary viewpoints in virtual space. The VR system is expected to be useful for preliminary training of workers and for detecting radioactive hotspots in the decommissioning work environment. Additionally, we are considering using AR technology in working environments inside the FDNPS. Images of radioactive substances can be displayed in an actual working environment by using AR devices, and workers can work while anticipating radioactive hotspots in the actual working environment.