DEVELOPING AR INTERFACE FOR INTERIOR MODELLING FROM 3D SENSOR

This paper presents the process of developing an Augmented Reality (AR) application that represents an existing building interior captured with a 3D sensor. AR is becoming well known and is widely used in areas such as the gaming industry, medical training, and education. Given its advantages, AR development for building interiors is also a necessity, specifically to help people understand a scene better in Architecture, Engineering, Construction / Facility Management (AEC/FM) applications as well as in businesses such as interior design and household marketing. The travel restrictions imposed during the COVID-19 pandemic make AR especially beneficial, as it allows people to visualize an interior without having to visit it. With the aid of a suitable device such as a 3D sensor, the development of an AR representation of an existing building interior, which is usually full of clutter and occlusion, can be made easier.


Introduction
Nowadays, Augmented Reality (AR) is recognised as one of the most rapidly changing technologies. By stimulating our senses with computer-generated images, it can momentarily absorb our minds in the augmented sensation, to the point that AR has been described as another true version of reality [1]. AR is a medium for visualizing a scene in digital form, where it can be shared and accessed easily with the help of the internet.
A three-dimensional (3D) sensor is a sensor that captures a scene within its field of view and produces depth data. A 3D sensor is able to scan an interior environment, and with suitable processing a visualization can be developed so that others can see a realistic representation of the scene.
3D scene perception of indoor environments at a human scale is becoming increasingly relevant in many disciplines. Law enforcement, public safety agencies, and first responders benefit from 3D models by gaining full situational knowledge of an indoor environment. As-built 3D building models, for example, can assist owners and managers in preserving and maintaining their property [2].
A 3D sensor can also be used in Building Information Modelling (BIM), for instance to obtain an overview of a room or building by scanning it entirely, or when the original CAD drawing of an old building is missing. A 3D sensor can deliver richer textural information about a room in a short time compared to redrawing the CAD plan from manual measurements. For these reasons, a thorough experiment and analysis are required to assess the use of a 3D sensor for interior modelling and visualization on an Augmented Reality platform. This project uses data from a 3D sensor, processed appropriately, to create an Augmented Reality experience.

Three-Dimensional Sensor
Three-dimensional (3D) technology is an important scientific breakthrough. It is a depth-sensing technology that augments camera capabilities for visual perception. By employing 3D technology, the length, width, and height of a real-world object can be captured with great clarity and in-depth detail. In this work, a 3D sensor called the Structure sensor is used.
The Structure sensor is a 3D sensor developed by Occipital. It is supported by an open-source platform, and when connected to a tablet, mobile phone, or computer it acts as a mobile Structured Light System (SLS). It is suitable for body scanning, medical, canvas, and education applications [3]. The Structure sensor consists of an infrared sensor, an infrared projector, and a Light Emitting Diode (LED). With the aid of the RGB camera on an iPad, for example, it can operate as an SLS that is able to scan the environment. This sensor has also been used for modelling interior environments [4], [5]. The specifications of the Structure sensor are stated in Table 1.
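As a rough illustration of the structured-light principle behind such a sensor, depth can be triangulated from the disparity between where the infrared pattern is projected and where the camera observes it. The sketch below uses made-up focal length, baseline, and disparity values, not the Structure sensor's actual calibration:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate depth for a structured-light system.

    focal_px     : camera focal length in pixels
    baseline_m   : projector-camera baseline in metres
    disparity_px : observed shift of the projected pattern in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative values only (not the Structure sensor's real parameters):
# a 580 px focal length, 65 mm baseline, and a 25 px pattern shift.
z = depth_from_disparity(580.0, 0.065, 25.0)
print(f"estimated depth: {z:.3f} m")
```

This inverse relationship between disparity and depth is also why such sensors lose precision at longer ranges, which matters when scanning a large interior.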

Augmented Reality
Augmented Reality (AR) is one of the main technological developments, and as AR-ready smartphones and other applications become more available around the globe, it can only grow bigger. AR lets us see the real-life environment right in front of us, augmented with virtual content. With advances in AR technology, such scenarios are not far from what may already be available on your smartphone. In fact, AR is readily accessible and used in a range of ways, such as Snapchat glasses, applications (apps) that help you locate your car in a busy parking lot, and retail apps that let you try on clothes without ever leaving home. AR is widely used around the world for applications such as medical training, education, retail, marketing, repair and maintenance, business logistics, and the tourism industry [6]. AR has several advantages. First, it blurs the line between the virtual and real worlds, increasing its usability and effectiveness in applications. Next, the success or failure of a scenario can be determined using the computing power of AR, saving considerable money. It also possesses a highly interactive nature, which enables several scenarios to be assessed in advance. Finally, it has strong applications in the field of health, enhancing the precision of disease diagnosis.
Figure 1 shows the flowchart representing the overall process of developing the AR for interior modelling. The process is divided into four sections. The first section is data acquisition, which covers the process and methods used to acquire the dimensions for the floor plan and the 3D data of the rooms for the AR. The second section is data processing, which is divided into two parts: creating the layout of the room, and merging the 3D data whenever necessary, specifically for data of a large interior. The third section is data reconstruction, in which the 3D data are processed and modelled. The last section is the AR development, where the interior model is visualized on the AR platform.

Data Acquisition
The data acquisition process is divided into two parts. The first part is gathering the measurements of the room or space; this was done by measuring the dimensions with a laser measure. The chosen places for this project were the corridor outside a lecturer's room, the lecturer's room, and the Computer and Instrumentation Laboratory of the Faculty of Electrical Engineering Technology, Universiti Malaysia Perlis (UniMAP). Figure 2 to Figure 4 show the selected scenes with their measurements. The second part is the acquisition of the 3D data of each room, captured using the Structure sensor and an iPad. The Structure sensor is mounted to the iPad using the provided bracket to collect the 3D data. The 3D data of each room are acquired by capturing the scene with an app called Canvas.

Data Processing
The data processing section is divided into two parts. The first part is the creation of the room layout, done using design software named Sweet Home 3D, an open-source application developed specifically for designing a room or house [8]. The second part is merging all the 3D data of the room from the Structure sensor. This is done using MeshLab, an open-source application that provides tools to process and model the data from the Structure sensor, such as filtering, merging, and aligning the 3D data [7].
For the second part of data processing, which is to merge all the data into one mesh, the point-based glue method in the provided align tool was utilised to connect and align all the data representing the large interior (the lab). The point-based glue method is a tool for aligning and gluing two or more 3D datasets, and it works based on the Iterative Closest Point (ICP) method.
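The core step that ICP iterates can be sketched as follows: given a set of corresponding points between two scans, find the rigid rotation and translation that best maps one onto the other in the least-squares sense (the Kabsch/SVD solution). This is a minimal illustration of the alignment principle on toy points, not MeshLab's actual implementation:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Return the rotation R and translation t that best map the paired
    points src onto dst in the least-squares sense (Kabsch/SVD step)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Toy example: a second "scan" that is the first rotated 90 degrees about z
# and shifted, standing in for two overlapping captures of the lab.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
a = np.pi / 2
Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
dst = src @ Rz.T + np.array([2.0, 0.5, 0.0])

R, t = best_rigid_transform(src, dst)
aligned = src @ R.T + t
print(np.abs(aligned - dst).max())            # near zero: the scans are "glued"
```

In full ICP the correspondences are not known in advance: the closest points between the two meshes are matched, this transform is solved, and the two steps are repeated until the alignment converges, which is why a few manually picked point pairs (the "glue" points) give the iteration a good starting pose.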

Data Reconstruction
In this section, the combined Structure sensor data are integrated with the room's layout. The reason for adding the Structure sensor data to the layout is that it gives the layout a degree of depth. Two methods were used for this section. The first was manually adjusting the distances so that the mesh fits the layout exactly, and then flattening the mesh. The second was the point-based glue method, the same method used for merging the Structure sensor data in the second part of data processing in Section 3.2.
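The manual fitting step amounts to matching the scanned mesh's footprint to the layout's footprint. A minimal sketch of that idea, scaling and translating vertices so their horizontal bounding box lands on the target cell, is shown below; the room dimensions used here are made up, not the measured ones:

```python
import numpy as np

def fit_to_layout(vertices, layout_min, layout_max):
    """Uniformly scale and translate mesh vertices so their horizontal
    (x-y) bounding box matches the layout footprint."""
    v = np.asarray(vertices, float)
    mesh_min, mesh_max = v.min(axis=0), v.max(axis=0)
    # Uniform scale from the tighter horizontal fit so proportions survive
    scale = min((layout_max - layout_min)[:2] / (mesh_max - mesh_min)[:2])
    v = (v - mesh_min) * scale + np.append(layout_min, 0.0)
    return v

# Hypothetical scan roughly 4 m x 3 m, placed into an 8 m x 6 m layout
# cell whose corner sits at (10, 5) on the floor plan.
verts = np.array([[0, 0, 0], [4, 0, 0], [4, 3, 0], [0, 3, 2.5]])
fitted = fit_to_layout(verts, np.array([10.0, 5.0]), np.array([18.0, 11.0]))
print(fitted.min(axis=0))   # the mesh now starts at the layout corner
```

In practice this was done by eye in the editing tools rather than numerically, but the same bounding-box reasoning explains why the lab scan, being only a partial capture, could not fill its layout.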

Augmented Reality Development
For the last section, the AR development, a mobile app called Assemblr was utilised to visualize the completed model on the AR platform. The first step was to upload the completed model to the Assemblr Studio software. Before uploading, the file must first be converted to .fbx format. For the conversion from .obj to .fbx, an open-source application named Blender was utilised. After the file was converted to .fbx format, the model was uploaded as a component into the library so that it could be used for visualization on the AR platform. Once the model had been uploaded, the next step was to create a project. Once the project was created, the model was chosen from the library and placed into the marker zone. The project could then be published. This can be done using either the Assemblr app or Assemblr Studio.

Results and Discussion
The process of visualizing the completed interior model on the AR platform was carried out for all three locations. Table 2 summarizes the overall data acquisition process, which consists of acquiring the 3D data and creating the layout of each room. As the table shows, the 3D data for the corridor and the lecturer's room were each already in one complete mesh, while for the Computer and Instrumentation Laboratory there were three separate 3D datasets, because the lab is bigger than the other two scenes.
Due to this, the lab data underwent an additional step in which all the data were merged into one mesh using the point-based glue method. Figure 5 shows the result of this process, in which all the acquired lab data are combined.

Figure 5. Combined 3D data

Figure 6 shows the overall results of the data reconstruction method for all scenes, where the data have been integrated with the layout of each room. A difference can be seen between these reconstructions: the 3D data do not fill the lab completely because the lab is bigger than the other interiors. Figure 7 shows the result where all the completed models have been visualized in the Augmented Reality medium. This process involved several difficulties, as the files needed to be converted to .fbx format before being added to the library so that they could be viewed on the AR platform. An online conversion website was tried first, but the converted file could not be uploaded to the library. Thus, an open-source application named Blender was used to convert the files to .fbx format. The Assemblr app was chosen as it is very user friendly, offering many commands, such as levitate, scale, move, and rotate, for utilising the AR features. The app's interface for visualizing an object in AR mode is also very user friendly. It also provides a game mode for navigating through the spaces using a console.

Conclusion
The use of a 3D sensor in AR development for interior modelling has been demonstrated in this project. The 3D data from the Structure sensor were collected in three locations: the corridor outside a lecturer's room, the lecturer's room, and a laboratory. This project also demonstrated the process of integrating each 3D dataset with its room layout, as well as the visualization of the entire model on an Augmented Reality platform. Finally, the 3D data from the Structure sensor could be successfully integrated into the model. However, to obtain a better result, the 3D data acquisition process can be improved in the future; proper planning, especially when collecting data representing a big interior, should be conducted. As can be seen from the results, the completed models for the lecturer's room and the corridor look better than the completed model of the laboratory, as the 3D data for those two rooms almost fill the entire layout nicely, while the 3D data for the lab cover only a portion of it. The visualization process in the Augmented Reality medium using the Assemblr app is very simple because of the variety of modes and tools that can be chosen to aid the process. In the future, a Virtual Reality platform could also be used as another medium to visualize the interior model.