Human-in-the-Loop Remotely Piloted Aerial Systems in Environmental Monitoring

Abstract. This paper describes the development of a flying robot system. The main objective is to merge professional backgrounds in three research directions: development of the aerial vehicle and localization, development of the tele-interaction framework and control system, and development of the image fusion system and photogrammetry. Block diagrams give a brief description of the systems and sub-systems of the proposed environmental monitoring system. The structure of a monitoring UAV adapted for hand launch is given.


I. INTRODUCTION
The construction and development of aerial vehicles, or flying robot systems with remote sensing, have become a vision of environmental monitoring for agricultural and military purposes, as well as for determining the condition of ice, pollution or territory. This new concept of environmental monitoring is superior to satellite imagery in terms of accessibility, flexibility, applicability and economy. Owing to these expected advantages, aerial vehicles have been extensively researched for their potential to carry out various missions. Recently, several development efforts have been launched for aerial vehicles to conduct tasks in battlefield, transportation, exploration and disaster-prevention settings. Although the aforementioned research can verify the applicability of aerial vehicle systems, most aerial vehicles lack the flexibility to accomplish tasks other than predefined ones. Moreover, the limitations of aerial vehicle performance and photographic technique also impede the development of such systems. Therefore, with the introduction of multi-vehicle systems, human intelligence, novel aerial vehicle design and real-time photogrammetry in outdoor environments, the efficiency and quality of monitoring can be improved significantly.
Unmanned Aerial Vehicles (UAVs; another commonly used title is Remotely Piloted Aerial Systems, RPAS) are vehicles without a pilot or crew aboard that can be utilized to accomplish tasks such as battle operations, risky missions or special assignments like redirecting a guided missile. In addition to applications on the battlefield, UAVs have also been extensively developed by several research groups around the world for environmental monitoring in daily life. Specially developed and equipped UAVs can perform important tasks in agriculture, including crop map development, vegetation monitoring, hyperspectral and multispectral image acquisition, precision agriculture implementation and a number of other tasks. The application of UAVs in agricultural monitoring is effective due to their capabilities: easy and fast mission design, reusability of the platform and cost-effectiveness. Furthermore, UAVs developed and specially equipped for oil monitoring are capable of environmental monitoring and water sampling. In this kind of application, the UAV is equipped with a special container, which makes it possible to take more samples of water during one flight. Other applications of UAVs include monitoring ice conditions in winter, border control and traffic analysis.

II. DEVELOPMENT OF AN ENVIRONMENTAL MONITORING SYSTEM BASED ON RPAS
The research topic of Remotely Piloted Aerial Systems (RPAS) is a significant and popular research area in the development of Unmanned Aerial Systems (UASs). Unlike general UAVs, which are deployed by following predefined trajectories and heights, RPAS are more flexible and usable in a variety of applications. Therefore, RPAS can be utilized in monitoring dynamic environments where the paths have to be changed frequently. For example, RPAS can be developed for highly dynamic military environments as well as for monitoring the more efficient disposal of natural resources (forests, water, land, etc.), economic entity surveillance, disaster area detection and infrastructure inspection. There is a need for environmental civil monitoring in the Baltic Sea Region: development of agriculture, state monitoring and ice monitoring in winter. As regards disasters caused by earthquakes and typhoons in Taiwan, such RPAS can be used as an important tool to either mitigate misfortunes or rapidly recover the disaster area. Terrestrial and marine areas, forestry inventory, forest fire fighting, forest protection, water level, water pollution and animal migration monitoring in the Baltic Sea Region and Taiwan can all benefit from the development of RPAS. However, RPAS with UAVs have not yet been utilized for the aforementioned applications extensively, effectively and regularly.
With the use of RPAS, the method of regulating and controlling a group of UAVs becomes significant for accomplishing the assigned mission more quickly and efficiently. Controlling a group of vehicles (aerial and/or ground vehicles) for keeping formation, following a trajectory, avoiding collisions and sensing unknown environments is a popular research topic. Enduring multi-vehicle systems augmented with human intelligence are foreseen as a key technique for enhancing the performance and efficiency of multi-agent systems. Hence, the development of an interface between a human operator and the UAVs in an RPAS will become a vital issue in enhancing RPAS for environmental monitoring. In the case of multiple UAVs, the problem of controlling flying paths, trajectories and heights would be extremely difficult. Although a control framework for a human operator to control a group of mobile robots was presented previously, the unstable behaviour of aerial vehicles and the 3D flying space make such a control system difficult to implement. In order to enhance RPAS with human intelligence, a novel system framework, control schemes, theoretical analyses and UAV-human interconnection in RPAS will be necessary [1].
The image acquiring system, image processing, ortho-image generation and creation of 3D images also play an important role in the use of UAVs for environmental monitoring, especially in large and dynamic outdoor environments. Flight image integration systems are required so that no part of the investigated area between adjacent images and routes is left uncaptured while the UAV flies over and images the determined area. When selecting the photogrammetric project parameters, it is necessary to take into consideration the weather conditions during the flight, especially the wind direction, since wind speed increases with height above the ground.
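The coverage requirement described above can be illustrated with a small calculation: given the ground footprint of one image, the forward and side overlap fractions determine the shutter spacing along a route and the spacing between adjacent routes. The overlap values below are typical photogrammetric choices, not figures from this project.

```python
def flight_line_spacing(footprint_along_m, footprint_across_m,
                        forward_overlap=0.75, side_overlap=0.60):
    """Distance between exposures along a route and between adjacent
    routes so that no ground area between images is left uncovered."""
    along = footprint_along_m * (1.0 - forward_overlap)
    across = footprint_across_m * (1.0 - side_overlap)
    return along, across

# e.g. a 180 m x 270 m image footprint with 75 % forward, 60 % side overlap
along, across = flight_line_spacing(180.0, 270.0)
print(f"exposure spacing: {along:.0f} m, route spacing: {across:.0f} m")
```

In practice the forward overlap is usually chosen larger than the side overlap, and both are increased in windy conditions, since gusts perturb the trajectory and would otherwise open gaps between adjacent images.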
The flight height can be obtained by selecting the required Ground Sample Distance (GSD), i.e. the pixel size on the terrain; it depends on the camera focal length, the camera sensor width and the area on the ground covered by one image. Based on the parameters of the photogrammetric project and on the values of the flight project, the UAV flight route can be marked using software (Mission Planner or other tools) during the flight or beforehand, by indicating the initial and final points of the planned flight route on the Google Earth map. The geodetic coordinates of the route points (longitude and latitude) and of the take-off and landing points then appear on the screen or, to be more exact, in the software tables [2], [3].
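The relationship between GSD, camera parameters and flight height stated above can be sketched as follows. The camera values used here are hypothetical examples, not parameters of the actual system.

```python
def flight_height(gsd_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Flight height (m) above ground that yields the requested GSD.

    From similar triangles: GSD = sensor_width * height / (focal_length * image_width),
    hence height = GSD * focal_length * image_width / sensor_width.
    (Millimetres cancel between focal length and sensor width.)
    """
    return gsd_m * focal_length_mm * image_width_px / sensor_width_mm

def ground_footprint(gsd_m, image_width_px, image_height_px):
    """Ground area (m x m) covered by a single image at the given GSD."""
    return gsd_m * image_width_px, gsd_m * image_height_px

# Hypothetical camera: 24 mm lens, 13.2 mm sensor width, 5472 x 3648 px,
# and a requested GSD of 5 cm/px.
h = flight_height(0.05, 24.0, 13.2, 5472)
w, l = ground_footprint(0.05, 5472, 3648)
print(f"flight height: {h:.0f} m, footprint: {w:.0f} m x {l:.0f} m")
```

A mission planner performs essentially this computation in reverse as well: once the operator picks a flight height, the resulting GSD and per-image footprint follow from the same relation.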
The main objective is to merge the professional backgrounds in three research directions:
- development of the aerial vehicle and localization (blue in Fig. 1);
- development of the tele-interaction framework and control system (green in Fig. 1);
- development of the image fusion system and photogrammetry (pink in Fig. 1).
The goal is to develop an intelligent human-in-the-loop remotely piloted aerial system for environmental monitoring.
The use of multiple remote sensing devices (aerial vehicles) is significant and necessary for providing high-efficiency and high-flexibility environmental monitoring. See Fig. 1 for the block diagram of the system. However, a group of multiple aerial vehicles is not easy to control by conventional methods because of its large number of degrees of freedom. In order to overcome this difficulty, this research will use human motion as the control input, because arbitrary points on the human body can be mapped to different aerial vehicle commands. By capturing the motion of the human operator, the bilateral control system can decompose the high-level command into individual control inputs for each monitoring aerial vehicle. These commands will be passed via the main control station to the supervisory aerial vehicle and then to the aerial monitoring vehicles in order to regulate their flying height, speed, direction and trajectory. Meanwhile, the image capturing device and photogrammetry system mounted on the monitoring aerial vehicles will collect and acquire environmental images and information remotely. This information, along with the aerial vehicle position and orientation obtained from the localization system, will be transmitted to the supervisory aerial vehicle. After the main control station receives all information from the group of monitoring aerial vehicles, the data will be provided to the post-processing and sensing fusion system. These environmental monitoring data are transmitted to the bilateral control system and the image-storage workstation to generate haptic information. The feedback devices will convey visual images to the human operator and force feedback to influence the human motion. The main objective of haptic feedback is to provide informative results that allow the human operator to make the next decision more wisely.
Moreover, force feedback can constrain human motion so that subsequent commands remain suitable for the aerial vehicles, which are generally limited in height and speed, thereby ensuring system stability.
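The constraint idea above can be sketched minimally: a commanded value is clamped to the vehicle's limits, and a spring-like force opposing the limit violation is returned to the haptic device. This is only an illustrative model under a simple linear-spring assumption; the function name, stiffness value and limits are hypothetical, not taken from the actual system.

```python
def constrain_command(cmd, lower, upper, stiffness=50.0):
    """Clamp a commanded value to the vehicle's limits and return a
    spring-like feedback force pushing the operator back inside them.

    Returns (clamped_command, feedback_force); the force is zero while
    the command stays inside [lower, upper].
    """
    clamped = min(max(cmd, lower), upper)
    force = stiffness * (clamped - cmd)  # opposes the constraint violation
    return clamped, force

# Hypothetical speed limits of 5..25 m/s for the monitoring vehicle.
print(constrain_command(30.0, 5.0, 25.0))  # operator commands too much speed
print(constrain_command(15.0, 5.0, 25.0))  # command within limits, no force
```

The operator thus feels a resistance that grows with how far the command exceeds the vehicle envelope, which is one common way haptic interfaces encode actuator limits.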
The relationship between the research directions is shown in Fig. 2. It is expected that the outcomes of this joint approach could provide a novel environmental monitoring system with the benefits of human intelligence, novel vehicle design and advanced photogrammetric technique. The research results will be potentially applicable to various kinds of outdoor monitoring, including tasks in large-scale and highly dynamic environments. In addition, it is also possible to use the developed system to replace costly satellite systems. It is expected that this research will make significant contributions to aerial vehicle systems for environmental monitoring, as well as providing novel schemes and demonstrations in various academic fields such as aerial vehicles, haptic devices, visual/force feedback, localization, photogrammetry, wireless communication and sensing fusion systems. Moreover, if the development of RPAS becomes well-established and more mature, environmental monitoring technologies will advance to a new level.

III. DEVELOPMENT OF THE UAV
The overall aim of this research direction is to develop a multifunctional unmanned aerial system for environmental monitoring. Its sub-tasks are to develop an environmental monitoring methodology through CAD modelling of an unmanned supervisory aerial vehicle and a ground station, and to develop an innovative unmanned monitoring vehicle. A complex of research is carried out in the process of modelling the proposed human-in-the-loop system. It includes the development of an innovative structure for both supervisory and monitoring vehicles, calculation of the main parameters and features of the UAVs, and optimization of such characteristics as aerodynamics, mass, strength and flight performance. Effective propulsion for both vehicles and technical documentation for their further manufacturing will be developed.
The developed aerial vehicle (see Fig. 3) is adapted for belly-landing. The airplane consists of a fuselage, a wing, a tail, a motor with a propeller and an under-fuselage payload container adapted for carrying surveillance equipment. The payload container includes a front module and an aft module, both coupled to the fuselage and spaced apart along the fuselage axis, and a central module. The central module is rotatably attached to the front and aft modules [4]. Such a configuration of the payload container provides an improved field of view for the surveillance equipment installed in the central module. With the rotatable central module, protection of the camera or other surveillance equipment does not depend on the size of the window or opening in the module; therefore, surveillance equipment that requires a large opening in the container wall may be used.
The aerial vehicle has the following advantages when used for environmental monitoring [5]:
- The front module comprises a navigation video camera; having a dedicated video camera for navigation purposes allows for separate optimisation of the airplane control system and of various payloads.
- The front module and the aft module are installed on pylons attached to the fuselage. Installing the payload container at some distance from the fuselage reduces or even eliminates the shadowing of the navigation camera by the airplane propeller, and also reduces the requirements for aircraft levelling during landing.
- The under-fuselage payload container is placed so that the geometrical centre of the central module is under the airplane's centre of gravity during horizontal flight. Such placement of the payload container allows changing the payload weight without changing the airplane balance.
- The central module has an opening in its lower part, i.e. the side turned to the ground during horizontal flight. Depending on the surveillance equipment, the opening may be protected by an optically or radio transparent material.
- The central module may be adapted for carrying surveillance equipment; for example, the airplane may carry such equipment as a video camera, a photo camera and an infrared camera.
- The central module is adapted for rotation within at least a ±90° range. Such a range provides both a field of view large enough for the surveillance equipment and sufficient protection during landing.
- The central module may also be adapted for rotation within at least a ±180° range. The enlarged rotation range provides not only an enlarged field of view but also better protection of the surveillance equipment during landing, as the opening in the central module, which faces down when the surveillance equipment is in operation, is turned to face the fuselage during landing.
- The payload container has a spindle-shaped form with the container axis parallel to the fuselage axis, and the central module is rotatable around the container axis. The streamlined shape of the container reduces air resistance.
- The airplane is adapted for carrying different replaceable central modules with surveillance equipment. It is easier to replace the whole central module than the equipment inside it, and it is more economical to replace only the central module rather than the whole payload container, because the equipment common to all surveillance payloads (e.g. batteries, controllers, rotation actuators, navigation equipment) is installed in the front and aft modules, and only the specialised equipment such as cameras and sensors is replaced.
- The central module has a central position and a landing position. The central position is the position in which the surveillance equipment is targeting down during horizontal flight, while the landing position is turned away from the central position by at least 45°.
- The airplane is adapted for hand launch (Fig. 4). The structure is suitable for small airplanes launched and retrieved without any additional appliances, i.e. for airplanes that are hand launched by the operator, belly-landed on dry land and, after the replacement or recharging of their batteries, ready for the next flight.
The new capabilities proposed for UAVs under the human-in-the-loop environmental system complex also include [5]-[7]:
- silent flight, as fuel cells supplant internal combustion engines in some systems;
- endurance UAVs serving as GPS pseudo-satellites and airborne communications nodes to provide users with better connectivity, clearer reception and reduced vulnerability to jamming;
- more survivable UAVs through the use of advanced materials and novel composite materials;
- protection of UAVs with multifunctional coatings, with special attention to UAVs operating in highly corrosive marine environments;
- significantly speedier information availability to users through on-board real-time processing, higher data rates and covert transmission through the big data approach;
- special software for operation and control, developed and integrated into the human-in-the-loop system for each monitoring case.
On the basis of the supervisory and monitoring UAVs, a second research direction has been developed, proposing a novel human-UAV interaction system to remotely control a group of UAVs. Since the number of UAVs may differ between applications, the main issue is how the interaction of a large number of UAVs can be controlled by only one human operator. A motion capturing system will be designed and built to acquire human movement.
Therefore, arbitrary characteristic points on the human body can be defined to control the motion of the UAVs. The motion of the human operator will be utilized as a high-level command for the aggregated motion of the UAVs, while each individual UAV can slightly adjust its trajectory and height for other tasks such as collision avoidance. The acquired information will be conveyed to a ground station developed to control the supervisory and monitoring UAVs. The results of image processing will be transmitted to the human operator to provide real-time information about the monitored environment. In addition to the feedback images, a haptic device will be built to reflect force information to the human operator in order to enhance remote operation performance.
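The decomposition described above (one high-level command, per-vehicle formation and collision-avoidance corrections) can be sketched as follows. This is a minimal illustration under simple proportional-control and repulsive-field assumptions; all names, gains and separation distances are hypothetical, not parameters of the actual system.

```python
from dataclasses import dataclass

@dataclass
class UAV:
    x: float
    y: float
    z: float

def group_commands(hand_disp, uavs, targets, gain=2.0, min_sep=10.0):
    """Map an operator hand displacement (dx, dy, dz) to per-UAV velocity
    commands: a shared high-level velocity, a proportional formation-keeping
    term toward each UAV's desired position, and a simple repulsive
    correction when two UAVs come closer than min_sep (collision avoidance)."""
    vx, vy, vz = (gain * d for d in hand_disp)
    cmds = []
    for i, (uav, (tx, ty, tz)) in enumerate(zip(uavs, targets)):
        cx = vx + 0.5 * (tx - uav.x)
        cy = vy + 0.5 * (ty - uav.y)
        cz = vz + 0.5 * (tz - uav.z)
        for j, other in enumerate(uavs):
            if j == i:
                continue
            dx, dy = uav.x - other.x, uav.y - other.y
            dist = (dx * dx + dy * dy) ** 0.5
            if 0 < dist < min_sep:
                cx += dx / dist  # push away from the too-close neighbour
                cy += dy / dist
        cmds.append((cx, cy, cz))
    return cmds

# Two UAVs already at their formation positions: both simply receive
# the shared velocity induced by the operator's hand motion.
uavs = [UAV(0.0, 0.0, 50.0), UAV(100.0, 0.0, 50.0)]
targets = [(0.0, 0.0, 50.0), (100.0, 0.0, 50.0)]
print(group_commands((1.0, 0.0, 0.0), uavs, targets))
```

The point of the sketch is the structure of the mapping rather than the specific control law: the operator supplies one low-dimensional input, and the per-vehicle terms absorb the remaining degrees of freedom.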

IV. CONCLUSION
This research merges three different directions: development of the aerial vehicle and localization, development of the tele-interaction framework and control system, and development of the image fusion system and photogrammetry.
The main advantages of the UAV developed under the human-in-the-loop environmental system are silent flight, endurance, the ability to serve as a GPS pseudo-satellite, survivability through the use of advanced materials, novel composite materials and multifunctional coatings, and speedier information availability to users.
The research results will be applicable to various kinds of outdoor monitoring, including tasks in large-scale and highly dynamic environments. In addition, it is also possible to use the developed system to replace costly satellite systems.