Article

The Integration of Collaborative Robot Systems and Their Environmental Impacts

1 Faculty of Informatics, Titu Maiorescu University, 040051 Bucharest, Romania
2 Faculty of Finance-Banking, Accountancy and Business Administration, Titu Maiorescu University, 040051 Bucharest, Romania
* Author to whom correspondence should be addressed.
Processes 2020, 8(4), 494; https://doi.org/10.3390/pr8040494
Submission received: 16 December 2019 / Revised: 9 April 2020 / Accepted: 21 April 2020 / Published: 23 April 2020
(This article belongs to the Special Issue Neural Computation and Applications for Sustainable Energy Systems)

Abstract

Today, industrial robots are used in dangerous environments in all sectors, including the sustainable energy sector. Sensors and processors collect and transmit information and data from users as a result of the application of robot control systems and sensory feedback. This paper proposes that the estimation of a collaborative robot system’s performance can be achieved by evaluating the mobility of robots. Scenarios have been determined in which an autonomous system has been used for intervention in crisis situations due to fire. The experimental model consists of three autonomous vehicles, two of which are ground vehicles and the other is an aerial vehicle. The conclusion of the research described in this paper highlights the fact that the integration of robotic systems made up of autonomous vehicles working in unstructured environments is difficult and at present there is no unitary analytical model.

1. Introduction

The strategic concept of robotics has spread throughout the world and has stimulated government programs in the United States, China, Japan, South Korea, India, and many developing countries, each reflecting specific national outlooks [1,2,3,4]. The academic community is also increasingly involved in stimulating studies and research on the implementation of digital technologies in economic sectors [5,6,7].
On the other hand, disasters strike anywhere and cause numerous losses of life and property. The statistics presented in the 2016 World Disasters Report confirm the need to implement strategies to reduce the losses and the impact on people’s daily lives and socio-economic development. Fortunately, making emergency decisions using robots can be an optimal way to respond to or control these situations in order to protect both life and property. For example, unmanned ground vehicles and unmanned aerial vehicles can be used where the risk to response teams is high. Due to its important role in reducing losses and the impact of emergencies, robot collaboration has become an active research area in recent years [8].
Moreover, unmanned autonomous systems (UAS) represent one of the research challenges in the robotics and artificial intelligence domains. Autonomous path planning algorithms can be useful and effective using artificial intelligence, but challenges are created due to the randomness of the environment. This paper aims to highlight the results obtained through the collaboration of two types of autonomous vehicle, namely two unmanned ground vehicles (UGVs) [9] and an unmanned aerial vehicle (UAV). The sensor systems of the two types of drone interact and communicate with each other through Raspberry Pi III controllers.
The concern for the development of these systems was generated by the consequences of the aging infrastructure of industrial complexes, especially petrochemical plants. This has made the incidence of fires in these areas more and more frequent. The effects of a fire are devastating, and human intervention is extremely difficult, since the lives of emergency personnel are endangered.
Algorithms of displacement and coordination between robots consider the distribution of industrial installations over very large areas, unstructured work environments, soil and air, random weather disturbances (temperature, pressure, humidity, and air flow), the constantly changing legislation of public airspaces, and the wireless power supply of UGVs (if they perform recognition tasks).
The performance of UAVs is conditioned by the power supply and the embedded payload. Three main types of target searching method appear in these studies: image processing-based, signal-based, and probabilistic target searching methods.
So far, the use of UASs has been insufficient to address varied search and rescue situations. Emergency interventions have become a daily reality. Algorithms of perception, cognition, decision making, and communication between robots aim to reduce planning/replanning time [10,11].
To improve the performance of autonomous robots, artificial intelligence makes it possible to introduce automated planning [12,13]. Work environments are unstructured and introduce many unpredictable events, so planning must remain independent of the particular field. Even though planned routes may be geometrically identical, they differ because their characteristics are different.
The studies performed in [14,15] show that there is a high degree of interest in UAV-UGV systems that combine various techniques and approaches to execute specific unmanned tasks. For example, a UAV can autonomously follow a UGV using an image processing algorithm. The aerial images that are provided can help with trajectory planning in rough environments. In this way, the operator only has to drive the ground vehicle, while the quadcopter flies over the operation area.
Other authors [16,17,18] have discussed the results of a recent demonstration of multiple UAVs and UGVs cooperating in a coordinated reconnaissance, surveillance, and target acquisition (RSTA) application. The vehicles were autonomously controlled by the onboard digital pheromone, responding to the needs of the automatic target recognition algorithms.
On top of that, a strategy has been proposed to coordinate groups of UGVs with one or more UAVs. UAVs can be utilized in one of two ways. In the first approach, the UAV guides a swarm of UGVs, controlling their overall formation. In the second approach, the UGVs guide the UAVs, controlling their formation [19].
This paper presents the results of efforts to build a collaborative robot system capable of executing complex disaster response and recovery tasks. The novelty of this research lies in the fact that it achieves the control, communication, and computation of UAVs and UGVs, and further integrates these heterogeneous systems into a real platform. The aim of this study is to explore high-level task scheduling and mission planning algorithms that enable various types of robot to cooperate, utilizing each other’s strengths to yield a symbiotic robotic system. Therefore, this study demonstrates through simulations that search methods utilizing autonomous vehicles perform comparatively well and that the proposed algorithm outperforms the other scenarios considered.
This paper is structured as follows. In Section 2, we describe the specific UGVs and UAVs. Next, we present the overall control scheme, which consists of the task planning component, the ground control station (GCS) central controller, and the individual controllers. As the operator only has the role of an observer (intervening only to make corrections), the evaluation of the fulfilment of the missions represents one of the challenges. The robot system has the role of identifying and extinguishing fires and consists of two ground autonomous vehicles and one aerial autonomous vehicle. Investigation equipment (3D perception, cooperative awareness, mapping, deliberation, and navigation) as well as behavioral control operate on complex scenarios, first generated online and then extended by introducing random obstacles. The experimental results are presented in Section 3. Research that combines other collaboration tasks and approaches is discussed in Section 4. Finally, the paper is concluded in Section 5.

2. Methods

Existing studies [8,9,10,11] have neglected the fact that, in decision-making for interventions, the decision maker has to treat situations differently, using measures based on concrete information that mobile robots can make available.
The realization of a collaborative robot family involves developing a complex and integrated model based on the following:
  • Modularity through integration, coordination, evaluation, and optimization of heterogeneous subsystems (hardware, mobile platforms, actuators/grippers/tools, sustainable energy systems), real-time test/evaluation models, and algorithms for subsystems;
  • Cooperative navigation based on the evaluation and optimization of robot movements from the three work environments;
  • Coordination and synchronization of end-effector movement while performing system tests (as a whole), streamlining work and navigation paths via the structural integration of components, effectors, and analytical models;
  • Engaging robotized subsystems (terrestrial/air/underwater robots) in a collaborative/collective/cooperative way, intuitively and safely in the sense of adaptability to identify the obstacles and orientation uncertainties introduced by the initial algorithms;
  • Assimilation of instructions and the updating of working states by performing data interpretation from sensors and comparing the sensor data with the data stored in the assigned software libraries;
  • Intuitive plug-and-play systems that consider the fact that the subsystems that form the assembly are heterogeneous;
  • Open-source software.

2.1. UGV, the Terrestrial Component of a UAS

Path programming is done by introducing an algorithm with both an ideal path and a real path that takes into account the continuously measured values [9]. In the unstructured (terrestrial) environment, the analytical approach is based on the Bekker equations [20], according to the Coulomb–Mohr soil failure criteria. It is aimed at determining the following:
  • The soil friction coefficient μ_terrain, based on the tractive force F, the weight of the vehicle G_a, the ground pressure p_av, the shear deformation modulus of the soil K, the slip s, and the length of the contact spot L_a:

    $$\mu_{terrain} = \frac{F}{G_a} = \left( \frac{c}{p_{av}} + \tan\phi \right)\left[ 1 - \frac{K}{s\,L_a}\left( 1 - e^{-s L_a / K} \right) \right]$$

  • The unitary shear stress τ, where s is the slip coefficient:

    $$\tau = \tau_{max}\left( 1 - e^{-s/K} \right) = \left( c + p \tan\phi \right)\left( 1 - e^{-s/K} \right)$$
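As a minimal numeric sketch, the two Bekker-type relations above can be evaluated directly. Variable names follow the notation of the paper; any parameter values passed in are illustrative, not measured soil data:

```python
import math

def soil_friction_coefficient(c, p_av, phi, K, s, L_a):
    """Friction coefficient mu_terrain = (c/p_av + tan(phi)) *
    [1 - K/(s*L_a) * (1 - exp(-s*L_a/K))], per the Bekker-type relation above."""
    x = s * L_a / K  # dimensionless slip-length parameter
    return (c / p_av + math.tan(phi)) * (1 - (1 / x) * (1 - math.exp(-x)))

def shear_stress(c, p, phi, K, s):
    """Unitary shear stress tau = (c + p*tan(phi)) * (1 - exp(-s/K))."""
    return (c + p * math.tan(phi)) * (1 - math.exp(-s / K))
```

For large slip, both expressions saturate: the friction coefficient approaches c/p_av + tan φ and the shear stress approaches the Coulomb–Mohr limit c + p tan φ.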
There are methods to determine friction coefficients and unitary shear stresses that have been developed by other authors [4,21].
The complexity of the phenomena occurring at the terrain–vehicle interface has enabled the development of empirical methods for the evaluation of vehicle mobility. For example, the NATO Reference Mobility Model (NRMM) software can determine the performance of a terrestrial robot on the move on any type of land in a global sense [9]. Through this performance measurement method, a very good prediction of the maximum permissible speed of terrestrial robots can be made for any specified geographic region in any environmental condition (humid season, precipitation, snow).
Each analytical model separately studies different characteristics of the UGV propulsion performance. Achieving a balance between the mobile robot’s capabilities to orient, plan, and estimate obstacle positions and the factors limiting its progression over terrain [22,23,24] must take into account that both the speed in rectilinear or turning motion and the driving autonomy are influenced by the unstructured character of the terrain, which could be sand, grass, concrete, etc. The conclusion of those studies is that an approach which corrects the coefficients following real-time measurements is much closer to reality. Thus, a predictive control model (PCM) can generate better planning for the paths to be followed.
The sensor system of the firefighting robot FFR-1-UTM (Firefighting Robot 1 from Titu Maiorescu University) consists of temperature, ultrasonic, proximity, infra-red distance measurement, triaxial accelerometer, GPS, gyroscope, air quality measurement, alcohol gas, liquefied petroleum gas (LPG), CO, CO2, CH4, hydrogen gas (H2), and weather station sensors (Figure 1).
The predictions regarding mobility over large areas require a stochastic approach, as the terrain profile, terrain and robot interaction, sensors [25,26], altitude, remote sensing, physical properties of the terrain, slope of the land, internal friction coefficients for soft soils, and friction angles for hard soils introduce several variables that generate uncertainties that can be seen during analysis with either analytical or numerical models [27].
The integration environment here is represented by an integration tool for open-architecture modeling processes. This AI model makes and implements decisions and shares standard algorithms and simulation codes through non-preferential interfaces, GIS tools, and Python, in line with the objectives of the research task group (RTG) and NATO’s research and technology organization (RTO) [26].
Planning deals with finding a sequence of actions to reach a target, starting from an initial state. The planner searches a tree structure over the state space by creating successor states and may even pass through intermediate states [28,29].
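Such a state-space search can be illustrated with a toy breadth-first planner over a grid; this is an illustrative stand-in, not the planner used in the paper:

```python
from collections import deque

def plan_path(start, goal, obstacles, width, height):
    """Breadth-first search over a grid state space: expands successor
    states layer by layer until the goal state is reached, then
    reconstructs the sequence of states back to the start."""
    frontier = deque([start])
    parent = {start: None}
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == goal:
            path = []
            node = goal
            while node is not None:  # walk back to the initial state
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in obstacles and (nx, ny) not in parent):
                parent[(nx, ny)] = (x, y)
                frontier.append((nx, ny))
    return None  # no sequence of actions reaches the target
```

A call such as `plan_path((0, 0), (2, 0), {(1, 0)}, 3, 3)` returns a detour around the blocked cell, passing through intermediate states exactly as described above.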
The determination of the runway requires, first and foremost, solving the uncertainties. For this, we use the Kriging estimation method, based on geostatistical information [29,30,31,32].
The Bekker–Wong model, a model of uncertainty calculation, uses the statistical data obtained by measuring the soil density rated cone index (RCI) [9]. This method is preferred because it allows the determination of the actual runway profile for all four wheels (each wheel crosses a structurally modified terrain profile following the passing of the front wheels) [33,34].
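The Kriging estimation mentioned above can be sketched as a simple-kriging interpolator; the exponential covariance model and its parameters are illustrative assumptions, not the geostatistical model used with the RCI data:

```python
import numpy as np

def simple_kriging(points, values, query, range_=1.0, sill=1.0):
    """Simple kriging with an assumed exponential covariance
    C(h) = sill * exp(-h / range_): solves for the kriging weights
    from the sample covariances and returns the estimate at `query`."""
    pts = np.asarray(points, dtype=float)
    vals = np.asarray(values, dtype=float)
    # Covariance matrix between sampled locations.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    C = sill * np.exp(-d / range_)
    # Covariance vector between samples and the query location.
    d0 = np.linalg.norm(pts - np.asarray(query, dtype=float), axis=-1)
    c0 = sill * np.exp(-d0 / range_)
    w = np.linalg.solve(C, c0)   # kriging weights
    return float(w @ vals)       # estimated value at the query point
```

At a sampled location the estimator reproduces the measured value exactly, which is the defining property of kriging as an exact interpolator.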
Programming collaborative terrestrial robots involves exploratory behavior [35,36]. Avoiding local minima supposes multiple approaches, including geographic information systems (GISs), soil geometry, the kinematics and dynamics of the robot, power management, the global system for mobile communications (GSM), and electro-magnetic protection.

2.2. UAVs, the Aerial Component of a UAS

The use of UAVs for the development of collaborative robot systems has become a necessity due to the complexity and diversification of fire hazards.
Terrestrial robots need to be guided to intervene autonomously in spaces about which they do not have sufficient information. This can be effectively accomplished by combining their own information with that of UAVs. Also, the implementation of autonomous/semi-autonomous dual systems allows specialized intervention team operators to take control of UASs.
For flight planning and control, the UAV’s on-board controller uses its on-board sensors to estimate the position and orientation, along with setting payload parameters. For the experimental model HEXA-01-UTM (Hexacopter Robot 01 from Titu Maiorescu University), shown in Figure 2, the open-source Arduino IDE (integrated development environment) and Raspbian software were implemented.
The two software packages control motor drivers, encoders, the GPS, GSM, payload, radio transponder, accelerometer, gyroscope, etc. The algorithms for flight planning convert the mission objectives and act on acceleration, kinematics, and aerodynamics to obtain the command set for the trajectory using the feedback kinematic state.
The quantification of UAV mission accomplishment [37] allows the evaluation of robot performance, either separately or within the UAS to which it belongs. According to AGARD-AR-343 (Advisory Group for Aerospace Research and Development) [38], this can be done via analysis of the payload function and the command and control function.
As energy autonomy is essential to deliver missions (2700 mAh at 22.2 V), the HIRRUS V1 payload model implemented on the HEXA-01 UTM chassis was the best solution.
This payload has built-in electro-optical/infrared (EO/IR) cameras, video tracking, a passive ultra-high frequency radio-frequency identification (UHF-RFID) low-cost transponder (868 or 915 MHz), a stabilized gimbal, a transmitter for HD cameras, a GoPro camera using coded orthogonal frequency-division multiplexing (COFDM) for transmission, and a roadrunner on-orbit processing experiment (ROPE) system for processing and compressing JPEG data and transmitting them in real time to the GCS.
According to [39], the measured parameters influence flight and, implicitly, the values help to identify obstacles and the environment.
The methodology considers autonomy levels (AL) and technology readiness levels (TRL) defined by the National Institute of Standards and Technology (NIST). The GPS sensors, the accelerometer, and the gyroscope allow navigation by decomposing the planned route into points. HEXA-01 UTM, developed as an experimental model, in terms of system integration and operational reliability, reaches AL 4 and TRL 5.
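The decomposition of a planned route into navigation points can be sketched as follows; planar coordinates and the maximum-spacing parameter are simplifying assumptions for illustration:

```python
import math

def decompose_route(route, step):
    """Insert intermediate points along each leg of a planned route
    (a list of 2D points) so that consecutive navigation points are
    at most `step` apart, as in the route decomposition described above."""
    points = [route[0]]
    for (x0, y0), (x1, y1) in zip(route, route[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        n = max(1, math.ceil(dist / step))  # number of sub-segments
        for i in range(1, n + 1):
            t = i / n  # linear interpolation along the leg
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return points
```

For example, a 10 m straight leg with a 2.5 m spacing yields four evenly spaced intermediate points ending exactly at the leg’s endpoint.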
To determine the UAV precision, both static and dynamic data are analyzed; combining the two types of data improves the optimization. An analytical model [40], which determines the position of the UAV as well as those of objects or obstacles, uses the following equations, specific to the accelerometer, gyroscope, and magnetometer:

$$y_a = S_a \cdot N_a \cdot s_a + b_a + \varepsilon_a$$

where $y_a$ is the sensor-measured acceleration, $S_a$ is a linear scale factor, $N_a$ is the non-orthogonality of the axes, $s_a$ is the corrected real acceleration, $b_a$ is the sensor bias, and $\varepsilon_a$ is the accelerometer noise.

$$y_\omega = S_\omega \cdot N_\omega \cdot R_\omega \cdot s_\omega + b_\omega + G_\omega \cdot s_a + \varepsilon_\omega$$

where $y_\omega$ is the measured angular rate, $S_\omega$ is a scale factor, $N_\omega$ is the non-orthogonality of the axes, $R_\omega$ is the error due to the geometric position of the three sensors, $s_\omega$ is the corrected real angular rate, $b_\omega$ is the sensor bias, $s_a$ is the real acceleration, $G_\omega$ is the sensor’s sensitivity to gravitational acceleration, and $\varepsilon_\omega$ is the gyroscope noise.

$$y_m = D_m \cdot s_m + o_m + \varepsilon_m$$

where $y_m$ is the measured magnetic field, $D_m$ is the soft-iron distortion [41], which depends on the orientation of the material relative to the sensor and the magnetic field, $s_m$ is the real magnetic field, $o_m$ is the hard-iron distortion, and $\varepsilon_m$ is the magnetometer noise; $D_m$ and $o_m$ include manufacturing defects, scaling factors, non-orthogonal axes, and the relative positions of the three sensors.
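Assuming the calibration terms are known and noise is neglected, the accelerometer model above can be inverted to recover the corrected acceleration. The matrices in any call are illustrative, not calibration values from the paper:

```python
import numpy as np

def correct_acceleration(y_a, S_a, N_a, b_a):
    """Invert the measurement model y_a = S_a @ N_a @ s_a + b_a
    (noise epsilon_a neglected) to recover the corrected acceleration s_a."""
    return np.linalg.solve(S_a @ N_a, np.asarray(y_a) - np.asarray(b_a))
```

Round-tripping a known acceleration through the forward model and this inverse recovers the original vector, which is a quick sanity check on any calibration pipeline built this way.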

3. Results

The payload shown in Figure 3 is retractable, i.e., the EO/IR camera can be drawn in. In addition, three brushless motors change the pointing of the camera according to the mission.
Atmospheric instabilities and those due to flight adjustment require a system of correction for image capturing and the attachment of detected obstacles. The existing AI techniques for autonomous robots, analyzed in [37], require data fusion techniques and data extraction procedures to perform data interpretation and diagnostics. The payload functional architecture must also deal with internal and external thermomechanical influences.
The retraction subsystem works in a closed loop. The coefficients of its differential equations are time-varying, so the analytical and numerical model is extremely complex, because the relationships between the flux, the induced voltage, and the currents change continuously while the electrical circuit is in relative motion. The movement is highlighted by a simulation of its natural vibration modes [42] in Figure 4.
For fire identification and extinction, the autonomous collaborative robot system will operate according to the following algorithm:
  • The hexacopter rises and takes a tour to detect and locate the fire;
  • The mini rover receives the information (wireless) and moves to the coordinates ( x i , y i , z i ) where the fire was found;
  • The predefined route is a 3D map (Figure 5) of a randomly chosen location;
  • Commence movement to the defined target point, continuously calculating the path and indicating its position with respect to the reference system defined as an origin;
  • Unknown objects are scanned ultrasonically and via the IR camera;
  • At the target, the GCS processes data in real time and generates a new map with an optimized route;
  • From this moment, FFR-1-UTM moves to the fire that has been identified using its orthogonal coordinates;
  • For feedback, FFR-1-UTM is equipped with a video camera.
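The steps above can be sketched as an orchestration loop; all function names here are hypothetical placeholders for the hardware-facing actions (UAV fire detection, rover motion, GCS replanning, extinguishing), injected as callables:

```python
def run_mission(detect_fire, move_rover, replan, extinguish):
    """Orchestration sketch of the fire identification and extinction
    algorithm: UAV survey, GCS route optimization, UGV motion, extinction.
    Each callable stands in for the corresponding subsystem."""
    fire_xyz = detect_fire()      # hexacopter tour locates the fire (x, y, z)
    if fire_xyz is None:
        return "no fire detected"
    route = replan(fire_xyz)      # GCS generates an optimized route in real time
    for waypoint in route:        # rover follows the route toward the target
        move_rover(waypoint)
    extinguish(fire_xyz)          # FFR-1-UTM acts at the fire coordinates
    return "fire extinguished"
```

In the experimental platform these callables would wrap the wireless link between HEXA-01-UTM, the GCS, and FFR-1-UTM; here they simply make the control flow of the algorithm explicit.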
The conclusions following the simulations and software testing are as follows: the command processes are distributed and involve the transformation from three phases into two phases [35,36] and the conversion from stator values to a rotor reference frame. As with three-phase asynchronous machines, these processes are described by voltage and current equations.

4. Discussion

In the case of emergencies, the location of access points has become an important research issue, given the impact of using different measures due to limited resources and their dynamic evolution [8,43,44,45].
The collaboration between autonomous mobile robots, conducted at the experimental model level, ended with the extinguishing of a small fire, shown in Figure 6 (a simulated fire fueled by paper). The UAS presented here is a limited one, so several issues remain to be investigated in the future. We will seek to include artificial intelligence techniques in the functional domain of robots that move in two different unstructured environments.
This research proposes an important application which proves that mobile robots can collaborate effectively in real-world emergencies. Additionally, by locating the access point in a timely manner, robots can help alleviate and reduce various losses and damage (e.g., to life, property, and the environment) caused by fires.
Until now, the solutions proposed in this area have been based more on theoretical hypotheses or computer simulations to demonstrate the effectiveness of a collaborative robot system [44,45,46,47]. On top of that, unmanned ground vehicles (UGVs) can be deployed to accurately locate ground targets and detect humans, fires, gases, etc., but they have the disadvantage of not being able to move rapidly or see through such obstacles as buildings or fences [19].

5. Conclusions

In recent years, UAVs have provided additional degrees of freedom for UGVs, enabling obstacles to be negotiated simply by lifting the UGVs across them. Missions including intelligence, surveillance, and reconnaissance are among the most investigated and applied types of UAV-UGV collaboration system. Researchers can use virtual reality programs to develop and design UAV-UGV collaborative systems, including multi-robot communication and artificial intelligence systems.
In this study, we intended to evaluate the performance of a collaborative robot system and its use to identify and extinguish fires. According to the general control scheme, this system was also equipped with sensors for route planning and obstacle avoidance. This has enabled the location of the intervention to be identified in an efficient, rapid, and accurate manner. Point setting mechanisms have been experimentally tested with two UGVs and one UAV. This has made it possible to estimate the extent of the missions.
Extending these ideas, the equipment used to develop the experimental models allowed the AL 4 and TRL 5 levels to be reached, which created some problems in obtaining minimum data to initiate a simulation with NRMM I/II.
In our research, the UAV design requires structural changes: according to simulations, an octocopter is more stable and offers a payload-to-energy-reserve ratio approximately 20% higher. Equipped with return-to-base functionality, a parachute, and a flight termination system, an octocopter UAV is extremely safe, especially as it is able to continue flying with up to two engines out of use (as long as they are not on the same arm). It can be flown in automatic, GPS, or manual modes, with the pilot able to intervene at any time if necessary.
These changes lead us to believe that delays in decision-making will be diminished, so that the collaborative work of the robots will be effective. Most solutions on the market consist of a single robot; our model, composed of three autonomous vehicles, will greatly help to save the lives of people as well as those of the defense personnel and rescue teams involved in these missions, while also saving considerable time.

Author Contributions

Conceptualization, L.S.G., I.P. and D.J.; methodology, L.S.G.; software, I.P. and D.J.; validation, L.S.G., I.P., D.J. and I.O.; formal analysis, I.O., I.P. and D.J.; investigation, L.S.G., I.P. and D.J.; writing—original draft preparation, I.O. and L.S.G.; writing—review and editing, I.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Awad, F.; Naserllah, M.; Omar, A.; Abu-Hantash, A.; Al-Taj, A. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm. Sensors 2018, 18, 407. [Google Scholar] [CrossRef] [Green Version]
  2. Franchi, A.; Freda, L.; Oriolo, G.; Vendittelli, M. The sensor-based random graph method for cooperative robot exploration. IEEE/ASME Trans. Mechatron. 2009, 14, 163–175. [Google Scholar] [CrossRef]
  3. Jin, J.; Chung, W. Obstacle Avoidance of Two-Wheel Differential Robots Considering the Uncertainty of Robot Motion on the Basis of Encoder Odometry Information. Sensors 2019, 19, 289. [Google Scholar] [CrossRef] [Green Version]
  4. Laughery, S.; Gerhart, G.; Muench, P. Evaluating Vehicle Mobility Using Bekker’s Equations; US Army TARDEC: Warren, MI, USA, 2000. [Google Scholar]
  5. Gramegna, T.; Cicirelli, G.; Attolico, G.; Distante, A. Automatic construction of 2D and 3D models during robot inspection. Ind. Robot Int. J. 2006, 33, 387–393. [Google Scholar] [CrossRef]
  6. Gualda, D.; Ureña, J.; García, J.C.; Lindo, A. Locally-Referenced Ultrasonic–LPS for Localization and Navigation. Sensors 2014, 14, 21750–21769. [Google Scholar] [CrossRef] [PubMed]
  7. Socas, R.; Dormido, R.; Dormido, S. New Control Paradigms for Resources Saving: An Approach for Mobile Robots Navigation. Sensors 2018, 18, 281. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Zhang, Z.-X.; Wang, L.; Wang, Y.-M. An Emergency Decision Making Method for Different Situation Response Based on Game Theory and Prospect Theory. Symmetry 2018, 10, 476. [Google Scholar] [CrossRef] [Green Version]
  9. Ciobotaru, T. Semi-Empiric Algorithm for Assessment of the Vehicle Mobility. Leonardo Electron. J. Pract. Technol. 2009, 8, 19–30. [Google Scholar]
  10. Hofmann, T.; Niemueller, T.; Lakemeyer, G. Initial Results on Generating Macro Actions from a Plan Database for Planning on Autonomous Mobile Robots. In Proceedings of the Twenty-Seventh International Conference on Automated Planning and Scheduling (ICAPS 2017), Pittsburgh, PA, USA, 18–23 June 2017; AAAI Press: Palo Alto, CA, USA, 2017. [Google Scholar]
  11. Koo, J.; Cha, H. Localizing WiFi access points using signal strength. IEEE Commun. Lett. 2011, 15, 187–189. [Google Scholar] [CrossRef]
  12. McCluskey, T.L.; Vaquero, T.; Vallati, M. Engineering Knowledge for Automated Planning: Towards a Notion of Quality. In Mobile Robotics; Nehmzow, U., Ed.; Springer Science & Business Media: Berlin, Germany, 2003. [Google Scholar]
  13. Amigoni, F. Experimental Evaluation of Some Exploration Strategies for Mobile Robots. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Nice, France, 22–26 September 2008; pp. 2818–2823. [Google Scholar]
  14. Waharte, S.; Trigoni, N. Supporting search and rescue operations with UAVs. In Proceedings of the 2010 International Conference on Emerging Security Technologies (EST 2010), Canterbury, UK, 6–7 September 2010; pp. 142–147. [Google Scholar]
  15. Giakoumidis, N.; Bak, J.U.; Gomez, J.V. Pilot-Scale Development of a UAV-UGV Hybrid with Air-Based UGV Path Planning. In Proceedings of the 10th International Conference on Frontiers of Information Technology, Islamabad, Pakistan, 17–19 December 2012; pp. 204–208. [Google Scholar]
  16. Symington, A.; Waharte, S.; Julier, S.; Trigoni, N. Probabilistic target detection by camera-equipped UAVs. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation (ICRA 2010), Anchorage, AK, USA, 3–8 May 2010; pp. 4076–4082. [Google Scholar]
  17. Saska, M.; Vonasek, V.; Krajnik, T.; Preucil, L. Coordination and navigation of heterogeneous UAVs-UGVs teams localized by a hawk-eye approach. In Proceedings of the International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 2166–2171. [Google Scholar]
  18. Wang, X.; Zhu, H.; Zhang, D.; Zhou, D.; Wang, X. Vision-based detection and tracking of a mobile ground target using a fixed-wing UAV. Int. J. Adv. Robot. Syst. 2014, 11, 156. [Google Scholar] [CrossRef]
  19. Hui, C.; Yousheng, C.; Xiaokun, L.; Shing, W.W. Autonomous Takeoff, Tracking and Landing of a UAV on a Moving UGV Using Onboard Monocular Vision. In Proceedings of the 32nd Chinese Control Conference (CCC), Xi’an, China, 26–28 July 2013; pp. 5895–5901. [Google Scholar]
  20. Bekker, M.G. Introduction to Terrain-Vehicle Systems; The University of Michigan Press: Ann Arbor, MI, USA, 1969. [Google Scholar]
  21. Wong, J.Y. Theory of Ground Vehicles, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 1993. [Google Scholar]
  22. Ojeda, L.; Borenstein, J.; Witus, G. Terrain Trafficability Characterization with a Mobile Robot. In Proceedings of the SPIE Defense and Security Conference, Unmanned Ground Vehicle Technology VII, Orlando, FL, USA, 28 March–1 April 2005; Dept. of Mechanical Engineering the University of Michigan: Ann Arbor, MI, USA, 2005. [Google Scholar]
  23. Bedaka, A.K.; Mahmoud, A.M.; Lee, S.-C.; Lin, C.-Y. Autonomous Robot-Guided Inspection System Based on Offline Programming and RGB-D Model. Sensors 2018, 18, 4008. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Kim, E.; Choi, S.; Oh, S. Structured Kernel Subspace Learning for Autonomous Robot Navigation. Sensors 2018, 18, 582. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Cepeda, J.S.; Chaimowicz, L.; Soto, R.; Gordillo, J.L.; Alanís-Reyes, E.A.; Carrillo-Arce, L.C. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration. Sensors 2012, 12, 12772–12797. [Google Scholar] [CrossRef] [Green Version]
  26. Besada-Portas, E.; Lopez-Orozco, J.A.; Lanillos, P.; De la Cruz, J.M. Localization of Non-Linearly Modeled Autonomous Mobile Robots Using Out-of-Sequence Measurements. Sensors 2012, 12, 2487–2518. [Google Scholar] [CrossRef] [PubMed]
  27. Bradbury, M.; Dasch, J.; Gonzalez, R.; Hodges, H.; Jain, A.; Iagnemma, K.; Letherwood, M.; McCullough, M.; Priddy, J.; Wojtysiak, J.; et al. Next-Generation NATO Reference Mobility Model (NG-NRMM). In Final Report by NATO Exploratory Team ET-148; Dasch, J., Ed.; Alion Science and Technology, USA Paramsothy Jayakumar, US Army TARDEC: Warren, MI, USA, 2016. [Google Scholar]
  28. Hofmann, T.; Niemueller, T.; Claßen, J.; Lakemeyer, G. Continual Planning in Golog. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16), Phoenix, AZ, USA, 12–17 February 2016; AAAI Press: Pittsburgh, PA, USA, 2016. [Google Scholar]
  29. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots; MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
  30. Cressie, N.; Johannesson, G. Fixed Rank kriging for very large Spatial Data Sets. J. R. Stat. Soc. 2008, 70, 209–226. [Google Scholar] [CrossRef]
  31. Bhattacharjee, S.; Mitra, P.; Ghosh, S.M. Spatial Interpolation to Predict Missing Attributes in GIS Using Semantic Kriging. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4771–4780. [Google Scholar] [CrossRef]
  32. Nuță, I. Contributions to the Improvement Technology Development and Emergency Intervention. Ph.D. Thesis, Military Technical Academy, Bucharest, Romania, 2015. [Google Scholar]
  33. Smith, W.C. Modeling of Wheel-Soil Interaction for Small Ground Vehicles Operating on Granular Soil. Ph.D. Thesis, the University of Michigan, Ann Arbor, MI, USA, 2014. [Google Scholar]
  34. Balch, T.; Arkin, R. Avoiding the Past: A Simple but Effective Strategy for Reactive Navigation. In Proceedings of the 1993 IEEE International Conference on Robotics and Automation, San Francisco, CA, USA, 2–6 May 1993; Volume 1, pp. 678–685. [Google Scholar]
  35. Socas, R.; Dormido, S.; Dormido, R.; Fabregas, E. Event-based control strategy for mobile robots in wireless environments. Sensors 2015, 15, 30076–30092. [Google Scholar] [CrossRef] [Green Version]
  36. Eaton, C.M.; Chong, E.K.P.; Maciejewski, A.A. Multiple-Scenario Unmanned Aerial System Control: A Systems Engineering Approach and Review of Existing Control Methods. Aerospace 2016, 3, 1. [Google Scholar] [CrossRef] [Green Version]
  37. AGARD-AR-343. Available online: https://www.abbottaerospace.com/wpdm-package/agard-ar-343 (accessed on 22 April 2017).
  38. Vidyadharan, A.; Philpott, R., III; Kwasa, B.J.; Bloebaum, C.L. Analysis of Autonomous Unmanned Aerial Systems Based on Operational Scenarios Using Value Modelling. Drones 2017, 1, 5. [Google Scholar] [CrossRef] [Green Version]
  39. Chow, J.C.K.; Hol, J.D.; Luinge, H. Tightly-Coupled Joint User Self-Calibration of Accelerometers, Gyroscopes, and Magnetometers. Drones 2018, 2, 6. [Google Scholar] [CrossRef] [Green Version]
  40. Kollar, T.; Roy, N. Trajectory optimization using reinforcement learning for map exploration. Int. J. Robot. Res. 2008, 27, 175–196. [Google Scholar] [CrossRef]
  41. Texas Instruments. Clarke and Park Transforms in the Field Orientated Control (FOC), Clarke & Park Transforms on the TMS320C2xx Application Report Literature Number: BPRA048. 1997. Available online: http://www.ti.com/lit/an/bpra048/bpra048.pdf (accessed on 15 April 2019).
  42. Analog Devices Inc. ADSP-21990: Reference Frame Conversions: Park, Inverse Park and Clarke, Inverse Clarke Transformations MSS Software Implementation User Guide. 2002. Available online: https://www.analog.com/media/en/technical-documentation/application-notes/Refframe.pdf (accessed on 15 April 2019).
  43. Stefan, A.; Stefan, A.; Constantin, D.; Mateescu, C.; Cartal, L.A. Aspects of kinematics and dynamics for Payload UAVs. In Proceedings of the 2015 7th International Conference on Electronics, Computers and Artificial Intelligence (ECAI), Bucharest, Romania, 25–27 June 2015. [Google Scholar] [CrossRef]
  44. Juliá, M.; Gil, A.; Reinoso, O. A comparison of path planning strategies for autonomous exploration and mapping of unknown environments. Auton. Robot. 2012, 33, 427. [Google Scholar] [CrossRef]
  45. Chou, C.-Y.; Juang, C.-F. Navigation of an Autonomous Wheeled Robot in Unknown Environments Based on Evolutionary Fuzzy Control. Inventions 2018, 3, 3. [Google Scholar] [CrossRef] [Green Version]
  46. Kashino, Z.; Nejat, G.; Benhabib, B. Aerial Wilderness Search and Rescue with Ground Support. J. Intell. Robot. Syst. 2019. [Google Scholar] [CrossRef]
  47. Gorostiza, E.M.; Lázaro Galilea, J.L.; Meca Meca, F.J.; Salido Monzú, D.; Espinosa Zapata, F.; Pallarés Puerto, L. Infrared Sensor System for Mobile-Robot Positioning in Intelligent Spaces. Sensors 2011, 11, 5416–5438. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Sensor systems of FFR-1 UTM (Firefighting Robot 1 from Titu Maiorescu University).
Figure 2. Unmanned aerial vehicle (UAV) model HEXA-01-UTM (Hexacopter Robot 01 from Titu Maiorescu University).
Figure 3. HIRRUS V1 payload.
Figure 4. Vibration modes of the HIRRUS V1 payload.
Figure 5. Unmanned autonomous system (UAS) 3D representation.
Figure 6. Unmanned ground vehicle (UGV) performing fire extinguishing testing, carried out at the Military Technical Academy yard on 26 October 2017, at the Patriot Fest contest.

Share and Cite

MDPI and ACS Style

Grigore, L.S.; Priescu, I.; Joita, D.; Oncioiu, I. The Integration of Collaborative Robot Systems and Their Environmental Impacts. Processes 2020, 8, 494. https://doi.org/10.3390/pr8040494

