Model-free visual servoing based on active disturbance rejection control and adaptive estimator for robotic manipulation without calibration

Jun Tian (School of Aerospace Engineering, Xiamen University, Xiamen, China)
Xungao Zhong (School of Electrical Engineering and Automation, Xiamen University of Technology, Xiamen, China and Xiamen Key Laboratory of Frontier Electric Power Equipment and Intelligent Control, Xiamen, China)
Xiafu Peng (School of Aerospace Engineering, Xiamen University, Xiamen, China)
Huosheng Hu (School of Computer Science and Electronic Engineering, University of Essex, Colchester, UK)
Qiang Liu (School of Engineering Mathematics and Technology, University of Bristol, Bristol, UK)

Industrial Robot

ISSN: 0143-991x

Article publication date: 21 May 2024

Abstract

Purpose

Visual feedback control is a promising solution for robots working in unstructured environments, and it relies on estimating the time-derivative relationship between the image features and the robot motion. Most visual servoing (VS) approaches suffer from drawbacks such as the computation of the vision–motor mapping and limited dynamic performance of the robot, so designing optimal and more effective VS systems remains challenging. The purpose of this paper is therefore to propose and evaluate a VS method for robots in an unstructured environment.
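
For context only (the abstract does not give the paper's own notation), the time-derivative relationship referred to here is conventionally written in image-based visual servoing as

$$\dot{\mathbf{s}} = \mathbf{L}_s\,\mathbf{v}_c$$

where $\mathbf{s}$ is the image-feature vector, $\mathbf{v}_c$ the camera (end-effector) velocity and $\mathbf{L}_s$ the interaction matrix (image Jacobian); uncalibrated, model-free approaches must estimate this mapping online rather than compute it from calibrated camera and depth parameters.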

Design/methodology/approach

This paper presents a new model-free VS control scheme for a robotic manipulator, in which an adaptive estimator aided by network learning performs online estimation of the vision–motor mapping relationship in an environment without knowledge of the statistical noise. Based on this adaptive estimator, a model-free VS scheme is constructed by introducing active disturbance rejection control (ADRC). In this scheme, the VS system is designed independently of the robot kinematic model.
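
The abstract gives no implementation details, so the following is only a minimal sketch of how such a scheme can be organised: a Broyden-style rank-1 update stands in for the adaptive vision–motor mapping estimator, and a first-order extended-state-observer ADRC channel acts on each feature error. All class names, gains and the toy plant are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumptions, not the authors' implementation):
# - MappingEstimator: Broyden-style online estimate of the vision-motor mapping.
# - ADRC1: first-order ADRC channel built around an extended state observer (ESO).
# - demo(): toy closed loop where a random linear map stands in for the robot/camera.
import numpy as np


class MappingEstimator:
    """Online estimate J of the vision-motor mapping, with ds ~= J @ dq."""

    def __init__(self, n_feat, n_joint):
        self.J = np.eye(n_feat, n_joint)  # rough initial guess, no calibration

    def update(self, ds, dq, eps=1e-9):
        ds = np.asarray(ds, dtype=float).ravel()
        dq = np.asarray(dq, dtype=float).ravel()
        # Broyden rank-1 correction driven by the prediction error (ds - J dq)
        self.J += np.outer(ds - self.J @ dq, dq) / (dq @ dq + eps)
        return self.J


class ADRC1:
    """First-order ADRC channel: the ESO lumps model error into a disturbance estimate."""

    def __init__(self, b0=1.0, kp=2.0, beta1=20.0, beta2=100.0):
        self.b0, self.kp, self.beta1, self.beta2 = b0, kp, beta1, beta2
        self.z1 = 0.0  # ESO estimate of the feature error
        self.z2 = 0.0  # ESO estimate of the lumped (total) disturbance

    def step(self, y, u_prev, dt):
        innov = y - self.z1
        self.z1 += dt * (self.z2 + self.b0 * u_prev + self.beta1 * innov)
        self.z2 += dt * self.beta2 * innov
        u0 = -self.kp * self.z1              # nominal proportional feedback
        return (u0 - self.z2) / self.b0      # disturbance-compensated command


def demo(steps=300, dt=0.02):
    n_feat, n_joint = 4, 3
    rng = np.random.default_rng(0)
    J_true = rng.normal(size=(n_feat, n_joint))   # unknown camera-robot mapping
    s_star = np.zeros(n_feat)                     # desired image features
    s = J_true @ rng.normal(size=n_joint)         # initial features (reachable offset)

    est = MappingEstimator(n_feat, n_joint)
    channels = [ADRC1() for _ in range(n_feat)]
    u = np.zeros(n_feat)

    # bootstrap the mapping with small exploratory joint motions, a common
    # trick in uncalibrated visual servoing
    for j in range(n_joint):
        dq_probe = np.zeros(n_joint)
        dq_probe[j] = 1e-3
        est.update(J_true @ dq_probe, dq_probe)

    for _ in range(steps):
        e = s - s_star
        # each ADRC channel outputs a desired feature-error velocity
        u = np.array([c.step(e[i], u[i], dt) for i, c in enumerate(channels)])
        dq = np.linalg.pinv(est.J) @ (u * dt)     # map to joint increments
        ds = J_true @ dq                          # stand-in plant response
        s = s + ds
        est.update(ds, dq)                        # keep refining the mapping online
    return float(np.linalg.norm(s - s_star))


if __name__ == "__main__":
    print("final feature-error norm:", demo())
```

The design choice this sketch illustrates is the one the abstract emphasises: the controller never uses the robot's kinematic model or camera calibration; model mismatch is absorbed by the online mapping estimate and by the ESO's lumped disturbance term.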

Findings

Various simulations and experiments were conducted to verify the proposed approach using an eye-in-hand robot manipulator without calibration or vision depth information; the approach improves the autonomous maneuverability of the robot and allows it to adapt its motion to image-feature changes in real time. With the proposed method, the image-feature trajectory remained stable within the camera field of view, and the robot's end-effector trajectory did not exhibit abrupt retreat. The results showed that the steady-state errors of the image features were within 19.74 pixels, the robot positioning was stable within 1.53 mm and 0.0373 rad, and the convergence time of the control system was less than 7.21 s in real grasping tasks.

Originality/value

Compared with traditional Kalman-filter-based image-based VS and position-based VS methods, this paper adopts a model-free VS method that combines the adaptive mapping estimator with the ADRC controller, which is effective for improving the dynamic performance of robot systems. The proposed model-free VS scheme is suitable for robotic grasping manipulation in unstructured environments.

Acknowledgements

Funding: National Natural Science Foundation of China (Grant No. 61703356); Natural Science Foundation of Fujian Province (Grant No. 2022J011256).

Citation

Tian, J., Zhong, X., Peng, X., Hu, H. and Liu, Q. (2024), "Model-free visual servoing based on active disturbance rejection control and adaptive estimator for robotic manipulation without calibration", Industrial Robot, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/IR-12-2023-0347

Publisher

Emerald Publishing Limited

Copyright © 2024, Emerald Publishing Limited
