
JRM Vol.35 No.6 pp. 1524-1531
doi: 10.20965/jrm.2023.p1524
(2023)

Development Report:

Application of Object Grasping Using Dual-Arm Autonomous Mobile Robot—Path Planning by Spline Curve and Object Recognition by YOLO—

Naoya Mukai, Masato Suzuki, Tomokazu Takahashi, Yasushi Mae, Yasuhiko Arai, and Seiji Aoyagi

Kansai University
3-3-35 Yamate-cho, Suita, Osaka 564-8680, Japan

Received:
June 14, 2023
Accepted:
October 26, 2023
Published:
December 20, 2023
Keywords:
ROS, YOLO, spline curve
Abstract

In the trash-collection challenge of the Nakanoshima Robot Challenge, an autonomous robot must collect trash (bottles, cans, and bento boxes) scattered in a defined area within a time limit. One approach is to recognize the objects with machine learning, move to the target location, and grasp them. An autonomous robot can reach the target position and posture by rotating on the spot at the starting point, moving in a straight line, and rotating on the spot again at the destination, but each rotation requires the robot to stop and restart. To achieve faster movement, we implemented smooth motion without these intermediate stops by planning the path as a spline curve. For object recognition, the training data previously generated by the authors in their laboratory proved insufficient: in the environment of the robot competition, where strong sunlight shines through glass, the large variation in brightness prevented the robot from recognizing objects correctly. To solve this problem, we generated additional training data and retrained YOLO, a deep-learning-based image-recognition algorithm, to achieve object recognition under various lighting conditions.
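The paper itself details the spline formulation; as an illustration only, the idea of replacing a rotate-move-rotate sequence with one smooth curve through waypoints can be sketched with a Catmull-Rom spline (one common family of interpolating cubic splines; the function and parameter names below are hypothetical, not the authors' implementation):

```python
# Sketch (assumption, not the paper's code): smoothing a mobile-robot path
# through (x, y) waypoints with a Catmull-Rom spline, which passes through
# every waypoint while keeping the path continuously curved.
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate the Catmull-Rom segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def smooth_path(waypoints, samples_per_segment=10):
    """Return a dense, smooth path through the given (x, y) waypoints."""
    # Duplicate the endpoints so every segment has four control points.
    pts = [waypoints[0]] + list(waypoints) + [waypoints[-1]]
    path = []
    for i in range(len(pts) - 3):
        for s in range(samples_per_segment):
            path.append(catmull_rom(pts[i], pts[i + 1], pts[i + 2],
                                    pts[i + 3], s / samples_per_segment))
    path.append(waypoints[-1])  # close the path exactly at the goal
    return path
```

A velocity controller can then track the dense sampled path directly, so the robot never has to stop and rotate in place between waypoints.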

Object grasping under sunlight shining

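The retraining step described in the abstract amounts to pointing YOLO at a dataset that mixes the original laboratory images with newly captured competition-like images. As a sketch only (the paper does not specify the YOLO version or dataset layout; all paths and class names below are assumptions), a dataset definition in the YAML format used by recent YOLO releases might look like:

```yaml
# Hypothetical dataset config: original lab images plus new images
# captured under strong, uneven sunlight. Paths are illustrative.
train:
  - datasets/lab_images/train
  - datasets/sunlight_images/train
val:
  - datasets/lab_images/val
  - datasets/sunlight_images/val
nc: 3                       # number of classes
names: [bottle, can, bento] # class labels from the competition task
```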

Cite this article as:
N. Mukai, M. Suzuki, T. Takahashi, Y. Mae, Y. Arai, and S. Aoyagi, “Application of Object Grasping Using Dual-Arm Autonomous Mobile Robot—Path Planning by Spline Curve and Object Recognition by YOLO—,” J. Robot. Mechatron., Vol.35 No.6, pp. 1524-1531, 2023.
References
  1. [1] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” 2016 IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), pp. 779-788, 2016. https://doi.org/10.1109/CVPR.2016.91
  2. [2] N. J. Nilsson, “A mobile automation: An application of artificial intelligence techniques,” Proc. of the 1st Int. Joint Conf. on Artificial Intelligence (IJCAI-69), pp. 509-520, 1969.
  3. [3] K. Komoriya, S. Tachi, and K. Tanie, “A method for autonomous guidance of mobile robots,” J. of the Robotics Society of Japan, Vol.2, No.3, pp. 222-231, 1984 (in Japanese). https://doi.org/10.7210/jrsj.2.222
  4. [4] T. Ohuchi et al., “Path control of a mobile robot vehicle using two circular arcs,” IEICE Trans. on Electronics D, Vol.J70-D, No.9, pp. 1742-1750, 1987.
  5. [5] Y. Kanayama and N. Miyake, “Trajectory Generation for Mobile Robots,” O. D. Faugeras (Ed.), “Robotics Research: The 3rd Int. Symp.,” pp. 334-340, MIT Press, 1986.
  6. [6] K. Komoriya and K. Tanie, “Trajectory control of a wheel-type mobile robot using B-spline curves,” J. of the Robotics Society of Japan, Vol.8, No.2, pp. 133-143, 1990 (in Japanese). https://doi.org/10.7210/jrsj.8.133
  7. [7] J. Miyata et al., “Trajectory tracking control of autonomous mobile robot using temporal spline approximation method,” IEEJ J. of Industry Applications D, Vol.123, No.7, pp. 778-783, 2003.
