
Image-based anti-interference robotic Chinese character writing system

Published online by Cambridge University Press: 22 January 2024

Xian Li
Affiliation: College of Automation Science and Engineering, South China University of Technology, Guangzhou, China

Chenguang Yang
Affiliation: College of Automation Science and Engineering, South China University of Technology, Guangzhou, China

Sheng Xu
Affiliation: Guangdong Provincial Key Lab of Robotics and Intelligent System, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China

Yongsheng Ou
Affiliation: Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China

Corresponding author: Chenguang Yang; Email: cyang@ieee.org

Abstract

This article presents a robotic Chinese character writing system that can resist random human interference. First, an innovative stroke extraction method for Chinese characters is devised: a basic stroke extraction method based on cumulative direction vectors extracts the components that make up each character's strokes, and these components are then stitched into complete strokes by a sequential basic stroke joining method. To enable the robot to imitate handwritten Chinese character skills, we utilise the extracted stroke information as demonstrations and model the skills using dynamic movement primitives (DMPs). To suppress random human interference, the system combines improved DMPs with admittance control to adjust the robot trajectory based on real-time visual measurements. Experimental results show that the proposed method accurately extracts the strokes of most Chinese characters. The designed trajectory adjustment method offers better smoothness and robustness than directly rotating and translating the curves, and the robot adjusts its posture and trajectory in real time to eliminate the negative impact of human interference.
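As background for readers, the following minimal Python sketch illustrates the kind of discrete dynamic movement primitive described in the abstract: it learns a forcing term from one demonstrated stroke coordinate and regenerates the trajectory towards a possibly shifted goal, which is the basic mechanism that permits on-line adjustment under interference. It follows the standard DMP formulation and is not the authors' implementation; the class name, gains and basis settings are all assumptions made for illustration.

import numpy as np

class DMP1D:
    def __init__(self, n_basis=30, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n_basis, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        # Basis function centres spread along the decaying canonical phase x
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        d = np.diff(self.c)
        self.h = 1.0 / np.concatenate([d, d[-1:]]) ** 2   # basis widths
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        # Gaussian basis activations at phase value x
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        # Learn forcing-term weights from one demonstrated stroke coordinate
        self.y0, self.g = y_demo[0], y_demo[-1]
        self.tau = dt * (len(y_demo) - 1)
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(len(y_demo)) * dt
        x = np.exp(-self.alpha_x * t / self.tau)           # canonical phase
        f_target = (self.tau ** 2 * ydd
                    - self.alpha * (self.beta * (self.g - y_demo) - self.tau * yd))
        s = x * (self.g - self.y0)
        psi = np.array([self._psi(xi) for xi in x])        # shape (T, n_basis)
        # Locally weighted regression, one weight per basis function
        self.w = (psi * (s * f_target)[:, None]).sum(0) / ((psi * (s ** 2)[:, None]).sum(0) + 1e-10)

    def rollout(self, dt, goal=None):
        # Reproduce the stroke; a shifted goal (e.g. from vision) bends the
        # trajectory smoothly towards the new end point
        g = self.g if goal is None else goal
        y, yd, x = self.y0, 0.0, 1.0
        n_steps = int(round(self.tau / dt))
        traj = np.empty(n_steps)
        for k in range(n_steps):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            ydd = (self.alpha * (self.beta * (g - y) - self.tau * yd) + f) / self.tau ** 2
            yd += ydd * dt
            y += yd * dt
            x += (-self.alpha_x * x / self.tau) * dt
            traj[k] = y
        return traj

# Example usage (values are made up): learn one coordinate of a stroke and
# replay it towards a goal displaced by interference.
dmp = DMP1D()
demo = np.linspace(0.0, 0.05, 200) + 0.005 * np.sin(np.linspace(0, np.pi, 200))
dmp.fit(demo, dt=0.01)
adjusted = dmp.rollout(dt=0.01, goal=0.06)

In a system of this kind, each stroke would typically be encoded with one such primitive per Cartesian coordinate, and the goal passed to rollout() would come from the real-time visual measurement of the (possibly displaced) writing surface.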

Type: Research Article
Copyright: © The Author(s), 2024. Published by Cambridge University Press

