Dynamic liquid volume estimation using optical tactile sensors and spiking neural network

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

Tactile sensing plays a unique role in robotic applications such as liquid volume estimation. Recent research shows that current computer vision methods for liquid volume estimation remain unsatisfactory in accuracy, and their performance depends on the environment, for example requiring high illumination or suffering from occlusions. To address these challenges, we design an optical tactile sensor for the task of liquid volume estimation. The sensor is built from four photoresistors and an LED on a printed circuit board, with an elastic dome fastened to a 3D-printed base by four screw sets. The tactile sensor is attached to a Robotiq gripper and applied to liquid volume estimation during movement along different trajectories. The experiments are designed around multiple trajectories that imitate human movements. We propose the tactile regression spiking neural network (TR-SNN) for liquid volume estimation. The TR-SNN uses a binary decoding method to decode the output spike trains of the network into liquid volume values, a novel and effective spiking neural network (SNN) decoding method for this task. The experimental results show that, compared with baseline models, the TR-SNN achieves state-of-the-art estimation performance with a coefficient of determination of up to 0.986.
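The abstract names two concrete technical pieces: a binary decoding step that turns output spike trains into liquid volume values, and the coefficient of determination (R²) used to score the estimates. Since the abstract does not spell out the decoding itself, the following is a minimal Python sketch under stated assumptions: each output neuron is treated as one bit, binarized by its firing rate over the simulation window, and the resulting integer code is scaled onto a volume range. The neuron count, the 0.5 rate threshold, and the `v_max` scaling are illustrative assumptions, not the authors' exact TR-SNN scheme.

```python
import numpy as np

def binary_decode(spike_trains, v_max=500.0):
    """Decode output spike trains into a liquid volume estimate.

    Sketch of binary spike-train decoding: each output neuron is one
    bit of a fixed-point code, counted as '1' if it fires in more than
    half of the time steps. The bit layout, threshold, and the `v_max`
    range (ml) are assumptions for illustration, not the paper's
    published decoder.

    spike_trains: binary array of shape (n_neurons, n_timesteps).
    """
    n_neurons, n_steps = spike_trains.shape
    # Firing rate of each output neuron over the window.
    rates = spike_trains.sum(axis=1) / n_steps
    bits = (rates > 0.5).astype(int)            # binarize each neuron
    # Interpret the bit vector as an unsigned integer (MSB first).
    code = int("".join(map(str, bits)), 2)
    # Map the integer code onto the volume range [0, v_max].
    return v_max * code / (2 ** n_neurons - 1)

def r_squared(y_true, y_pred):
    """Coefficient of determination, the metric reported above."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Example: 8 output neurons observed for 100 time steps.
rng = np.random.default_rng(0)
spikes = (rng.random((8, 100)) < 0.4).astype(int)
print(binary_decode(spikes))
```

Rate-based binarization like this keeps the decoder cheap and robust to single spurious spikes; the paper's actual decoder may instead binarize per time step or per spike count.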



Acknowledgements

This work was supported in part by the National Natural Science Foundation for Youth of China under Grant 62203428, in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2021A1515110486, in part by the Shenzhen Science and Technology Program under Grant RCBS20210706092252054, in part by the Science and Technology Innovation Commission of Shenzhen under Grant JSGGZD20220822095401004, and in part by the Postdoctoral Science Foundation of China under Grant 2021M703389.

Author information

Corresponding authors

Correspondence to Meng Yin or Zhengkun Yi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Huang, B., Fang, S., Yin, M. et al. Dynamic liquid volume estimation using optical tactile sensors and spiking neural network. Intel Serv Robotics 17, 345–355 (2024). https://doi.org/10.1007/s11370-023-00488-0
