Elsevier

Expert Systems with Applications

Volume 94, 15 March 2018, Pages 94-111

Material recognition using tactile sensing

https://doi.org/10.1016/j.eswa.2017.10.045

Highlights

  • A system that can recognise materials based on tactile sensing alone is presented.

  • Identification achieved by analysis of thermal conductivity and surface texture.

  • Evaluation of various machine learning algorithms and hybrid classifier structures.

  • The system outperforms humans in the task of material recognition.

Abstract

Identification of the material from which an object is made is of significant value for effective robotic grasping and manipulation. Characteristics of the material can be retrieved using different sensory modalities: vision-based, tactile-based or sound-based. Compressibility, surface texture and thermal properties can each be retrieved from physical contact with an object using tactile sensors. This paper presents a method for collecting data using a biomimetic fingertip in contact with various materials and then using these data to classify the materials both individually and into groups by type. Following data acquisition, principal component analysis (PCA) is used to extract features. These features are used to train seven different classifiers and hybrid structures of these classifiers for comparison. For all materials, the artificial systems were evaluated against each other and against human performance, and all were found to outperform the human participants’ average performance. These results highlight the sensitive nature of the BioTAC sensors and pave the way for research that requires a sensitive and accurate approach, such as vital signs monitoring using robotic systems.

Introduction

Humans can quickly gain information about the properties of an object or material simply by viewing it from different angles; for example, they can estimate how it might feel to touch or how heavy it may be. This is due to our highly sophisticated visual capabilities and our ability to adapt prior knowledge learned from similarly shaped, known objects. However, some properties are difficult to detect by vision alone, for example thermal conductivity, compressibility or the material from which an object is made. In order to learn these characteristics and remember them for future interactions, physical manipulation of an object is required. Distinguishing between objects and materials of different compressibility, temperature and texture can be achieved by performing complex manipulation tasks such as squeezing or rubbing. Humans inherently perform these manipulation tasks extremely well owing to our very sophisticated tactile perception.

In order to produce the tactual perceptions required to learn about object properties, humans perform various types of movements when interacting with an object. Experimental psychologists have identified six general types of exploratory movements: pressure to determine hardness, static contact to determine thermal properties, lateral sliding movements to determine surface texture, enclosure to determine global shape and volume, lifting to determine weight, and contour following to determine exact shape (Lederman & Klatzky, 1987). Humans can complete these exploratory movements very quickly and rapidly evaluate the object, leading to a possible identification.

One of the main applications of an artificial tactile sensing system is expected to be in robotics. In order to retrieve the necessary tactual perceptions, a robot must explore an object in a similar manner to a human. However, in comparison to humans, these exploratory movements can be slow for a robot and typically involve the evaluation of a great volume of acquired data, making them computationally expensive. It would be a useful skill for a robot to be able to complete a preliminary evaluation of an object by quickly identifying the physical nature of the material (e.g. metal, wood, plastic) through a small number of initial basic actions, removing the need for extensive manipulation. These initial basic actions could include the retrieval of thermal information upon first contact with a material, which may provide an indication of its physical nature, e.g. wood, metal or plastic. Furthermore, the texture of a material can be identified by sliding a robotic hand or finger along it, i.e. determining whether it is rough or smooth.

We use an artificial fingertip to acquire data relating to the thermal properties and surface texture of materials; these data are analysed first to identify the group to which the material belongs (e.g. wood, metal, plastic) and subsequently to determine the specific individual material (e.g. aluminium, copper, pine). Building on the work presented in Kerr, McGinnity, and Coleman (2013, 2014a) and Kerr, McGinnity, and Coleman (2014b), the contribution of this work is the implementation of six further classifiers, hybrid configurations of the classifiers and a comparison of their performances. The classifiers implemented and compared are a one-stage ANN (Kerr, McGinnity, & Coleman, 2014a), a two-stage ANN (Kerr et al., 2014b), a one-stage SVM, a two-stage SVM, a hybrid ANN and SVM approach, GMM, LDA, NB, k-NN and MLP. Utilisation of these techniques to classify materials from tactile data resulted in an increase in classification accuracy and system efficiency in comparison to previous work (Kerr et al., 2014b), producing an approach capable of a fast initial identification of the material group and/or the individual material. These classifiers are also evaluated against human performance when identifying the same test materials as reported in Kerr et al. (2014b). Therefore, in this paper we propose an approach for material classification using a combination of two of the three key characteristics, namely surface texture (represented through vibration data) and thermal characteristics. Although one weakness of the approach is that it does not achieve accuracies as high as those of Xu, Loeb, and Fishel (2013), its strengths include computational efficiency and high classification accuracies across 14 materials, some of which are very similar, unlike many methods presented in the literature where the test materials can be very different.
A range of common classifiers together with novel combinations of classifiers are evaluated and shown to be accurate and efficient algorithms for material classification. Furthermore, the system is tested with previously unlearned materials in order to assess its robustness.
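To make the pipeline concrete, the sketch below follows the same shape as the approach described above: PCA feature extraction followed by a comparison of several of the classifier families evaluated in the paper. It uses scikit-learn on synthetic stand-in data; the number of materials is taken from the paper, but the feature dimensions, PCA dimensionality and classifier hyperparameters here are illustrative assumptions, not the paper's actual data or settings.

```python
# Minimal sketch: PCA features + classifier comparison on SYNTHETIC data.
# The raw features stand in for the real tactile recordings (thermal decay
# and vibration statistics in the actual system), which are not given here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in: 14 "materials", 20 samples each, 50 raw features.
n_materials, n_samples, n_raw = 14, 20, 50
X = np.vstack([rng.normal(loc=i, scale=2.0, size=(n_samples, n_raw))
               for i in range(n_materials)])
y = np.repeat(np.arange(n_materials), n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Feature extraction with PCA, fitted on the training split only.
pca = PCA(n_components=10).fit(X_tr)
X_tr_p, X_te_p = pca.transform(X_tr), pca.transform(X_te)

# Compare several of the classifier families named in the text.
classifiers = {
    "SVM": SVC(kernel="rbf", C=10.0),
    "k-NN": KNeighborsClassifier(n_neighbors=3),
    "NB": GaussianNB(),
    "LDA": LinearDiscriminantAnalysis(),
}
scores = {name: clf.fit(X_tr_p, y_tr).score(X_te_p, y_te)
          for name, clf in classifiers.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```

Fitting PCA on the training split only, then transforming both splits, avoids leaking test statistics into the feature extractor; the comparison itself is then a like-for-like accuracy table across classifiers.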

The remainder of this paper is organised as follows: Section 2 presents an overview of related research in tactile sensing-based material classification. Section 3 describes the experimental set-up, including an overview of the equipment used and the learning algorithms used for the robot system. Section 4 presents an evaluation of the performance of the various artificial systems developed and compares their performance to that of human subjects. Section 5 discusses the artificial system and human participants’ performance. Conclusions and plans for future work are given in Section 6.

Section snippets

Background and related research

Three of the key characteristics that are critical in performing an efficient grasp on an object are texture, compressibility and thermal properties. Such characteristics can be obtained from tactile sensing. Materials from different classes may have similar compressibility; for example, neither acrylic nor medium density fibreboard (MDF) is very compressible, yet they belong to different classes of materials, i.e. plastic and wood. Drimus, Kootstra, Bilberg, and Kragic (2011) developed a sensor and …

Tactile sensor

The BioTAC fingertip is a tactile sensor that is shaped like a human fingertip and is filled with an incompressible liquid, giving it similar compliance to a human fingertip (Kerr, McGinnity, & Coleman, 2013; Lin, Erickson, Fishel, Wettels, & Loeb, 2009). The sensor is capable of detecting a full range of information: micro vibrations, forces and temperature, similar to a human finger. Fig. 1 shows a cross section view of the BioTAC fingertip.

There are various sensors on the fingertip that enable …

Results

In order to evaluate the artificial system’s performance, analysis comparing the algorithms was conducted. In addition, the artificial system was evaluated against human performance. In this section, the results of the human evaluation experiments are presented. The initial analysis of the extracted data collected from the fingertip during the experiments is also presented. Furthermore, the performances of the aforementioned classifiers are presented, evaluated against each other and against …

Discussion

From the results of the human evaluation carried out in Kerr et al. (2014a), it was found that the average accuracy was 79.76% when identifying the material groups and 69.64% when identifying individual materials. A detailed analysis of these findings can be seen in the confusion matrices in Figs. 11(b) and 12(b). It can be seen from Table 4 that all of the artificial systems presented have outperformed the human participants (when performance is averaged across all test materials), in some …

Conclusion and future work

A range of classifiers and combinations of classifiers that were implemented, tuned and evaluated for the classification of materials into their respective groups and the classification of individual materials has been presented. This classification was based on a combination of the thermal and surface texture properties of the materials. A two-stage SVM approach was shown to perform best and to be the most efficient method for completing the material classification task. It was found that …
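The two-stage idea summarised above can be sketched as follows: a first SVM predicts the material group, then a per-group SVM predicts the individual material within that group. This is a minimal illustration using scikit-learn on synthetic data; the material-to-group mapping, feature dimensions and hyperparameters are hypothetical placeholders, not the paper's implementation.

```python
# Minimal two-stage classification sketch on SYNTHETIC data:
# stage 1 predicts the material group, stage 2 the material within the group.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical grouping: material index -> group index (e.g. wood/metal/plastic).
groups = {0: 0, 1: 0, 2: 1, 3: 1, 4: 2, 5: 2}

# Synthetic features: each material drawn around its own mean.
X = np.vstack([rng.normal(loc=3 * m, scale=0.5, size=(30, 8)) for m in groups])
y_mat = np.repeat(list(groups), 30)
y_grp = np.array([groups[m] for m in y_mat])

# Stage 1: group classifier trained on all data.
stage1 = SVC(kernel="rbf").fit(X, y_grp)

# Stage 2: one material classifier per group, trained only on that group's samples.
stage2 = {}
for g in set(groups.values()):
    mask = y_grp == g
    stage2[g] = SVC(kernel="rbf").fit(X[mask], y_mat[mask])

def predict_material(x):
    g = stage1.predict(x.reshape(1, -1))[0]        # first pick the group
    return stage2[g].predict(x.reshape(1, -1))[0]  # then the material within it

pred = predict_material(X[0])
```

One appeal of this structure is that each second-stage classifier only has to separate the few materials within one group, a much easier decision boundary than a single flat classifier over all materials.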


References (30)

  • S. Lederman et al.

    Hand movements: A window into haptic object recognition

    Cognitive Psychology

    (1987)
  • T. Bhattacharjee et al.

    Data-driven thermal recognition of contact with people and objects

    2016 IEEE haptics symposium, HAPTICS 2016, Philadelphia, PA, USA, April 8–11, 2016

    (2016)
  • T. Bhattacharjee et al.

    Material recognition from heat transfer given varying initial conditions and short-duration contact

    Proceedings of robotics: Science and systems, Rome, Italy

    (2015)
  • C. Bishop

    Neural networks for pattern recognition

    (1995)
  • D. Chathuranga et al.

    Investigation of a biomimetic fingertip’s ability to discriminate fabrics based on surface textures

    Advanced intelligent mechatronics (AIM), 2013 IEEE/ASME international conference on

    (2013)
  • A. Dempster et al.

    Maximum likelihood from incomplete data via the EM algorithm

    Journal of the Royal Statistical Society, Series B

    (1977)
  • A. Drimus et al.

    Classification of rigid and deformable objects using a novel tactile sensor

    Advanced robotics (ICAR), 2011 15th international conference on

    (2011)
  • J. Fishel et al.

    Bayesian exploration for intelligent identification of textures

    Frontiers in Neurorobotics

    (2012)
  • M. Hall et al.

    The WEKA data mining software: An update

    SIGKDD Explorations

    (2009)
  • V. Ho et al.

    Experimental investigation of surface identification ability of a low-profile fabric tactile sensor

    Intelligent robots and systems (IROS), 2012 IEEE/RSJ international conference on

    (2012)
  • J. Hoelscher et al.

    Evaluation of tactile feature extraction for interactive object recognition

    Proceedings of 15th IEEE-RAS international conference on humanoid robots

    (2015)
  • N. Jamali et al.

    Material classification by tactile sensing using surface textures

    Robotics and automation (ICRA), 2010 IEEE international conference on

    (2010)
  • N. Jamali et al.

    Majority voting: Material classification by tactile sensing using surface texture

    Robotics, IEEE Transactions on

    (2011)
  • F. Jurie et al.

    Creating efficient codebooks for visual recognition

    Computer vision, 2005. ICCV 2005. Tenth IEEE international conference on

    (2005)
  • E. Kerr et al.

    Material classification based on thermal properties - a robot and human evaluation

    2013 IEEE international conference on robotics and biomimetics, December 12–14 2013, Shenzhen, China

    (2013)

    Emmett Kerr received a BEng (Hons.) in Mechanical and Manufacturing Engineering from Queens University Belfast, UK in 2005. He then worked in industry for the Smurfit–Kappa group as a project engineer, co-managing over €7.5 million in projects, and later as an engineering manager leading a department of 10 staff members. He then returned to research and received an MSc with distinction in Computing and Intelligent Systems from the University of Ulster, UK in 2012, where he was awarded the “8-over-8” industry award for best student in his MSc studies. He has recently successfully defended his PhD thesis in Robotics, also at the University of Ulster, UK. He is a lecturer and researcher in the Intelligent Systems Research Centre in the Faculty of Computing and Engineering, University of Ulster, Magee Campus. Emmett has worked on numerous research projects, including the IM-CLeVeR and VISUALISE FP7 EU projects. He is author or co-author of 14 research papers and has been involved in the writing of many research grant proposals.

    T.M. McGinnity (SMIEEE, FIET) received a First Class (Hons.) degree in Physics in 1975, and a Ph.D. degree from the University of Durham, UK in 1979. He is currently Pro-Vice Chancellor for Student Affairs and Head of the College of Science and Technology at Nottingham Trent University, UK. Formerly, he was Professor of Intelligent Systems Engineering and Director of the Intelligent Systems Research Centre in the Faculty of Computing and Engineering, University of Ulster. He is the author or co-author of over 300 research papers and has attracted over $25 million in research funding. His research interests are focused on computational intelligence, cognitive robotics and biological information processing.

    Sonya Coleman received a BSc (Hons.) in Mathematics, Statistics and Computing from the University of Ulster, UK in 1999, and a PhD in Mathematics from the University of Ulster, UK in 2003. She is a Professor of Vision Systems in the School of Computing and Intelligent Systems at the Ulster University, Magee, and Cognitive Robotics team leader within the Intelligent Systems Research Centre. Prof Coleman has 130+ publications, primarily in robotics, image processing, bio-inspired systems and computational finance. Funding from various sources such as EPSRC (EP/C006283/1), The Nuffield Foundation, The Leverhulme Trust and the EU has supported her research. She is currently PI of the Capital Markets Engineering Studentship programme and Co-I on the Capital Markets Collaborative Network project. She is currently Co-I on the EU FP7 SLANDAIL project and was Co-I on the recently completed EU FP7 projects VISUALISE and RUBICON. In 2009 she was awarded the Distinguished Research Fellowship by the Ulster University in recognition of her contribution to research.
