Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach, Tasbolat Taunyazov★, Hui Fang Koh, Yan Wu, Caixia Cai, Harold Soh★, International Conference on Robotics and Automation (ICRA), 2019
Abstract
The sense of touch is arguably the first human sense to develop. Empowering robots with the sense of touch may augment their understanding of the objects they interact with and of the environment beyond standard sensory modalities (e.g., vision). This paper investigates the effect of hybridizing touch and sliding movements for tactile-based texture classification. We develop three machine-learning methods within a framework to discriminate between surface textures; the first two methods use hand-engineered features, whilst the third leverages convolutional and recurrent neural network layers to learn feature representations from raw data. To compare these methods, we constructed a dataset comprising tactile data from 23 textures gathered using the iCub platform under a loosely constrained setup, i.e., with nonlinear motion. In line with findings from neuroscience, our experiments show that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy over unseen test data.
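For readers who want a concrete picture of the third method, below is a minimal PyTorch sketch of a convolutional-recurrent texture classifier. It illustrates the general idea rather than the paper's exact model: the class name TactileCNNRNN, the layer sizes, and the taxel count n_taxels are illustrative assumptions; only the 23-class output matches the dataset described above.

# Minimal sketch of a convolutional + recurrent tactile texture classifier.
# Architecture details are assumptions, not the authors' exact model.
import torch
import torch.nn as nn

class TactileCNNRNN(nn.Module):
    def __init__(self, n_taxels=60, n_classes=23, hidden=64):
        super().__init__()
        # 1-D convolutions extract local features from the taxel signals over time
        self.conv = nn.Sequential(
            nn.Conv1d(n_taxels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # an LSTM aggregates the convolved features across the touch/slide sequence
        self.rnn = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, n_taxels) raw tactile readings
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # -> (batch, time, 32)
        _, (h_n, _) = self.rnn(h)
        return self.head(h_n[-1])  # logits over the texture classes

model = TactileCNNRNN()
logits = model(torch.randn(8, 75, 60))  # 8 sequences, 75 time steps, 60 taxels

Because the LSTM consumes the sequence step by step, an initial prediction from the touch phase can be refined simply by re-running the forward pass as sliding data arrives, mirroring the touch-then-slide refinement reported above.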
Resources
You can find our paper here. Check out our repository on GitHub here.
Citation
Please consider citing our paper if you build upon our results and ideas.
Tasbolat Taunyazov★, Hui Fang Koh, Yan Wu, Caixia Cai, Harold Soh★, “Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach”, International Conference on Robotics and Automation (ICRA), 2019
@inproceedings{taunyazov2019towards,
  title        = {Towards effective tactile identification of textures using a hybrid touch approach},
  author       = {Taunyazov, Tasbolat and Koh, Hui Fang and Wu, Yan and Cai, Caixia and Soh, Harold},
  booktitle    = {International Conference on Robotics and Automation (ICRA)},
  pages        = {4269--4275},
  year         = {2019},
  organization = {IEEE}
}
Contact
If you have questions or comments, please contact Tasbolat Taunyazov.
Acknowledgements
This research is partially supported by the Agency for Science, Technology and Research (A*STAR) under its AME Programmatic Funding Scheme (Project #A18A2b0046) and the SINGA (A*STAR) Int'l Award. We acknowledge partial funding from the NRF White Space: National Robotics Programme of Singapore (Grant #1722500063).