Localizing the object contact through matching tactile features with visual map

Shan Luo, Wenxuan Mou, Kaspar Althoefer, Hongbin Liu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

41 Citations (Scopus)

Abstract

This paper presents a novel framework for integrating vision and tactile sensing by localizing tactile readings in a visual object map. Intuitively, there are correspondences, e.g., prominent features, between visual and tactile object identification. To exploit this in robotics, we propose to localize tactile readings in visual images by sharing the same set of feature descriptors across the two sensing modalities. The localization is then treated as a probabilistic estimation problem and solved in a recursive Bayesian filtering framework, for which a feature-based measurement model and a Gaussian-based motion model are built. In our tests, a tactile array sensor is used to generate tactile images during interaction with objects, and the results demonstrate the feasibility of the proposed framework.
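The recursive Bayesian filtering scheme described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes a discretized 1-D grid of candidate contact locations standing in for the visual map, a Gaussian transition kernel as the motion model, and a feature-distance likelihood as the measurement model; all function names and parameters are hypothetical.

```python
import numpy as np

def motion_update(belief, sigma=1.0):
    """Diffuse the belief with a Gaussian motion model over a 1-D grid
    of candidate contact locations (a stand-in for the 2-D visual map)."""
    n = belief.size
    idx = np.arange(n)
    # Gaussian transition probability between every pair of grid cells
    kernel = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / sigma) ** 2)
    kernel /= kernel.sum(axis=1, keepdims=True)  # rows sum to 1
    return belief @ kernel

def measurement_update(belief, map_features, tactile_feature):
    """Weight each candidate location by how well its visual feature
    descriptor matches the current tactile feature descriptor."""
    dists = np.linalg.norm(map_features - tactile_feature, axis=1)
    likelihood = np.exp(-dists)          # closer descriptors -> higher weight
    posterior = belief * likelihood
    return posterior / posterior.sum()   # normalize to a distribution

# Toy map: 5 candidate locations, each with a 2-D feature descriptor
map_features = np.array([[0., 0.], [1., 0.], [2., 1.], [3., 3.], [4., 4.]])
belief = np.full(5, 0.2)                 # uniform prior over locations
tactile = np.array([3., 3.])             # tactile reading matching location 3

belief = motion_update(belief, sigma=1.0)
belief = measurement_update(belief, map_features, tactile)
print(int(np.argmax(belief)))            # most likely contact location
```

Each sensing step alternates a motion update (prediction) with a measurement update (correction), so the belief over contact locations sharpens as more tactile readings arrive.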

Original language: English
Title of host publication: 2015 IEEE International Conference on Robotics and Automation, ICRA 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3903-3908
Number of pages: 6
Edition: June
ISBN (Electronic): 9781479969234
DOIs
Publication status: Published - 29 Jun 2015
Event: 2015 IEEE International Conference on Robotics and Automation, ICRA 2015 - Seattle, United States
Duration: 26 May 2015 - 30 May 2015

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
Number: June
Volume: 2015-June
ISSN (Print): 1050-4729

Conference

Conference: 2015 IEEE International Conference on Robotics and Automation, ICRA 2015
Country/Territory: United States
City: Seattle
Period: 26/05/2015 - 30/05/2015

