Abstract
Transparent objects are ubiquitous in daily life, so robots need to be able to handle them. However, light reflection and refraction make it challenging to obtain the accurate depth maps required for such handling tasks. In this letter, we propose A4T, a novel affordance-based framework for depth reconstruction and manipulation of transparent objects. A hierarchical AffordanceNet first detects transparent objects and their associated affordances, which encode the relative positions of an object's different parts. Given the predicted affordance map, a multi-step depth reconstruction method then progressively reconstructs the depth maps of the transparent objects. Finally, the reconstructed depth maps are employed for affordance-based manipulation. To evaluate the proposed method, we construct TRANS-AFF, the first real-world dataset of transparent objects annotated with affordances and depth maps. Extensive experiments show that our method predicts accurate affordance maps and substantially improves depth reconstruction over the state-of-the-art method, reducing the Root Mean Squared Error from 0.097 m to 0.042 m. Furthermore, we demonstrate the effectiveness of our proposed method with a series of robotic manipulation experiments on transparent objects.
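For reference, the RMSE figure quoted in the abstract is the standard depth-map error metric. Below is a minimal illustrative sketch of how such a metric is typically computed; the function name and the zero-means-missing masking convention are assumptions for illustration, not the paper's actual evaluation code:

```python
import numpy as np

def depth_rmse(pred, gt, valid_mask=None):
    """Root Mean Squared Error (in meters) between a predicted and a
    ground-truth depth map, restricted to pixels with valid depth."""
    pred = np.asarray(pred, dtype=np.float64)
    gt = np.asarray(gt, dtype=np.float64)
    if valid_mask is None:
        # Assumed convention: depth 0 marks pixels where the sensor
        # returned no measurement (common for transparent surfaces).
        valid_mask = gt > 0
    diff = pred[valid_mask] - gt[valid_mask]
    return float(np.sqrt(np.mean(diff ** 2)))
```

Masking matters here because commodity depth sensors often return no reading at all on transparent surfaces, so unmasked averages would mix reconstruction error with sensor dropout.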
| Original language | English |
|---|---|
| Pages (from-to) | 9826-9833 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 7 |
| Issue number | 4 |
| Early online date | 15 Jul 2022 |
| DOIs | |
| Publication status | Published - 1 Oct 2022 |
Keywords
- Computer vision for automation
- Robotics and automation in life sciences
Fingerprint
Dive into the research topics of 'A4T: Hierarchical Affordance Detection for Transparent Objects Depth Reconstruction and Manipulation'.

Projects
ViTac: Visual-Tactile Synergy for Handling Flexible Materials
Luo, S. (Primary Investigator)
Engineering and Physical Sciences Research Council (EPSRC)
17/12/2021 → 11/10/2024
Project: Research
Prizes
Best Student Paper finalist
Jiang, J. (Recipient) & Luo, S. (Recipient), 2022
Prize: Other distinction