
RA-L Paper: Enhancing Generalizable 6D Pose Tracking of an In-Hand Object With Tactile Sensing

Client: Tsinghua University

Application: Force and tactile sensing, sensor fusion, visual tracking


Abstract

Accurate 6D pose tracking of objects is crucial in robotic manipulation, particularly in complex assembly scenarios such as pin insertion. Traditional visual tracking methods are often limited by occlusion and visual noise, especially during robotic hand operation. This paper presents TEG-Track, a novel method proposed by the team led by Professor Li Yi at the Institute for Interdisciplinary Information Sciences, Tsinghua University, and published in RA-L. TEG-Track integrates tactile sensing to enhance generalizable 6D pose tracking, and the team has constructed the first fully annotated visual-tactile dataset for in-hand object pose tracking in real-world scenarios. The data acquisition system includes the NOKOV motion capture system.

Code and Dataset: https://github.com/leolyliu/TEG-Track 

Article: https://ieeexplore.ieee.org/document/10333330/ 

Citation

 Y. Liu et al., "Enhancing Generalizable 6D Pose Tracking of an In-Hand Object With Tactile Sensing," in IEEE Robotics and Automation Letters, vol. 9, no. 2, pp. 1106-1113, Feb. 2024, doi: 10.1109/LRA.2023.3337690.

Research Background

Reliable robotic manipulation depends on accurately perceiving the motion state of the object held in hand. Existing 6D pose tracking methods typically rely on RGB-D visual data and perform poorly under occlusion and environmental collisions. In contrast, tactile sensors can directly sense the geometry and motion of the contact area, providing additional auxiliary signals for tracking.

System Framework

The core of TEG-Track lies in using tactile kinematic cues to enhance visual pose trackers through a geometric motion optimization strategy.
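As a rough illustration of this idea (not the paper's actual optimization), the sketch below fuses a visual translation estimate with a kinematic prediction derived from a tactile velocity cue; the function name and weights are hypothetical:

```python
import numpy as np

def refine_translation(t_vis, t_prev, v_tac, dt, w_vis=1.0, w_tac=1.0):
    """Fuse a visual translation estimate with a tactile kinematic cue.

    Minimizes  w_vis*||t - t_vis||^2 + w_tac*||t - (t_prev + v_tac*dt)||^2,
    whose closed-form minimizer is the weighted average of the two predictions.
    """
    # Kinematic prediction: integrate the tactile-sensed object velocity
    t_kin = t_prev + v_tac * dt
    # Weighted average of visual and kinematic predictions
    return (w_vis * t_vis + w_tac * t_kin) / (w_vis + w_tac)
```

With equal weights, the refined estimate lands halfway between the visual measurement and the tactile kinematic prediction; in practice the weights would reflect the relative reliability of each cue.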

[Figure: TEG-Track system framework]

Dataset

The synthetic dataset includes object instances of various geometric shapes selected from the ShapeNet dataset, while the real-world dataset comprises 200 videos covering 17 different objects across 5 categories. The data acquisition system consists of a robotic arm, tactile sensors, RGB-D sensors, the NOKOV motion capture system, and objects.
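For illustration only, a sample in such a visual-tactile dataset might be organized as sketched below; the field names and shapes are assumptions, not the published format. The helper checks that a motion-capture ground-truth pose is a valid rigid transform:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VisuoTactileFrame:
    rgb: np.ndarray      # H x W x 3 color image
    depth: np.ndarray    # H x W depth map, in meters
    tactile: np.ndarray  # per-sensor tactile readings (layout is sensor-specific)
    pose_gt: np.ndarray  # 4 x 4 ground-truth object pose from motion capture

def is_rigid_transform(T, tol=1e-6):
    """Check that a 4x4 matrix is a proper rigid transform (SE(3))."""
    R, bottom = T[:3, :3], T[3]
    return (np.allclose(R @ R.T, np.eye(3), atol=tol)   # orthonormal rotation
            and np.isclose(np.linalg.det(R), 1.0, atol=tol)  # no reflection
            and np.allclose(bottom, [0.0, 0.0, 0.0, 1.0], atol=tol))
```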

[Figure: Data acquisition setup and dataset examples]

Experimental Results

[Figure: Experimental results]

The experiments compared the performance improvement of TEG-Track on three types of visual trackers: keypoint-based (BundleTrack), regression-based (CAPTRA), and template-based (ShapeAlign). Results indicate that TEG-Track reduced the average rotation error by 21.4% and the translation error by 30.9% in real data.
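The standard pose-error metrics behind such comparisons can be computed as follows (a generic sketch, not the paper's evaluation code): rotation error as the geodesic angle between rotation matrices, and translation error as the Euclidean distance:

```python
import numpy as np

def rotation_error_deg(R_pred, R_gt):
    """Geodesic angle (degrees) between two 3x3 rotation matrices."""
    # trace(R_pred^T R_gt) = 1 + 2*cos(theta) for the relative rotation angle theta
    cos_theta = (np.trace(R_pred.T @ R_gt) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def translation_error(t_pred, t_gt):
    """Euclidean distance between predicted and ground-truth translations."""
    return np.linalg.norm(np.asarray(t_pred) - np.asarray(t_gt))
```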

Qualitative results on long trajectories in real data: the red and green boxes denote the predicted and ground-truth poses of the in-hand object, respectively.

By simulating tactile noise patterns, TEG-Track's performance under different quality tactile signals was tested, showing greater stability and robustness compared to baseline methods relying solely on tactile or visual inputs.
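A minimal way to simulate such tactile degradation (purely illustrative; the paper's noise patterns may differ) is to perturb the tactile velocity cue with zero-mean Gaussian noise of a chosen magnitude:

```python
import numpy as np

def add_tactile_noise(v_tac, sigma, rng=None):
    """Corrupt a tactile velocity cue with zero-mean Gaussian noise.

    sigma controls the simulated signal quality: sigma=0 returns the
    clean signal, larger sigma models a noisier tactile sensor.
    """
    rng = np.random.default_rng(rng)
    return v_tac + rng.normal(0.0, sigma, size=np.shape(v_tac))
```

Sweeping sigma and re-running the tracker then yields a robustness curve over tactile signal quality.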

TEG-Track achieved a processing speed of 20 frames per second in multi-frame optimization scenarios, with low additional computational cost, making it suitable for real-time applications.

The NOKOV motion capture system provides the ground-truth object poses against which the predicted poses are evaluated.


