
Capturing Motion,
Crafting Stories

Explore Our Case Studies: Transforming Motion into Masterpieces Across Industries

Applications of motion capture in the development of multi-sensor navigation technology for unmanned vehicles

Client
Harbin Institute of Technology
Capture volume
5m × 5m
Application
Unmanned vehicles, location tracking, algorithm validation
Objects
Unmanned vehicle
Equipment used

Unmanned vehicles are intelligent autonomous vehicles capable of path planning and environment perception, and they have become a major focus of intelligent-vehicle research. Using onboard sensors, an unmanned vehicle perceives its surroundings and its own state, then performs navigation and positioning calculations to plan a path to a given target.

Under varying conditions, no single navigation sensor can provide data accurate enough for high-precision positioning and navigation. Unmanned vehicles therefore carry multiple sensors; the most commonly used are inertial measurement units (IMUs), ultra-wideband (UWB) modules, and wheel odometers.

To make the unmanned vehicle system more adaptive and reliable, researchers at the Harbin Institute of Technology studied a multi-sensor navigation system, aiming to solve the problem of asynchronous data acquisition when a sensor fails. They abstracted each sensor's information into factors and used a factor graph model to establish a multi-sensor fusion framework, then applied the incremental smoothing and mapping (iSAM2) optimization algorithm, based on the Bayes tree, to process and synchronize the sensor information. This method achieves accuracy close to that of the least-squares method while retaining computational efficiency, greatly improving the robustness and reliability of the navigation system.

The study focused on integrating IMU, UWB, and odometer data in an indoor environment with the unmanned vehicle moving at relatively low speed. To verify the performance of the navigation algorithm, the researchers built a multi-sensor platform: a Scout 2.0 mobile robot equipped with an MTi-G-700 IMU, a LinkTrack S UWB module, and a wheel odometer (ODOM) as its onboard sensors. The platform ran Ubuntu and used the Robot Operating System (ROS) for synchronized data acquisition.
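Because the IMU, UWB, and odometer publish at different rates, their streams must be time-aligned before fusion. One generic way to do this (a sketch of the idea, not the study's ROS pipeline) is nearest-timestamp matching with a tolerance:

```python
import bisect

def nearest_sample(timestamps, values, t, max_dt=0.05):
    """Return the sample whose timestamp is closest to query time t,
    or None if no sample lies within max_dt seconds.

    timestamps must be sorted ascending; values is the parallel list
    of sensor readings."""
    i = bisect.bisect_left(timestamps, t)
    # Only the neighbors straddling t can be closest
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    if not candidates:
        return None
    j = min(candidates, key=lambda k: abs(timestamps[k] - t))
    if abs(timestamps[j] - t) > max_dt:
        return None
    return values[j]
```

In practice, ROS users often reach for `message_filters` approximate-time synchronization instead; the function above just makes the matching rule explicit.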


Figure 1 UWB base station layout


Figure 2 IMU and UWB mounting locations

To obtain the actual position of the platform, the researchers used the NOKOV motion capture system. Three reflective markers were attached to the platform and tracked by 16 Mars series motion capture cameras placed above the 5m × 5m test field. Because the positioning accuracy of the NOKOV system reaches the sub-millimeter level, it provided reliable ground truth for the vehicle's true position and movement trajectory.


Figure 3 NOKOV motion capture system


Figure 4 Reflective marker placement

To evaluate the navigation system, the experiment compared the fused IMU+UWB+ODOM estimates against the ground-truth trajectory recorded by the motion capture system, and then compared the same ground truth against estimates obtained from each individual sensor.
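A standard metric for such comparisons is the root-mean-square error between an estimated trajectory and the time-aligned motion capture ground truth. The sketch below assumes both trajectories are already aligned and sampled at the same instants; the arrays shown are illustrative, not the study's data.

```python
import numpy as np

def position_rmse(estimated, ground_truth):
    """Root-mean-square position error between an estimated trajectory
    and time-aligned motion capture ground truth.

    Both inputs are N x 2 arrays of (x, y) positions in metres."""
    est = np.asarray(estimated, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    err = np.linalg.norm(est - gt, axis=1)   # per-sample Euclidean error
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical two-sample trajectories: fused estimate vs. mocap truth
rmse = position_rmse([[0, 0], [1, 0]], [[0, 0], [1, 0.3]])
```

Computing this metric once per sensor configuration (IMU only, UWB only, fused, and fused with a sensor dropped) makes the accuracy and robustness comparison in the study concrete.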

The researchers concluded that single sensors face inherent limitations, while the multi-sensor algorithm delivers a drastic improvement over single-sensor performance. Analyzing the efficiency and robustness of the multi-sensor system, they showed that the method greatly improved the computational efficiency and robustness of the navigation system, which could also obtain more accurate positioning information than previously possible in the event of sensor failure.

Bibliography:

Shen Hebing. Research on multi-source sensor information fusion navigation technology of unmanned vehicle [D]. Harbin Institute of Technology, 2021. DOI: 10.27061/d.cnki.ghgdu.2021.004020.


