
Suturing Skill Learning for a Surgical Robot Based on Learning from Demonstration

Client
Chongqing University of Posts and Telecommunications
Application
Surgical Robot, Learning from Demonstration, DMPs
Objects
Needle Holder

Surgical assistant robots offer accurate control, stable operation, and high precision, helping surgeons overcome the limitations of traditional surgery in precision, working space, distance, and cooperative work.

To enable a surgical robot system to perform high-quality automatic operations like a surgeon, an important foundational task is to build a model of the surgical operation. To this end, Professor Yang Dewei's team at Chongqing University of Posts and Telecommunications took superficial tissue suturing as the modeling object and studied suturing skills and their modeling.

Operation flow chart of the surgical robot

To address the poor transferability of traditional models to new scenarios, Professor Yang proposed a "demonstration-decomposition-modeling" skill learning framework. The suturing process was decomposed into several sub-processes, and the DMPs (Dynamical Movement Primitives) method was used to model the trajectory of each sub-process.

Learning from demonstration transfers well to scenes whose trajectories are similar but not identical. To obtain data during the suturing demonstration, the researchers built a suture demonstration acquisition system.

Motion capture collects data during the suturing demonstration

The system includes the NOKOV motion capture system, needle holders, suture needles, thread, and a wound model. The NOKOV system uses 7 infrared optical cameras to measure and capture the stitching process, and three markers are attached to each of the two needle holders. The motion capture system obtains the three-dimensional coordinates of the markers, from which the continuous real-time position and orientation trajectory of each needle holder is calculated.
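One common way to recover a rigid body's pose from three non-collinear markers is to build an orthonormal frame from their positions. The sketch below illustrates the idea only; the marker layout and function names are assumptions for illustration, not NOKOV's API or the paper's exact procedure:

```python
import numpy as np

def pose_from_markers(p0, p1, p2):
    """Estimate a rigid-body pose from three non-collinear marker positions.

    Returns (R, t): the rotation matrix of the body frame expressed in world
    coordinates, and the translation (frame origin placed at marker p0).
    """
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)            # x-axis: from marker 0 toward marker 1
    n = np.cross(x, p2 - p0)
    z = n / np.linalg.norm(n)         # z-axis: normal to the marker plane
    y = np.cross(z, x)                # y-axis completes a right-handed frame
    R = np.column_stack((x, y, z))
    return R, p0
```

Tracking the three markers frame by frame and applying this construction yields the continuous position and orientation trajectory of the needle holder.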

As shown in the figure below, the motion trajectory of the needle holders in the wound coordinate system is obtained through a coordinate transformation. To eliminate the surgeon's hand tremor, the trajectory data are then processed with a low-pass filter.
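A zero-phase low-pass filter is a typical choice for this kind of tremor removal, since it smooths without shifting the trajectory in time. A minimal sketch using SciPy; the cutoff frequency and filter order are assumptions, not values from the paper:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_trajectory(xyz, fs, cutoff=5.0, order=4):
    """Zero-phase low-pass filter a trajectory to suppress hand tremor.

    xyz    : (N, 3) array of positions sampled at fs Hz
    cutoff : cutoff frequency in Hz; hand tremor is typically above ~4 Hz,
             so a few Hz keeps the voluntary motion (value is an assumption)
    """
    b, a = butter(order, cutoff / (0.5 * fs))  # normalized cutoff frequency
    return filtfilt(b, a, xyz, axis=0)         # forward-backward: no phase lag
```

Filtering forward and backward with `filtfilt` doubles the effective attenuation and cancels the phase delay that a single pass would introduce.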

Motion trajectories captured by the optical motion capture system

The suturing process is divided into three stages: penetrating the needle into the skin tissue, tying a knot, and tightening the sutures. The DMPs method proposed by the authors can represent the dynamic process of each stage. The figure below shows the trajectories reproduced after training with the DMPs method; the DMP trajectories agree well with the real trajectories.
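A discrete DMP models a trajectory as a spring-damper system pulled toward a goal, plus a learned forcing term that shapes the path; because the forcing term is scaled by the goal offset, the same learned motion can be replayed toward a new endpoint. A minimal one-dimensional sketch; the gains, basis-function count, and phase decay rate are generic textbook choices, not values from the paper:

```python
import numpy as np

class DMP1D:
    """Minimal discrete Dynamical Movement Primitive for one degree of freedom."""

    def __init__(self, n_bfs=30, alpha=25.0, alpha_x=4.0):
        self.alpha, self.beta, self.alpha_x = alpha, alpha / 4.0, alpha_x
        # Gaussian basis functions, centered evenly in time along the phase
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_bfs))
        self.h = n_bfs ** 1.5 / self.c / alpha_x     # width heuristic
        self.w = np.zeros(n_bfs)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y, dt):
        """Learn the forcing term from one demonstrated trajectory y(t)."""
        self.y0, self.g = y[0], y[-1]
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(len(y)) * dt
        self.tau = t[-1]
        x = np.exp(-self.alpha_x * t / self.tau)     # phase decays 1 -> ~0
        f_target = ydd - self.alpha * (self.beta * (self.g - y) - yd)
        f_target /= (self.g - self.y0)               # assumes g != y0
        # locally weighted regression: one weight per basis function
        for i in range(len(self.w)):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(psi * x * f_target) / (np.sum(psi * x * x) + 1e-10)

    def rollout(self, g=None, dt=0.001):
        """Reproduce the motion, optionally toward a new goal g."""
        g = self.g if g is None else g
        y, yd, x = self.y0, 0.0, 1.0
        out = []
        for _ in range(int(self.tau / dt)):
            psi = self._psi(x)
            f = (self.w @ psi) / (psi.sum() + 1e-10) * x * (g - self.y0)
            ydd = self.alpha * (self.beta * (g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            x += -self.alpha_x * x / self.tau * dt
            out.append(y)
        return np.array(out)
```

Calling `rollout(g=new_goal)` replays the demonstrated motion shape toward a different endpoint, which is the generalization property the case study relies on; a multi-DoF suture trajectory would simply use one such primitive per coordinate.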

Trajectories learned after training with DMPs

One advantage of the DMPs model is its good generalization. As shown in the figure below, the dynamics of the suturing motion are preserved when the endpoint is changed, so the learned suturing model can readily plan new trajectories for wounds of various positions and types.


Solid blue lines: suture trajectories collected by the NOKOV motion capture system

Red dotted lines: corresponding trajectories generated by the learned DMPs after changing the target position


References:

[1] D. Yang, Q. Lv, G. Liao, K. Zheng, J. Luo and B. Wei, "Learning from Demonstration: Dynamical Movement Primitives Based Reusable Suturing Skill Modelling Method," 2018 Chinese Automation Congress (CAC), 2018, pp. 4252-4257, doi: 10.1109/CAC.2018.8623781.
