A research team from Stanford University will present a paper on synthesizing dexterous hand movements through physics-based simulation at the upcoming SIGGRAPH Asia 2024 conference. The work targets tasks that demand tight bimanual coordination and precise fine-motor control.
In the study, the synthesized virtual guitarist accurately performed musical pieces it had not been trained on. Rather than training both hands as a single unit, the authors treated each hand as an independent agent, training the two separately before coordinating their movements. This factorization avoids learning a policy directly in the full high-dimensional joint state-action space, which significantly improves training efficiency.
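The per-hand factorization described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual architecture: the class names, dimensions, and placeholder linear policies are all assumptions introduced here to show how two independently trained policies over smaller per-hand spaces combine into one bimanual action.

```python
import numpy as np

# Illustrative dimensions (assumed, not from the paper).
OBS_DIM_PER_HAND = 32   # per-hand observation size
ACT_DIM_PER_HAND = 24   # e.g. joint targets for one hand's articulation


class HandPolicy:
    """Independent policy for a single hand, trained separately."""

    def __init__(self, obs_dim, act_dim, seed):
        rng = np.random.default_rng(seed)
        # Placeholder linear policy; a real system would use a trained network.
        self.W = rng.normal(scale=0.01, size=(act_dim, obs_dim))

    def act(self, obs):
        return np.tanh(self.W @ obs)


class BimanualController:
    """Coordinates two per-hand policies into one combined action."""

    def __init__(self):
        self.left = HandPolicy(OBS_DIM_PER_HAND, ACT_DIM_PER_HAND, seed=0)
        self.right = HandPolicy(OBS_DIM_PER_HAND, ACT_DIM_PER_HAND, seed=1)

    def act(self, obs_left, obs_right):
        # Each policy sees only its own hand's observation, so learning
        # happens in two smaller state-action spaces rather than one
        # high-dimensional joint space.
        return np.concatenate([self.left.act(obs_left),
                               self.right.act(obs_right)])


controller = BimanualController()
action = controller.act(np.zeros(OBS_DIM_PER_HAND), np.zeros(OBS_DIM_PER_HAND))
print(action.shape)  # combined bimanual action: (48,)
```

The coordination step in the actual paper is more involved than simple concatenation; the point of the sketch is only the dimensionality argument, i.e. why two smaller per-hand problems are cheaper to train than one joint problem.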
The approach places stringent demands on its training data. To ensure natural, physically plausible motion, the research team used NOKOV motion capture technology to record performances by professional guitarists, providing high-precision training data for the virtual guitarist.