Human-computer interaction is a key feature of virtual reality technology. Today, with continuous upgrades to intelligent hardware and ever-faster mobile networks, human-computer interaction has developed rapidly, and hand gesture recognition has become its most popular mode. Gesture interaction is already being adopted as a new generation of human-computer interaction in smart cars, wearable devices, automotive electronics, smartphones, and other fields.
To realize hand gesture interaction, the first step is to collect gesture data. There are two main approaches to data acquisition: vision-based gesture capture using camera images and inertial gesture capture based on sensor tracking. Both approaches still have obvious disadvantages, such as insufficient capture accuracy, high data noise, and heavy data pre-processing requirements. Dr. Wang Yifeng, from the School of Science at Harbin Institute of Technology, has been working on hand gesture interaction for smart bracelets.
Dr. Wang Yifeng used a NOKOV motion capture system to obtain hand gesture data. With markers attached to the surface of the smart bracelet, the infrared optical motion capture system outputs the three-dimensional coordinates of each marker in real time. When a user wears the bracelet and makes gestures, the different gestures are reflected in the changing positions of the markers, with sub-millimeter accuracy. The motion capture system also provides the velocity, acceleration, and other kinematic information of the hand movement. All data can be imported directly into different systems through the rich SDK interfaces provided by the NOKOV motion capture system, so instead of spending time pre-processing large amounts of data, researchers can focus on classification and recognition algorithms.
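To illustrate how velocity and acceleration can be derived from streamed marker positions, here is a minimal sketch using finite differences. This is not NOKOV's SDK (which supplies these quantities directly, per the text above); the function name `derive_kinematics` and the sampling setup are illustrative assumptions.

```python
import numpy as np

def derive_kinematics(positions, dt):
    """Estimate per-frame velocity and acceleration from 3-D marker
    positions sampled at a fixed interval dt (in seconds).

    positions: array of shape (n_frames, 3), marker coordinates in mm.
    Returns (velocity, acceleration) arrays of the same shape.
    """
    positions = np.asarray(positions, dtype=float)
    # np.gradient uses central differences on interior frames and
    # one-sided differences at the first and last frame.
    velocity = np.gradient(positions, dt, axis=0)      # mm/s
    acceleration = np.gradient(velocity, dt, axis=0)   # mm/s^2
    return velocity, acceleration

# Toy example: a marker moving at a constant 100 mm/s along x,
# sampled at 60 Hz (a hypothetical capture rate).
dt = 1.0 / 60.0
t = np.arange(10) * dt
pos = np.stack([100.0 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
vel, acc = derive_kinematics(pos, dt)
# vel[:, 0] is ~100 mm/s everywhere; acc is ~0 for uniform motion.
```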
After the classification and recognition model was trained on the imported data, gestures corresponding to the 26 letters were used as test samples to verify the algorithm. Through continuous real-time testing, the accuracy of recognition and classification was analyzed by counting the correctly recognized samples as well as the category and frequency of misclassified gesture samples.
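The evaluation described above, counting correct recognitions and tallying which categories the misclassified samples fall into, can be sketched as follows. The function name `evaluate` and the toy label data are illustrative assumptions, not the researchers' actual test harness.

```python
from collections import Counter

def evaluate(true_labels, predicted_labels):
    """Compute overall accuracy and tally misclassifications.

    Returns (accuracy, confusion), where confusion maps
    (true_label, predicted_label) pairs to their error counts.
    """
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    # Count only the wrong predictions, keyed by (true, predicted),
    # which records both the category and frequency of each error.
    confusion = Counter(
        (t, p) for t, p in zip(true_labels, predicted_labels) if t != p
    )
    return correct / len(true_labels), confusion

# Toy run over letter gestures: 'b' is mistaken for 'd' once.
truth = ['a', 'b', 'c', 'b']
preds = ['a', 'd', 'c', 'b']
acc, conf = evaluate(truth, preds)
# acc == 0.75; conf records one ('b', 'd') error.
```

Summing the confusion counts per true label also reveals which letter gestures are hardest for the model to distinguish.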