Hand gesture based control with multi-modality data - towards surgical applications
Conference: BIBE 2019 - The Third International Conference on Biological Information and Biomedical Engineering
06/20/2019 - 06/22/2019 at Hangzhou, China
Proceedings: BIBE 2019
Pages: 4
Language: English
Type: PDF
Authors:
Sun, Yu; Miao, Lijie; Yuan, Zhenming (School of Information Science and Engineering, Hangzhou Normal University, Hangzhou, Zhejiang, China)
Sun, Xiaoyan (School of Information Science and Engineering, Hangzhou Normal University, Hangzhou, Zhejiang, China & Engineering Research Center of Mobile Health Management System, Ministry of Education, Hangzhou, Zhejiang, China)
Abstract:
Image-guided surgery provides the surgeon with additional information to perform the operation more accurately. However, visualizing the preoperative plan in an interactive way remains a challenge. Hand gesture based control provides a touchless user interface and is therefore a promising option for the surgical environment. Compared with static hand gestures, dynamic gestures offer a more natural way for humans to interact. However, given the complexity of dynamic hand gestures, extracting sufficient features is challenging. In this paper, a multi-modality system was proposed for dynamic hand gesture recognition. Two types of features were defined, finger pose and hand motion, which were extracted by combining information from a Leap Motion controller and depth images. A Long Short-Term Memory (LSTM) network was trained on the extracted feature sequences for hand gesture recognition. Experiments were performed to evaluate the feasibility of the system on a 16-class dynamic gesture dataset, and an average accuracy of 94.10% was achieved. The results demonstrated that depth images can effectively compensate for missing features caused by lost tracking in the Leap Motion, showing the system's potential for surgical applications.
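For illustration, the sequence-classification stage described in the abstract can be sketched as below. This is a minimal PyTorch sketch, not the authors' implementation: the per-frame feature dimension (32), hidden size (128), and clip length (60 frames) are assumed for the example; only the 16 gesture classes come from the paper.

import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """LSTM classifier over per-frame hand-feature vectors.

    Each frame's finger-pose and hand-motion features are assumed to be
    concatenated into one fixed-length vector (dimension is hypothetical).
    """
    def __init__(self, feature_dim: int = 32, hidden_dim: int = 128,
                 num_classes: int = 16):
        super().__init__()
        # Process the feature sequence of a gesture clip with one LSTM layer.
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        # Map the final hidden state to logits over the gesture classes.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, feature_dim), one feature sequence per clip
        _, (h_n, _) = self.lstm(x)
        return self.classifier(h_n[-1])  # (batch, num_classes) logits

# Usage: classify a batch of 8 gesture clips, 60 frames each (random data
# stands in for the extracted Leap Motion / depth-image features).
model = GestureLSTM()
logits = model(torch.randn(8, 60, 32))
pred = logits.argmax(dim=1)  # predicted gesture class per clip

Using only the last hidden state keeps the sketch simple; pooling over all time steps or a bidirectional LSTM are common variants for this kind of gesture classification.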