A. Bigdelou, T. Benz, L. Schwarz, N. Navab
Customizable Gesturing Interface for the Operating Room using Kinect. ChaLearn Gesture Challenge, CVPR Workshop on Gesture Recognition, Providence, Rhode Island, June 2012.
Sterility requirements and the interventional workflow in the Operating Room (OR) often make it challenging for surgeons to interact with computerized systems. As a solution, we propose a customizable gesture-based interface for the OR. Our solution consists of a gesture recognition technique and an integration platform. The recognition technique simultaneously detects the type of gesture (categorical information) and the state a user holds within the current gesture (spatio-temporal information). By introducing several feature extraction methods, the technique performs independently of human body proportions and Kinect placement. To exploit this gesturing technique in the OR, we additionally introduce an extensible software platform that allows context-aware gesturing interfaces to be defined for several intra-operative devices. The behavior of the gesturing interface can be customized using a visual editor.
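The paper does not include code; the following is a minimal sketch of one way such body-proportion- and sensor-placement-invariant features could be computed from Kinect skeleton data, assuming NumPy and hypothetical joint names (`spine`, `shoulder_center`, `hip_center`, etc.). It illustrates the general idea of expressing joints in a body-centric frame scaled by body size, not the authors' actual feature extraction.

```python
import numpy as np

def normalized_joint_features(joints):
    """Map raw Kinect joint positions (camera space) to features that are
    approximately invariant to body proportions and Kinect placement.

    joints: dict mapping joint name -> np.array([x, y, z]).
    Joint names here are illustrative, not from the paper.
    """
    torso = joints["spine"]                       # origin of the body-centric frame

    # Body-centric axes: up along the spine, right across the shoulders.
    up = joints["shoulder_center"] - joints["hip_center"]
    up /= np.linalg.norm(up)
    right = joints["shoulder_right"] - joints["shoulder_left"]
    right -= np.dot(right, up) * up               # orthogonalize against the up axis
    right /= np.linalg.norm(right)
    forward = np.cross(up, right)
    R = np.stack([right, up, forward])            # rotation into the body frame

    # Scale by a body-size proxy so tall and short users yield similar values.
    scale = np.linalg.norm(joints["shoulder_center"] - joints["hip_center"])

    feats = []
    for name in ("hand_left", "hand_right", "elbow_left", "elbow_right"):
        feats.append(R @ (joints[name] - torso) / scale)
    return np.concatenate(feats)
```

Feature vectors of this kind could then be fed to a classifier for the gesture type and to a regressor for the within-gesture state, as the abstract describes.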
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.