The goal of the GreifbAR project is to make extended reality (XR) worlds, including virtual reality (VR) and mixed reality (MR), tangible and graspable by allowing users to interact with real and virtual objects with their bare hands. Hand accuracy and dexterity are paramount for performing precise tasks in many fields, but the capture of hand-object interaction in current XR systems is woefully inadequate. Current systems rely on hand-held controllers or on capture devices limited to free-hand gestures, without contact with real objects. GreifbAR addresses this limitation with a sensing system that detects both the full hand grip, including the hand surface, and the object pose when users interact with real objects or tools. This sensing system will be integrated into a mixed reality training simulator.
Competent handling of instruments and suture material is the basis of every surgical activity. The most important instruments in surgery are the hands of the surgical staff. Their work is characterised by the targeted use of a large number of instruments that must be operated and controlled in different ways. Until now, surgical knotting techniques have been learned through personal instruction by experienced surgeons, blackboard illustrations and video-based tutorials. A training and teaching concept based on the capture of finger movements does not yet exist in surgical education and training. Learning surgical knotting techniques through participant observation and direct instruction by experienced surgeons is cost-intensive and hardly scalable. This type of training is increasingly reaching its limits in daily clinical practice, which can be attributed in particular to the changed economic, social and regulatory conditions of surgical work. Students, trainees and specialist staff in further training therefore face the problem of applying and practising their theoretical knowledge in a practice-oriented manner. Text- and image-based media allow scalable theoretical knowledge acquisition independent of time and place; however, gestures and work steps can only be passively observed and subsequently imitated, and the learning success cannot be quantitatively measured and verified.
The aim of the Charité's sub-project is therefore threefold: to develop a surgical application scenario for mixed/augmented reality (MR/AR) that provides spatial guidance and verifiable recording of the complex fine motor finger movements involved in tying surgical knots; to implement and technically test the developed concept in a demonstrator; and to evaluate the usability of the system for deployment in a clinical context.