Designing a Novel VR Simulator for Core Laparoscopic Skills and Assessing Its Construct Validity via Machine Learning.
Abstract
We introduce SECMA, a portable VR simulator designed to enhance minimally invasive surgery (MIS) training through immersive simulations of basic laparoscopic techniques. The simulator turns an Oculus Quest headset into an educational tool with four key components: (1) a mechanical interface emulating surgical instruments, (2) virtual scenarios replicating operating rooms, (3) a real-time data capture system, and (4) machine learning tools to distinguish the proficiency levels of experienced surgeons and novices. SECMA underwent construct validation through repeated simulations of two virtual scenarios: (1) Coordination and (2) Grasp and Transport. The study cohort comprised 21 participants, stratified into eleven novices with limited experience and ten experts, each with over a hundred endoscopic procedures. Metrics including task completion time, error score, right-hand speed, and path length were collected automatically by the simulator for in-depth analysis. These data were subjected to statistical analyses (hypothesis testing, linear regression, ANOVA, PCA) and used to train machine learning classifiers (LDA, GLM, KNN, SVM, XGBoost, RF). Experts outperformed novices on all assessed parameters. This clear separation between the two cohorts demonstrates SECMA's ability to discriminate between the skill levels of experienced surgeons and novices, providing substantial evidence of construct validity. The discussion highlights the potential of devices like SECMA, which repurpose consumer VR headsets, to transform virtual education across many domains of expertise. By providing an immersive and adaptable learning experience, SECMA holds promise as a tool for reshaping MIS training.
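The expert-versus-novice classification described above can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic metric distributions and feature layout are assumptions, GLM is approximated here by logistic regression (a GLM with a logit link), and XGBoost is omitted to keep the sketch within scikit-learn.

```python
# Hypothetical sketch of a skill-classification pipeline on simulator metrics.
# Data are synthetic; real studies would use per-trial recordings from SECMA.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic per-trial metrics: [elapsed time (s), error score,
# right-hand speed (mm/s), path length (mm)]. Experts (label 1) are
# simulated as faster and more economical than novices (label 0).
novices = rng.normal([120, 8, 40, 900], [20, 3, 8, 150], size=(40, 4))
experts = rng.normal([70, 2, 60, 600], [15, 1, 8, 100], size=(40, 4))
X = np.vstack([novices, experts])
y = np.array([0] * 40 + [1] * 40)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "GLM": LogisticRegression(max_iter=1000),  # logistic GLM stand-in
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Standardize features inside each fold to avoid leakage, then score
# every classifier with stratified 5-fold cross-validation.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
results = {}
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), clf)
    results[name] = cross_val_score(pipe, X, y, cv=cv).mean()
    print(f"{name}: mean CV accuracy = {results[name]:.2f}")
```

Because the synthetic groups are well separated, all models score highly here; on real trainee data the spread between classifiers is what reveals which metrics carry the discriminative signal.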
More information
Publication date: 2024
Language: English
URL: https://doi.org/10.1007/978-3-031-53960-2_44