Neuro-Visual Adaptive Control for Precision in Robot-Assisted Surgery

Urrea, C.; Garcia-Garcia, Y.; Kern, J.; Rodriguez-Guillen, R.

Keywords: visual tracking, autonomous robotic surgical assistant, model reference-based neuro-visual adaptive control, surgical instrument detection, CfC-mmRNN, data-driven inverse modeling, YOLO11n

Abstract

This study introduces a Neuro-Visual Adaptive Control (NVAC) architecture designed to enhance precision and safety in robot-assisted surgery. The proposed system enables semi-autonomous guidance of the laparoscope based on image input. To achieve this, the architecture integrates the following: (1) a computer vision system based on the YOLO11n model, which detects surgical instruments in real time; (2) a Model Reference Adaptive Control with Proportional-Derivative terms (MRAC-PD), which adjusts the robot's behavior in response to environmental changes; and (3) Closed-Form Continuous-Time Neural Networks (CfC-mmRNNs), which efficiently model the system's dynamics. These networks address common deep learning challenges, such as the vanishing gradient problem, and facilitate the generation of smooth control signals that minimize wear on the robot's actuators. Performance evaluations were conducted in CoppeliaSim, utilizing real cholecystectomy images featuring surgical tools. Experimental results demonstrate that the NVAC achieves maximum tracking errors of 1.80 × 10⁻³ m, 1.08 × 10⁻⁴ m, and 1.90 × 10⁻³ m along the x, y, and z axes, respectively, under highly significant dynamic disturbances. This hybrid approach provides a scalable framework for advancing autonomy in robotic surgery.
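To illustrate the control principle named in the abstract, the sketch below simulates a model reference adaptive controller augmented with proportional-derivative error terms (MRAC-PD) on a generic first-order plant. This is a minimal, hypothetical example: the plant, reference model, gains, and MIT-rule adaptation law are assumptions for illustration only, not the paper's robot dynamics or its CfC-mmRNN-based implementation.

```python
# Illustrative MRAC-PD sketch (assumed first-order plant and MIT-rule
# adaptation; NOT the paper's surgical-robot model).
def simulate(T=10.0, dt=1e-3, gamma=2.0, kp=4.0, kd=0.5):
    a, b = 1.0, 0.5          # "unknown" plant: x' = -a*x + b*u
    am, bm = 2.0, 2.0        # reference model: xm' = -am*xm + bm*r
    x = xm = theta = 0.0     # plant state, model state, adaptive gain
    e_prev = 0.0
    for _ in range(int(T / dt)):
        r = 1.0                              # step reference
        e = x - xm                           # error vs. reference model
        de = (e - e_prev) / dt               # derivative of error
        u = theta * r - kp * e - kd * de     # adaptive feedforward + PD terms
        theta += -gamma * e * r * dt         # MIT-rule gradient adaptation
        x += (-a * x + b * u) * dt           # Euler step: plant
        xm += (-am * xm + bm * r) * dt       # Euler step: reference model
        e_prev = e
    return abs(x - xm)                       # final tracking error

# After 10 s of adaptation, the plant output closely follows the
# reference model despite the initially unknown plant gain.
```

The PD terms damp the transient error while the adaptive gain theta converges toward the value that matches the plant's steady-state response to the reference model, which is the basic mechanism an MRAC-PD scheme exploits.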

More information

Title according to WOS: Neuro-Visual Adaptive Control for Precision in Robot-Assisted Surgery
Journal: TECHNOLOGIES
Volume: 13
Issue: 4
Publisher: MDPI
Publication date: 2025
Language: English
DOI: 10.3390/technologies13040135
Notes: ISI