Surgical Robot Automation

Surgical robots, such as Intuitive Surgical’s da Vinci Surgical System, have made surgeries more efficient by improving dexterity and reducing surgeon fatigue through teleoperated control. Beyond the care these systems already deliver to patients, they have opened the door to a wide range of research, including surgical task automation. Automating surgical tasks has become an increasingly active research area, with the goals of improving patient throughput, reducing quality-of-care variance across surgeries, and potentially delivering automated surgery in the future. We are developing the algorithms and control policies that automate surgical tasks and work toward this future.

Learning to Automate Surgery

Reinforcement Learning (RL) is a machine learning framework that enables AI systems to solve complex problems. In recent years, RL has seen growing success on challenging games and robotic manipulation tasks, driven in part by collaborative efforts on open-source environment simulators such as OpenAI's Gym. We present dVRL, the first open-source reinforcement learning environment for surgical robotics, which is functionally equivalent to Gym. dVRL enables prototyping and implementation of state-of-the-art RL algorithms on surgical robotics problems, with the goal of bringing autonomous robotic precision and accuracy to surgery. By combining dVRL with the da Vinci Research Kit (dVRK) community network, we enable the surgical robotics community to leverage the newest RL strategies, and RL researchers to develop algorithms for the challenges of autonomous surgery.
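Because dVRL exposes the standard Gym interface, existing RL code can interact with it just as it would with any other Gym environment. The sketch below shows a minimal random-action rollout loop; the package import and the `dVRLReach-v0` environment ID are assumptions for illustration, not a verified API.

```python
# Minimal sketch of a Gym-style interaction loop with dVRL.
# The import and environment ID below are assumptions for illustration.
import gym
import dVRL_simulator  # hypothetical package that registers dVRL environments with Gym

env = gym.make("dVRLReach-v0")  # assumed ID for a PSM reaching task

obs = env.reset()
for _ in range(100):
    action = env.action_space.sample()            # random placeholder policy
    obs, reward, done, info = env.step(action)    # standard Gym step signature
    if done:
        obs = env.reset()
env.close()
```

Any Gym-compatible RL library can then be dropped in place of the random placeholder policy.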

Through dVRL, we successfully trained a robot to perform bimanual suture needle regrasping, a crucial subtask of suturing, with real-time replanning capability. Our RL algorithm leverages an egocentric state space and demonstrations from a sub-optimal policy to achieve efficient learning and generalization. The learned regrasping policies pave the way for deploying other automated suture-throw techniques by removing their dependence on task-specific needle-grasping mechanisms.
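To make the egocentric state idea concrete, the sketch below expresses the needle's position in the gripper frame rather than the world frame, so the policy observes the same state wherever the task takes place in the workspace. This is a simplified illustration of the general idea, not the paper's exact state definition.

```python
# Simplified sketch of an egocentric state: the needle position is mapped
# from the world frame into the gripper frame. Names and shapes are
# illustrative assumptions, not the paper's exact formulation.
import numpy as np

def needle_in_gripper_frame(T_world_gripper: np.ndarray,
                            p_world_needle: np.ndarray) -> np.ndarray:
    """Map a needle position from the world frame into the gripper frame.

    T_world_gripper: 4x4 homogeneous transform of the gripper in the world frame.
    p_world_needle:  3-vector needle position in the world frame.
    """
    T_gripper_world = np.linalg.inv(T_world_gripper)
    p_homogeneous = np.append(p_world_needle, 1.0)
    return (T_gripper_world @ p_homogeneous)[:3]

# Example: gripper translated to (0.10, 0.00, 0.05) m in the world frame.
T = np.eye(4)
T[:3, 3] = [0.10, 0.00, 0.05]
print(needle_in_gripper_frame(T, np.array([0.12, 0.01, 0.06])))
# -> approximately [0.02, 0.01, 0.01]: the needle position relative to the gripper
```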

Robust Surgical Automation

Robustness is critical in surgical automation because surgical procedures demand unwavering reliability and resilience. A robust system should adapt seamlessly to dynamic surgical scenarios, handle variability in patient anatomy, and recover quickly from disruptions, minimizing the potential for adverse impact on the patient's well-being. Achieving this robustness is essential for instilling confidence in both surgeons and patients, as it ensures consistent, reliable performance even in challenging or unpredictable circumstances.

To achieve robust surgical automation, we introduce SURESTEP (Surgical Uncertainty-aware Robust ESTimation TrajEctory Protocol), an uncertainty-aware trajectory optimization framework that enables robust automation by improving visual tool tracking accuracy. SURESTEP optimizes trajectories from any policy to be robust to the motion and observation uncertainties commonly encountered in surgical settings. We applied SURESTEP to the suture needle regrasping task and demonstrated a significant improvement in success rate on the da Vinci Research Kit (dVRK), even under challenging conditions such as a moving endoscopic camera.
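The sketch below illustrates the general shape of such an optimization with a heavily simplified stand-in: a nominal trajectory is adjusted to trade off fidelity to the original plan against an uncertainty penalty accumulated along the path. The quadratic uncertainty field, the weighting, and the solver are illustrative assumptions, not SURESTEP's actual objective or method.

```python
# Simplified sketch of uncertainty-aware trajectory optimization.
# The uncertainty field, cost weighting, and solver are illustrative
# assumptions, not SURESTEP's actual formulation.
import numpy as np
from scipy.optimize import minimize

def uncertainty(p: np.ndarray) -> float:
    """Placeholder uncertainty field: grows with distance from the origin."""
    return float(np.sum(p ** 2))

def objective(flat_traj: np.ndarray, nominal: np.ndarray, lam: float) -> float:
    traj = flat_traj.reshape(nominal.shape)
    tracking_cost = float(np.sum((traj - nominal) ** 2))   # stay near the nominal plan
    uncertainty_cost = sum(uncertainty(p) for p in traj)   # penalize uncertain states
    return tracking_cost + lam * uncertainty_cost

nominal = np.linspace([0.2, 0.1], [0.0, 0.3], num=10)      # 10 waypoints in 2D
result = minimize(objective, nominal.ravel(), args=(nominal, 0.5))
optimized = result.x.reshape(nominal.shape)                # uncertainty-aware trajectory
```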



  • Jingpei Lu
  • Nikhil Shinde
  • Zih-Yun Chiu
  • Neelay Joglekar
  • Yun-Jie Ho
  • *Florian Richter



  • *Ryan Orosco
  • *Emily Funk



SURESTEP: An Uncertainty-Aware Trajectory Optimization Framework to Enhance Visual Tool Tracking for Robust Surgical Automation

N.U. Shinde*, Z.Y. Chiu*, F. Richter, J. Lim, Y. Zhi, S. Herbert, M.C. Yip

arXiv preprint arXiv:2404.00123, 2024. [arxiv]

Bimanual Regrasping for Suture Needles using Reinforcement Learning for Rapid Motion Planning

Z.Y. Chiu, F. Richter, E.K. Funk, R.K. Orosco, M.C. Yip

IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China (2021). [arxiv][video]

Real-to-Sim Registration of Deformable Soft-Tissue with Position-Based Dynamics for Surgical Robot Autonomy

F. Liu, Z. Li, Y. Han, J. Lu, F. Richter, M.C. Yip

IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China (2021). [arxiv][video]

Model-Predictive Control of Blood Suction for Surgical Hemostasis using Differentiable Fluid Simulations

J. Huang*, F. Liu*, F. Richter, M.C. Yip

IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China (2021). [arxiv][video]

SuPer Deep: A Surgical Perception Framework for Robotic Tissue Manipulation using Deep Learning for Feature Extraction

J. Lu, A. Jayakumari, F. Richter, Y. Li, M.C. Yip

IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China (2021). [website][arxiv][video]

Optimal Multi-Manipulator Arm Placement for Maximal Dexterity during Robotics Surgery

J. Di, M. Xu, N. Das, M.C. Yip

IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China (2021). [arxiv][video]

Autonomous Robotic Suction to Clear the Surgical Field for Hemostasis using Image-based Blood Flow Detection

F. Richter, S. Shen, F. Liu, J. Huang, E.K. Funk, R.K. Orosco, M.C. Yip

IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 1383-1390, 2021. [arxiv][video]

SuPer: A Surgical Perception Framework for Endoscopic Tissue Manipulation with Surgical Robotics

Y. Li, F. Richter, J. Lu, E.K. Funk, R.K. Orosco, J. Zhu, M.C. Yip

arXiv preprint arXiv:1909.05405, 2019. [pdf][website]

Open-Sourced Reinforcement Learning Environments for Surgical Robotics 

F. Richter, R. K. Orosco, M.C. Yip

arXiv preprint arXiv:1903.02090, 2019. [arxiv][vid][git]