Alan Kuntz (right) manipulates the teleoperation controls of a decommissioned surgical robot while lab member Jordan Thompson adjusts a camera. By correlating a surgeon’s maneuvers with what the robot sees, Kuntz and his colleagues will train such robots to autonomously perform certain procedures. Kuntz will collaborate with fellow Kahlert faculty Tucker Hermans and Daniel Brown, as well as researchers from Vanderbilt and Johns Hopkins University.

A surgical robot capable of performing an entire surgery without human intervention: That’s the goal of a landmark, multi-institution project that has received an up-to-$12 million award from the Advanced Research Projects Agency for Health (ARPA-H).

Researchers at the U’s Price College of Engineering, led by Kahlert School of Computing Assistant Professor Alan Kuntz, will take part in this innovative initiative, focusing on the artificial intelligence and machine learning techniques that would enable a robot to master these intricate, high-stakes tasks.   

The U’s contingent also includes Kahlert Associate Professor Tucker Hermans and Assistant Professor Daniel Brown.

The project is led by Vanderbilt University Professor Robert J. Webster III and also includes robotics experts from Johns Hopkins University (JHU), the University of Tennessee, Knoxville, as well as surgeons from Vanderbilt University Medical Center (VUMC) and JHU. The team also includes hardware and software experts from Virtuoso Surgical, a surgical robotics company co-founded by Webster and Duke Herrell, professor of urology and Director of VUMC’s Minimally Invasive Urologic Surgery/Robotics Program.

“Fully autonomous surgical robots will transform medicine,” Webster said. “Not only will they make routine procedures safer and more affordable, but they will also address the worldwide shortage of surgeons and expand global access to lifesaving surgeries.”

Current surgical robotic technologies rely on one of two automation techniques. The first, known as model-based automation, pre-programs procedure sequences and start-stop conditions. The second draws on machine-learning algorithms that require enormous datasets of procedures and task sequences. While each approach has led to advances in surgical robotics, both lack scalability, generalizability, and adaptability.
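The first approach can be pictured with a short sketch. All names below are hypothetical, and this is an illustration of the general idea rather than any system the team uses: in model-based automation, each step of a procedure is pre-programmed with explicit start and stop conditions, so the robot can only handle situations its designers anticipated.

```python
# Hypothetical sketch of model-based automation: a hard-coded sequence
# of procedure steps, each gated by pre-programmed start/stop conditions.
# If a precondition fails, the system cannot adapt -- it simply stops.

def run_preprogrammed_procedure(robot, sensors):
    # Each step: (action name, start condition, stop condition),
    # all fixed at design time.
    steps = [
        ("approach_target", lambda s: s["target_visible"],
                            lambda s: s["distance_mm"] < 2.0),
        ("grasp_tissue",    lambda s: s["distance_mm"] < 2.0,
                            lambda s: s["grasp_secure"]),
        ("cut",             lambda s: s["grasp_secure"],
                            lambda s: s["tissue_removed"]),
    ]
    for action, start, stop in steps:
        state = sensors.read()
        if not start(state):
            # No fallback: unanticipated situations halt the procedure.
            raise RuntimeError(f"Precondition for {action!r} not met")
        while not stop(state):
            robot.execute(action)
            state = sensors.read()
```

Because every condition and sequence is fixed in advance, extending such a system to a new procedure means writing new rules by hand, which is the scalability limit the article describes.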

“We will create brand new machine-learning algorithms beyond anything that exists today,” Webster said. “The key to making them practical is to simultaneously look at how human surgeons perform their work. What basic set of maneuvers do they use? How do they sequence those maneuvers? The answers to these questions enable effective learning on a tractable amount of data.”

The research team includes members of Vanderbilt spinoff Virtuoso Surgical. These snake-like robots are currently teleoperated by human surgeons during minimally invasive procedures, but the team’s research could enable them to work on their own.

The team’s robots — developed at the Vanderbilt Institute for Surgery and Engineering (VISE) — are the size and shape of a needle and are deployed through a sequence of flexible concentric tubes. Kuntz, Webster, and collaborators have previously demonstrated the ability of similar robots to safely thread their way through maze-like bronchial branches and lung tissue to a tumor target in a live animal; the next step is creating a system that can perform more general surgical tasks that require manipulating and cutting tissue.

When human surgeons teleoperate the researchers’ robotic systems during procedures, the robot can “shadow” their decision-making, learning more generalized skills.

“Our algorithms are watching the commands the surgeon sends to the robot and what they were looking at through the robot’s camera at the time,” said Kuntz. “By looking at the history of those commands, the algorithms can infer how what the surgeons did changed based on what they saw. The robot can then apply these learned strategies to new surgical decisions that it hasn’t encountered before.” 
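One way to read this is as learning from demonstration: the system logs what the surgeon saw alongside the command the surgeon issued, then reuses those pairings when it encounters similar views. The sketch below is a deliberately minimal, hypothetical illustration (nearest-neighbor imitation over toy observation vectors), not the team’s actual algorithms.

```python
# Hypothetical sketch of learning from surgeon demonstration:
# record (what the surgeon saw, what the surgeon commanded) pairs
# during teleoperation, then imitate by reusing the command whose
# logged observation most resembles the current one.

import math

class DemonstrationPolicy:
    def __init__(self):
        self.demos = []  # list of (observation_vector, command) pairs

    def record(self, observation, command):
        """Store one teleoperation sample: the view and the surgeon's command."""
        self.demos.append((observation, command))

    def act(self, observation):
        """Imitate: return the command paired with the most similar logged view."""
        _, command = min(
            self.demos,
            key=lambda demo: math.dist(demo[0], observation),
        )
        return command
```

In practice, real systems would learn over full command histories and camera images rather than toy vectors, but the core idea — inferring the mapping from what was seen to what was done — is the same.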

The Kahlert team of professors Kuntz, Hermans, and Brown is one of the world’s leading research groups in this approach: autonomy learned from surgeon demonstration.

“We will further enable the robot to understand its own uncertainty, knowing what it doesn’t know,” said Kuntz, “so that it can ask a human surgeon for input, clarification, or to take over when it is unsure of how to proceed. That’s a key building block for robust autonomy.”
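A simple way to picture such an uncertainty gate, under our own illustrative assumption (not the team’s stated method) that novelty can be scored as distance from the training data:

```python
# Hypothetical sketch of uncertainty-aware autonomy: the robot scores
# how far the current situation is from anything seen in training and
# defers to the human surgeon when that novelty score is too high.

import math

def uncertainty(observation, training_observations):
    """Novelty score: distance to the nearest situation seen in training."""
    return min(math.dist(observation, o) for o in training_observations)

def decide(observation, training_observations, propose_action, threshold=1.0):
    """Act autonomously when confident; otherwise hand control to the surgeon."""
    if uncertainty(observation, training_observations) > threshold:
        return ("defer_to_surgeon", None)
    return ("autonomous", propose_action(observation))
```

The handoff branch is what makes the behavior robust: rather than guessing in a situation it has never seen, the system asks the human to take over, exactly the building block Kuntz describes.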

Webster said one of the related progress milestones will involve having the robot make situationally aware statements like, “I think I should cut here, with the goal of removing this volume of tissue.” A human surgeon would then confirm or adjust the action. These interactions will be aggregated to continue to improve robotic performance to the point of fully independent autonomy.

Within three years, the research team hopes to demonstrate a robotic surgical device capable of removing tumors from the trachea and prostate without the intervention of a surgeon. The team also foresees this research applying to uterine fibroids, bladder tumors, spine procedures, and brain cysts, among other clinical applications. As described in the ARPA-H proposal, these would initially be demonstrated in simulated conditions and not on live patients.

“Creating a system that can learn from human surgeons — and continue to improve performance — will be a game changer,” Herrell said. “Our vision is not to replace surgeons, but to vastly expand the work they do to improve patients’ lives and long-term health outcomes.”