The role of artificial intelligence in the operating room continues to expand at a remarkable pace. Following its application in laparoscopic living-donor liver transplantation, AI is now being used to provide real-time guidance on safe resection margins during robotic breast cancer surgery — with back-to-back research publications marking this rapid progression.

A research team from Samsung Medical Center (SMC) — comprising breast surgeons Prof. Jai Min Ryu and Prof. Woong Ki Park, and transplant surgeons Prof. Jin Soo Rhu and Prof. Nam Kee Oh — has developed an AI-powered intraoperative navigation system capable of delineating safe resection planes in real time during robotic nipple-sparing mastectomy. The system's clinical applicability was confirmed through multicenter external validation. The findings were published in a recent issue of the European Journal of Surgical Oncology (official journal of the European Society of Surgical Oncology; IF 2.9).

Robotic nipple-sparing mastectomy involves making a small axillary incision through which robotic arms are introduced to remove breast parenchyma while preserving the nipple-areola complex and overlying skin. The approach offers high patient satisfaction due to minimal visible scarring. However, robotic surgery presents an inherent limitation: the absence of haptic feedback. In open surgery, surgeons rely on tactile sensation to identify tissue boundaries. In the robotic setting, the surgeon must depend entirely on the visual field displayed on screen.

Precisely identifying the boundary between the subcutaneous fat layer and glandular breast tissue is particularly challenging. Dissection that is too superficial may leave residual breast tissue behind; dissection that is too deep risks disrupting the cutaneous blood supply, potentially leading to skin necrosis.

The research team addressed this challenge using AI-based image analysis. The system analyzes intraoperative video footage in real time, overlaying the boundary between the fat and glandular layers — the safe resection plane — directly onto the surgical display. Much like a GPS navigation system guiding a driver along a route, the AI provides the surgeon with a continuous visual reference for the resection margin.
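Conceptually, the overlay works like alpha-blending a predicted boundary mask onto each video frame. The sketch below is illustrative only, not the team's implementation; the function name, green color choice, and blend weight are assumptions:

```python
import numpy as np

def overlay_plane(frame: np.ndarray, mask: np.ndarray,
                  color=(0, 255, 0), alpha=0.4) -> np.ndarray:
    """Blend a colored highlight onto the pixels the model marks
    as the safe resection plane.

    frame: H x W x 3 uint8 RGB video frame
    mask:  H x W boolean array (True = predicted resection plane)
    """
    out = frame.astype(np.float32)
    tint = np.array(color, dtype=np.float32)
    # Weighted average of the original pixel and the highlight color,
    # applied only where the mask is True.
    out[mask] = (1 - alpha) * out[mask] + alpha * tint
    return out.astype(np.uint8)
```

In a real system this blend would run on every frame of the live robotic video feed, so the highlighted plane tracks the tissue as the camera and instruments move.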

To train the AI model, 1,996 frames were extracted from 29 robotic mastectomy videos performed at SMC. Breast surgery specialists manually annotated the safe resection plane in each frame; the AI then learned from this labeled dataset to automatically identify resection margins in live surgical footage.

Internal validation using SMC data yielded a Dice Similarity Coefficient (DSC) of 74.0%. External validation using 8 surgical videos from Samsung Changwon Hospital produced a comparable DSC of 70.8%, confirming that the system performs reliably across different institutions and surgeons.
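The DSC measures the overlap between the AI's predicted resection plane and the surgeon's annotation, ranging from 0 (no overlap) to 1 (perfect agreement). A minimal sketch of the standard formula, with illustrative variable names not taken from the study's code:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2 * |pred ∩ truth| / (|pred| + |truth|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Example: two partially overlapping masks share 2 of their
# 3 positive pixels each, giving DSC = 2*2 / (3+3) ≈ 0.667.
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
print(round(dice_coefficient(pred, truth), 3))  # → 0.667
```

On this scale, the reported 74.0% internal and 70.8% external scores mean roughly three-quarters of the predicted and annotated plane regions overlap, with only a modest drop when moving to a new institution.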

This same research group previously developed an AI navigation system for laparoscopic living-donor hepatectomy, publishing their results in Scientific Reports. In that study, 48 surgical videos from three institutions — Samsung Medical Center, Myongji Hospital, and Yeungnam University Medical Center — were analyzed, with the AI providing real-time visualization of hepatic vascular anatomy and safe dissection planes. The current study extends the reach of AI surgical navigation from the liver to the breast.

Prof. Jai Min Ryu commented: "This is the first study to develop an AI navigation system for robotic mastectomy and complete multicenter external validation. The ability of AI to guide surgeons to safe resection planes in real time has the potential to enhance both the precision and safety of the procedure."

Prof. Jin Soo Rhu added: "We have expanded the application of AI-guided navigation from liver transplantation to breast cancer surgery. We will continue this research with the goal of integrating AI into a broader range of minimally invasive procedures for the benefit of our patients."