Evaluation of a touch-based image guidance system for robotic partial nephrectomy

Journal of Urology (2023)

Abstract
MP58-06. Shaan Setia, Piper Canon, James Ferguson, Nicholas Kavoussi, Robert Webster, and Duke Herrell. Journal of Urology, 1 Apr 2023. https://doi.org/10.1097/JU.0000000000003311.06

INTRODUCTION AND OBJECTIVE: Image guidance during robotic partial nephrectomy (rPN) employs a 3D model of renal anatomy to allow identification of sub-surface structures. Most image-guided surgery (IGS) systems rely on manual registration, which is inaccurate and requires a separate individual to align the model intraoperatively. We sought to evaluate an automated, touch-based 3D IGS system and compare its accuracy to that of expert surgeons.

METHODS: We identified patients with renal masses scheduled for rPN. Preoperative CT scans were used to generate patient-specific virtual 3D kidney models. Our IGS system was deployed during 6 rPNs by two primary surgeons. Registration was achieved by aligning a touch-based point cloud, generated by tracing the kidney surface with the robotic tool tip, to the models. The primary surgeon used the tool tip to estimate the locations of the renal artery, renal vein, tumor centroid, and the intersection of the mass with the parenchyma. After registration, a secondary surgeon (blinded to preoperative imaging) used the tool tip and the IGS system to record the locations of these structures. The ground-truth intraoperative locations of the target structures were recorded after dissection was complete. Target registration errors (TRE) with and without IGS were compared.

RESULTS: Computed registration was performed successfully in all 6 patients (mean axial diameter 3.0 cm, mean nephrometry score 6.5; Figure 1). After surface tracing, registration took 100 seconds on average. Median overall TRE for the secondary surgeon using the IGS versus the primary surgeon was 9.8 vs 8.6 mm, respectively (p=0.25). Localization TRE was lower for the intersection of the lesion with the kidney (3.7 vs 2.2 mm, p=0.43) than for the renal artery (14.1 vs 18.7 mm, p=0.61) and renal vein (14.4 vs 9.6 mm, p=0.06) (Figure 2).

CONCLUSIONS: We report an automated, touch-based IGS system for rPN. The system is as accurate as an expert surgeon in identifying target structures, although registration accuracy for rigid structures was better than for hilar vessels. Further improvements are needed to accurately model tissue deformation, especially for hilar anatomy.
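The abstract does not specify the algorithm used to align the tool-tip-traced point cloud to the preoperative model. As a minimal sketch of how such a surface-trace registration is commonly performed, the Python snippet below runs a rigid iterative closest point (ICP) alignment; the choice of ICP and every function and variable name here are illustrative assumptions, not a description of the authors' implementation.

```python
# Hypothetical sketch of touch-based rigid registration (not the authors' code).
# Aligns a sparse point cloud traced with the robotic tool tip (robot frame)
# to surface points sampled from the preoperative 3D kidney model (CT frame).
import numpy as np
from scipy.spatial import cKDTree


def kabsch(src, dst):
    """Best-fit rigid transform (R, t) mapping src points onto dst points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp_register(trace_points, model_points, iters=50, tol=1e-6):
    """Rigid ICP: align tool-tip trace (N x 3) to model surface points (M x 3)."""
    tree = cKDTree(model_points)
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        moved = trace_points @ R.T + t
        dists, idx = tree.query(moved)            # closest model point per trace point
        R, t = kabsch(trace_points, model_points[idx])
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, t, prev_err


# Toy usage with synthetic stand-in data (no patient data involved).
rng = np.random.default_rng(0)
model = rng.normal(size=(2000, 3))                # stand-in for kidney model vertices
trace = model[:200] - np.array([5.0, -2.0, 1.0])  # simulated, translated surface trace
R, t, err = icp_register(trace, model)
```

In a clinical system the initialization, point density, and convergence criteria would matter considerably more than in this toy example, and deformable tissue (as the conclusions note for hilar anatomy) violates the rigid-transform assumption made here.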
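Target registration error is the Euclidean distance between a localized point and its ground-truth intraoperative position. The abstract does not state which statistical test produced the reported p-values; the small paired sample (6 cases) would be consistent with a non-parametric paired test, so the sketch below uses a Wilcoxon signed-rank test as an illustrative assumption, with placeholder numbers rather than the study's data.

```python
# Hypothetical TRE computation and comparison (not the authors' analysis code).
import numpy as np
from scipy.stats import wilcoxon


def tre(estimated_xyz, ground_truth_xyz):
    """Target registration error: Euclidean distance per localized structure (mm)."""
    estimated_xyz = np.asarray(estimated_xyz, dtype=float)
    ground_truth_xyz = np.asarray(ground_truth_xyz, dtype=float)
    return np.linalg.norm(estimated_xyz - ground_truth_xyz, axis=-1)


# Placeholder per-case TRE values (mm) for one structure: secondary surgeon
# using the IGS vs primary surgeon localizing from preoperative imaging alone.
tre_with_igs = np.array([8.1, 11.2, 9.5, 10.3, 7.9, 12.0])
tre_without_igs = np.array([7.4, 9.9, 8.2, 9.1, 8.8, 10.5])

stat, p = wilcoxon(tre_with_igs, tre_without_igs)  # paired, non-parametric comparison
print(f"median TRE with IGS: {np.median(tre_with_igs):.1f} mm, "
      f"without IGS: {np.median(tre_without_igs):.1f} mm, p = {p:.2f}")
```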
Source of Funding: NIH R01-EB023717. Software interface support from Intuitive.
Journal of Urology, Volume 209, Issue Supplement 4, April 2023, Page e797.
© 2023 by American Urological Association Education and Research, Inc.
Keywords
partial nephrectomy, image guidance system, touch-based