MURPHY: Relations Matter in Surgical Workflow Analysis

Shang Zhao, Yanzhe Liu, Qiyuan Wang, Dai Sun, Rong Liu, S. Kevin Zhou
Dec 2022
Autonomous robotic surgery has advanced significantly based on the analysis of visual and temporal cues in surgical workflow, but relational cues from domain knowledge remain under-investigated. Complex relations in surgical annotations can be divided into intra- and inter-relations, both valuable to autonomous systems for comprehending surgical workflows. Intra- and inter-relations describe the relevance of various categories within a particular annotation type and the relevance of different annotation types, respectively. This paper systematically investigates the importance of relational cues in surgery.

First, we contribute the RLLS12M dataset, a large-scale collection of robotic left lateral sectionectomy (RLLS): we curate 50 videos of 50 patients operated on by 5 surgeons and annotate a hierarchical workflow consisting of 3 inter- and 6 intra-relations, 6 steps, 15 tasks, and 38 activities represented as triplets of 11 instruments, 8 actions, and 16 objects, totaling 2,113,510 video frames and 12,681,060 annotation entities.

Correspondingly, we propose a multi-relation purification hybrid network (MURPHY), which incorporates novel relation modules to augment the feature representation by purifying relational features using the intra- and inter-relations embodied in the annotations. The intra-relation module leverages an R-GCN to implant visual features into different graph relations, which are aggregated using a targeted relation purification with affinity information measuring label consistency and feature similarity. The inter-relation module is motivated by attention mechanisms to regularize the influence of relational features based on the hierarchy of annotation types from the domain knowledge. Extensive experimental results on the curated RLLS dataset confirm the effectiveness of our approach, demonstrating that relations matter in surgical workflow analysis.
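The targeted relation purification described above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names (`cosine`, `purify`) and the exact form of the affinity score are hypothetical, assuming only that per-relation features are re-weighted by an affinity combining a label-consistency score with feature similarity before aggregation:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def purify(target_feat, relation_feats, label_consistency):
    """Aggregate per-relation features into one purified feature.

    Each relation's contribution is weighted by an affinity score:
    its label-consistency value times its (non-negative) feature
    similarity to the target feature.  Hypothetical formulation.
    """
    affinities = [
        c * max(cosine(target_feat, f), 0.0)
        for f, c in zip(relation_feats, label_consistency)
    ]
    total = sum(affinities) or 1.0  # avoid division by zero
    weights = [a / total for a in affinities]
    dim = len(target_feat)
    # Affinity-weighted sum over relation-specific features.
    return [sum(w * f[i] for w, f in zip(weights, relation_feats))
            for i in range(dim)]

# A relation whose feature is orthogonal to the target (zero affinity)
# contributes nothing to the purified feature.
print(purify([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [1.0, 1.0]))
```

In a real network these weights would be computed over learned R-GCN node embeddings; the sketch only shows how affinity suppresses inconsistent relations during aggregation.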
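The inter-relation module's attention-style regularization can likewise be sketched under stated assumptions: a softmax over per-annotation-type scores (derived from the annotation hierarchy) rescales each type's relational feature. The function name `regularize` and the use of a plain softmax are illustrative choices, not the authors' method:

```python
import math

def regularize(relational_feats, hierarchy_scores):
    """Scale each annotation type's relational feature by a softmax
    weight over hierarchy-derived scores (hypothetical sketch)."""
    exps = [math.exp(s) for s in hierarchy_scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Higher-scoring annotation types retain more influence.
    return [[w * x for x in feat]
            for w, feat in zip(weights, relational_feats)]

# With equal hierarchy scores, every annotation type is scaled equally.
print(regularize([[2.0], [4.0]], [0.0, 0.0]))
```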