
A Spatiotemporal Agent for Robust Multimodal Registration

  • Ziwei Luo
  • Xin Wang
  • Xi Wu
  • Youbing Yin
  • Kunlin Cao
  • Qi Song
  • Jing Hu

  • Chengdu University of Information Technology
  • CuraCloud Corporation

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

Multimodal image registration is a crucial step in a variety of medical applications, as it combines complementary information from multiple data sources. Conventional registration methods aim to find a suitable similarity metric and a descriptive image feature, which is challenging due to the high diversity of tissue appearance across modalities. In this paper, we present a novel approach that automatically registers images via an asynchronously trained reinforcement learning agent. Within this approach, convolutional gated recurrent units (ConvGRU) are incorporated after stacked convolutional layers to extract both spatial and temporal features from neighboring frames and to implicitly learn the similarity metric. Moreover, we propose a customized reward function driven by fixed points error (FPE) to guide the agent in the correct registration direction. A Monte Carlo rollout strategy is also leveraged to perform look-ahead inference and eliminate jitter at the test stage. Evaluation is performed on paired CT and MR images from patients diagnosed with nasopharyngeal carcinoma. The results demonstrate that our method achieves state-of-the-art performance in medical image registration.
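To illustrate the idea of an FPE-driven reward combined with a Monte Carlo look-ahead at test time, the following is a minimal sketch in a toy 2-D translation-only setting. All names (`fixed_point_error`, `rollout_value`, `select_action`) and the landmark-based FPE definition are illustrative assumptions, not the paper's implementation: the actual agent operates on CT/MR volumes with a ConvGRU policy, whereas here the "state" is just a translation and rollouts sample random future actions to score each candidate move by the best (lowest) FPE it can reach.

```python
import random

def fixed_point_error(transform, fixed_pts, moving_pts):
    """Mean distance between fixed landmarks and translated moving landmarks.
    Hypothetical stand-in for the paper's FPE, restricted to 2-D translation."""
    total = 0.0
    for (fx, fy), (mx, my) in zip(fixed_pts, moving_pts):
        tx, ty = mx + transform[0], my + transform[1]
        total += ((fx - tx) ** 2 + (fy - ty) ** 2) ** 0.5
    return total / len(fixed_pts)

# Unit translations along x / y; the real agent's action space is richer.
ACTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def rollout_value(transform, fixed_pts, moving_pts, depth, rng):
    """Simulate `depth` random future steps; return the best FPE encountered."""
    best = fixed_point_error(transform, fixed_pts, moving_pts)
    t = transform
    for _ in range(depth):
        dx, dy = rng.choice(ACTIONS)
        t = (t[0] + dx, t[1] + dy)
        best = min(best, fixed_point_error(t, fixed_pts, moving_pts))
    return best

def select_action(transform, fixed_pts, moving_pts, depth=3, n_rollouts=8, seed=0):
    """Monte Carlo look-ahead: pick the action whose rollouts reach the lowest
    average FPE, instead of greedily trusting a single one-step estimate."""
    rng = random.Random(seed)
    scores = []
    for a in ACTIONS:
        t = (transform[0] + a[0], transform[1] + a[1])
        avg = sum(rollout_value(t, fixed_pts, moving_pts, depth, rng)
                  for _ in range(n_rollouts)) / n_rollouts
        scores.append((avg, a))
    return min(scores)[1]
```

Averaging over several rollouts is what suppresses jitter in this sketch: a single noisy one-step FPE estimate can flip the chosen action between steps, while the look-ahead average changes smoothly.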

Original language: English
Article number: 9075173
Pages (from-to): 75347-75358
Number of pages: 12
Journal: IEEE Access
Volume: 8
State: Published - 2020

Keywords

  • Medical image
  • actor-critic
  • convolutional GRU
  • multimodal registration
  • reinforcement learning

