
Unsupervised Online Learning for Robotic Interestingness with Visual Memory

  • Chen Wang
  • Yuheng Qiu
  • Wenshan Wang
  • Yafei Hu
  • Seungchan Kim
  • Sebastian Scherer

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Autonomous robots frequently need to detect 'interesting' scenes to decide on further exploration, or to decide which data to share for cooperation. These scenarios often require fast deployment with little or no training data. Prior work considers 'interestingness' based on data from the same distribution. Instead, we propose to develop a method that automatically adapts online to the environment to report interesting scenes quickly. To address this problem, we develop a novel translation-invariant visual memory and design a three-stage architecture for long-term, short-term, and online learning, which enables the system to learn human-like experience, environmental knowledge, and online adaptation, respectively. With this system, we achieve an average of 20% higher accuracy than the state-of-the-art unsupervised methods in a subterranean tunnel environment. We show comparable performance to supervised methods in robot exploration scenarios, demonstrating the efficacy of our approach. We expect that the presented method will play an important role in robotic interestingness recognition for exploration tasks.
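The abstract does not give implementation details, but the core idea of a translation-invariant visual memory that flags novel (low-similarity) scenes as interesting can be illustrated with a minimal sketch. The class and function names below are hypothetical, and FFT-based circular cross-correlation is used here as one simple way to score a match under every spatial shift at once; this is an assumption, not the paper's actual architecture.

```python
import numpy as np

def translation_invariant_similarity(query, memory_item):
    """Score two 2-D feature maps under all circular shifts.

    Cross-correlation via the FFT evaluates every translation at once;
    taking the maximum makes the match invariant to where the pattern
    sits in the map.
    """
    corr = np.fft.ifft2(np.fft.fft2(query) * np.conj(np.fft.fft2(memory_item))).real
    norm = np.linalg.norm(query) * np.linalg.norm(memory_item) + 1e-8
    return corr.max() / norm

class VisualMemory:
    """Fixed-capacity memory: scenes with low best-match similarity
    against all stored items score as more interesting (hypothetical sketch)."""

    def __init__(self, capacity=32):
        self.capacity = capacity
        self.items = []

    def interestingness(self, feat):
        # An empty memory makes every scene maximally interesting.
        if not self.items:
            return 1.0
        return 1.0 - max(translation_invariant_similarity(feat, m)
                         for m in self.items)

    def write(self, feat):
        self.items.append(feat)
        if len(self.items) > self.capacity:
            self.items.pop(0)  # drop the oldest memory
```

Under this sketch, a stored scene re-observed at a different image offset still matches its memory entry (interestingness near 0), while a genuinely new scene scores high, which is the behavior an online interestingness detector needs.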

Original language: English
Pages (from-to): 2446-2461
Number of pages: 16
Journal: IEEE Transactions on Robotics
Volume: 38
Issue number: 4
State: Published - Aug 1 2022

Keywords

  • Online learning
  • robotic interestingness
  • unsupervised learning
  • visual memory

