Title:
Visual-tactile perception for a biomimetic robot in constrained environments.
Authors:
Source:
Bioinspiration & biomimetics [Bioinspir Biomim] 2026 Jan 02; Vol. 21 (1). Date of Electronic Publication: 2026 Jan 02.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Institute of Physics Publishing Country of Publication: England NLM ID: 101292902 Publication Model: Electronic Cited Medium: Internet ISSN: 1748-3190 (Electronic) Linking ISSN: 17483182 NLM ISO Abbreviation: Bioinspir Biomim Subsets: MEDLINE
Imprint Name(s):
Original Publication: Bristol, UK : Institute of Physics Publishing, 2006-
Contributed Indexing:
Keywords: bio-inspired perception; biomimetics; robotics
Entry Date(s):
Date Created: 20251120 Date Completed: 20260102 Latest Revision: 20260102
Update Code:
20260130
DOI:
10.1088/1748-3190/ae224d
PMID:
41265038
Database:
MEDLINE

Abstract:
Environmental perception is a crucial foundation for enhancing the application potential of biomimetic robots. Motivated by the complementary roles of visual and tactile sensing observed in rats, this work proposes a visual-tactile perception method for a small-scale bio-inspired robotic rat. The method leverages binocular vision to estimate depth images through an attention-based network and improves perception and localization accuracy by 14.22% via a dynamic object removal module. Moreover, a whisker sensor enhances the robot's ability to identify object contours and environmental boundaries in narrow spaces, with goodness of fit for obstacle contour and environment boundary reconstruction exceeding 97.00% and 93.87%, respectively. By integrating these individual perception methods, we achieve the fusion of visual and tactile sensing for complex environment perception. To the best of our knowledge, this is the first study to implement visual-tactile fusion perception on a miniature biomimetic robot through physical experiments. Experiments demonstrate that our method performs well on the robotic rat, reducing localization errors in narrow and dim scenes by an average of 29.14% compared with existing state-of-the-art methods.
(© 2026 IOP Publishing Ltd. All rights, including for text and data mining, AI training, and similar technologies, are reserved.)