Bioinspired multisensory neural network with crossmodal integration and recognition

Hongwei Tan*, Yifan Zhou, Quanzheng Tao, Johanna Rosen, Sebastiaan van Dijken

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

110 Citations (Scopus)
126 Downloads (Pure)


The integration and interaction of vision, touch, hearing, smell, and taste in the human multisensory neural network enable high-level cognitive functions, such as crossmodal integration, recognition, and imagination, for accurate evaluation and comprehensive understanding of the multimodal world. Here, we report a bioinspired multisensory neural network that integrates artificial optic, afferent, and auditory sensory nerves with simulated olfactory and gustatory ones. With distributed sensors and a biomimetic hierarchical architecture, our system can not only sense, process, and memorize multimodal information, but also fuse multisensory data at both the hardware and software levels. Through crossmodal learning, the system can crossmodally recognize and imagine multimodal information, for example visualizing alphabet letters from handwritten input, recognizing combined visual/smell/taste information, or imagining a never-seen picture when hearing its description. Our multisensory neural network provides a promising approach towards robotic sensing and perception.
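The crossmodal recognition and imagination described above can be illustrated in software by a toy heteroassociative memory. The sketch below is an assumption-laden analogy, not the paper's hardware implementation: Hebbian outer-product learning links random binary "visual" patterns to paired "auditory" patterns, so that a corrupted cue in one modality retrieves its counterpart in the other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for two modalities and the number of stored pairs.
n_vis, n_aud, n_pairs = 32, 24, 3
visual = rng.choice([-1, 1], size=(n_pairs, n_vis))   # "visual" patterns
audio = rng.choice([-1, 1], size=(n_pairs, n_aud))    # paired "auditory" patterns

# Hebbian crossmodal weights: accumulate outer products of paired patterns.
W = sum(np.outer(a, v) for v, a in zip(visual, audio))

# "Crossmodal imagination": corrupt a visual cue, then recall its audio partner.
cue = visual[0].copy()
flip = rng.choice(n_vis, size=4, replace=False)
cue[flip] *= -1                       # flip 4 of the 32 visual bits
recalled = np.sign(W @ cue)           # thresholded recall in the audio modality

# Fraction of correctly recalled audio bits (close to 1.0 for few stored pairs).
print((recalled == audio[0]).mean())
```

With only a few stored pairs relative to the pattern dimension, the crosstalk between pairs is small and recall is near-perfect, which mirrors how one sensory modality can cue a memory formed in another.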

Original language: English
Article number: 1120
Number of pages: 9
Journal: Nature Communications
Issue number: 1
Publication status: Published - 18 Feb 2021
MoE publication type: A1 Journal article-refereed
