Hardware-Friendly Synaptic Orders and Timescales in Liquid State Machines for Speech Classification

Vivek Saraswat, Ajinkya Gorad, Anand Naik, Aakash Patil, Udayan Ganguly*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Liquid State Machines (LSMs) are brain-inspired spiking neural networks (SNNs) with random reservoir connectivity and bio-mimetic neuronal and synaptic models. Reservoir computing networks are proposed as an alternative to deep neural networks for solving temporal classification problems. Previous studies suggest that a 2nd-order (double-exponential) synaptic waveform is crucial for achieving high accuracy on TI-46 spoken digit recognition. However, long-time-range (ms) bio-mimetic synaptic waveforms pose a challenge for compact and power-efficient neuromorphic hardware. In this work, we analyze the role of synaptic orders, namely delta (high output for a single time step), 0th order (rectangular with a finite pulse width), 1st order (exponential fall) and 2nd order (exponential rise and fall), and of synaptic timescales on the reservoir output response and on the TI-46 spoken digit classification accuracy under a more comprehensive parameter sweep. We find that the optimal operating point correlates with an optimal range of spiking activity in the reservoir. Further, the proposed 0th-order synapses perform on par with the biologically plausible 2nd-order synapses. This is a substantial relaxation for circuit designers, as synapses are the most abundant components in an in-memory implementation of SNNs. The circuit benefits of both analog and mixed-signal realizations of the 0th-order synapse are highlighted, demonstrating 2-3 orders of magnitude savings in area and power consumption by eliminating Op-Amp and Digital-to-Analog Converter circuits. This has major implications for a complete neural network implementation, with focus on peripheral limitations and algorithmic simplifications to overcome them.
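The four synaptic orders compared in the abstract can be illustrated as post-synaptic current kernels for a spike arriving at t = 0. The sketch below is a minimal NumPy illustration, not the paper's implementation; the timescale values (tau_s, tau_f, pulse_width) are placeholder assumptions, not the parameters used in the study.

```python
import numpy as np

def synaptic_kernels(t, tau_s=8.0, tau_f=2.0, pulse_width=4.0):
    """Illustrative synaptic kernels (spike at t = 0, times in ms).

    tau_s, tau_f, pulse_width are assumed example values, not the
    paper's swept parameters.
    """
    # delta: high output for a single time step only
    delta = np.where(t == 0.0, 1.0, 0.0)
    # 0th order: rectangular pulse with a finite width
    zeroth = np.where((t >= 0.0) & (t < pulse_width), 1.0, 0.0)
    # 1st order: instantaneous rise, exponential fall
    first = np.where(t >= 0.0, np.exp(-t / tau_s), 0.0)
    # 2nd order: exponential rise (tau_f) and fall (tau_s),
    # i.e. a double-exponential (difference of exponentials)
    second = np.where(t >= 0.0, np.exp(-t / tau_s) - np.exp(-t / tau_f), 0.0)
    second = second / second.max()  # normalize peak to 1 for comparison
    return delta, zeroth, first, second

t = np.arange(0.0, 40.0, 1.0)  # 1 ms time grid
d, z, f, s = synaptic_kernels(t)
```

The hardware argument in the abstract follows from these shapes: the delta and 0th-order kernels need only a switch and a timer, while the 1st- and 2nd-order kernels require analog decay dynamics, which is where the Op-Amp and DAC overhead arises.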

Original language: English
Title of host publication: 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)
Publisher: IEEE
Number of pages: 8
ISBN (Electronic): 978-1-6654-3900-8
DOIs
Publication status: Published - 20 Dec 2021
MoE publication type: A4 Conference publication
Event: International Joint Conference on Neural Networks - Virtual, Online
Duration: 18 Jul 2021 - 22 Jul 2021

Publication series

Name: IEEE International Joint Conference on Neural Networks (IJCNN)
Publisher: IEEE
ISSN (Print): 2161-4393

Conference

Conference: International Joint Conference on Neural Networks
Abbreviated title: IJCNN
City: Virtual, Online
Period: 18/07/2021 - 22/07/2021

Keywords

  • LSM
  • reservoir
  • speech classification
  • SNNs
  • synapse
  • order
  • timescale
  • ON-CHIP
  • DYNAMICS
  • MEMORY
