Liquid State Machine on Loihi: Memory Metric for Performance Prediction
Journal
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN
0302-9743
Date Issued
2022-01-01
Author(s)
Patel, Rajat
Saraswat, Vivek
Ganguly, Udayan
Abstract
The Liquid State Machine (LSM) is a spiking variant of recurrent neural networks, with promising results in the classification of speech, video, and other temporal datasets. An LSM employs a network of fixed, randomly connected neurons called a reservoir. Selecting parameters that yield the best-performing reservoir is difficult given the vast parameter space. A memory metric extracted from a state-space approximation of the LSM has previously been proposed and empirically shown to be best-in-class for performance prediction; however, the working principle of this memory metric has not been studied. We first show the equivalence of LSMs simulated in MATLAB to those run on Intel's neuromorphic chip Loihi. This enables an in-depth statistical analysis of the memory metric on Loihi, covering the effect of weight scaling and of the time-averaging window. Analysis of state-space matrices generated with a reasonably sized averaging window reveals that the diagonal elements are sufficient to capture the network dynamics. This strengthens the relevance of the memory metric based on the first-order decay constant, which correlates well with classification performance.
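For illustration only (not the authors' implementation), the Python sketch below shows one way a memory metric of this kind could be computed under the assumption of a linear first-order state-space fit: reservoir state trajectories are regressed to x[t+1] ≈ A x[t], and the diagonal entries a_ii are converted to decay time constants tau_i = -dt / ln(a_ii), whose average serves as the metric. All names and the time-step parameter here are hypothetical.

    import numpy as np

    def memory_metric(states, dt=1.0):
        """Hypothetical sketch: estimate a first-order-decay memory metric.

        states : array of shape (T, N) -- reservoir activity, time-averaged
                 over some window, for T time steps and N neurons.
        dt     : duration of one time step (assumed units).
        """
        X_now, X_next = states[:-1], states[1:]
        # Least-squares fit of a linear state-space approximation x[t+1] ~ A x[t].
        W, *_ = np.linalg.lstsq(X_now, X_next, rcond=None)
        A = W.T                      # so that x[t+1] ~ A @ x[t]
        diag = np.diag(A)
        # Keep only diagonal entries corresponding to stable, decaying modes.
        stable = diag[(diag > 0) & (diag < 1)]
        # First-order decay constants tau_i = -dt / ln(a_ii); their mean is the metric.
        taus = -dt / np.log(stable)
        return taus.mean() if taus.size else 0.0

    # Example usage with random placeholder data standing in for reservoir traces:
    # rng = np.random.default_rng(0)
    # print(memory_metric(rng.random((200, 64)), dt=1.0))

In this sketch, only the diagonal of the fitted matrix is used, mirroring the abstract's observation that the diagonal elements suffice to capture the network dynamics.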
Volume
13531 LNCS
Subjects