Stochasticity invariance control in Pr1−xCaxMnO3 RRAM to enable large-scale stochastic recurrent neural networks
Journal
Neuromorphic Computing and Engineering
Date Issued
2022-03-01
Author(s)
Saraswat, Vivek
Ganguly, Udayan
Abstract
Emerging non-volatile memories have been proposed for a wide range of applications, from easing the von Neumann bottleneck to neuromorphic applications. Specifically, scalable RRAMs based on Pr1−xCaxMnO3 (PCMO) exhibit analog switching and have been demonstrated as an integrating neuron, an analog synapse, and a voltage-controlled oscillator. More recently, the inherent stochasticity of memristors has been proposed for efficient hardware implementations of Boltzmann machines. However, as the problem size scales, the number of neurons increases, and the stochastic distribution must be controlled tightly over many iterations. This requires parametric control over stochasticity. Here, we characterize the stochastic Set in PCMO RRAMs and identify that the Set time distribution depends on the internal state of the device (i.e., resistance) in addition to the external input (i.e., voltage pulse). This demands the confluence of seemingly contradictory properties, stochastic switching and deterministic state control, in the same device. Unlike ‘stochastic-everywhere’ filamentary memristors, in PCMO RRAMs we leverage (i) the stochastic Set in negative polarity and (ii) the deterministic analog Reset in positive polarity to demonstrate a 100× reduction in Set time distribution drift. The impact on Boltzmann machine performance is analyzed: compared with ‘fixed external input stochasticity’, ‘state-monitored stochasticity’ can solve problems 20× larger in size. State monitoring also tunes out the effect of device-to-device variability on the distributions, providing 10× better performance. In addition to the physical insights, this study establishes the reliable use of experimental stochasticity in PCMO RRAMs in stochastic recurrent neural networks over many iterations.
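The sketch below is not the authors' code; it is a minimal toy model, assuming a simple additive drift of the device's internal state and a sigmoidal switching probability, to illustrate why ‘fixed external input stochasticity’ drifts over many sampling events while ‘state-monitored stochasticity’ (a deterministic Reset after each stochastic Set) keeps the distribution invariant. The class name, drift rate, and reset behaviour are illustrative assumptions.

```python
# Toy comparison of fixed-input vs state-monitored device stochasticity.
import numpy as np

rng = np.random.default_rng(0)

class StochasticDevice:
    """Toy RRAM-like element: switching probability depends on an
    internal state (resistance deviation) as well as the applied input."""
    def __init__(self, drift=0.02):
        self.state = 0.0      # deviation of resistance from nominal
        self.drift = drift    # assumed per-event drift of the internal state

    def sample(self, u, monitored=False):
        # Intended probability is sigmoid(u); the internal state shifts it.
        p = 1.0 / (1.0 + np.exp(-(u + self.state)))
        bit = rng.random() < p
        self.state += self.drift   # each stochastic Set perturbs the state
        if monitored:
            self.state = 0.0       # deterministic Reset restores the state
        return bit

def estimate_prob(u, monitored, n=5000):
    dev = StochasticDevice()
    return sum(dev.sample(u, monitored) for _ in range(n)) / n

print("target p          :", 0.5)                                  # sigmoid(0)
print("fixed-input p     :", estimate_prob(0.0, monitored=False))  # drifts high
print("state-monitored p :", estimate_prob(0.0, monitored=True))   # stays near 0.5
```

In a Boltzmann-machine-style sampler each neuron would draw such a bit every Gibbs update, so a drifting per-device distribution compounds over iterations, which is why the state-monitored scheme scales to larger problems in the paper's analysis.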
Subjects