
Power-Efficient Deep Neural Networks with Noisy Memristor Implementation

Abstract: This paper considers Deep Neural Network (DNN) linear-nonlinear computations implemented on memristor crossbar substrates. To address the case where true memristor conductance values may differ from their target values, it introduces a theoretical framework that characterizes the effect of conductance value variations on the final inference computation. With only second-order moment assumptions, theoretical results on tracking the mean, variance, and covariance of the layer-by-layer noisy computations are given. By allowing the possibility of amplifying certain signals within the DNN, power consumption is characterized and then optimized via KKT conditions. Simulation results verify the accuracy of the proposed analysis and demonstrate the significant power efficiency gains that are possible via optimization for a target mean squared error.
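As a rough illustration of the setting described in the abstract, the Python sketch below simulates a single linear-nonlinear layer on a noisy memristor crossbar. It is not the paper's model: the additive Gaussian conductance noise, the ReLU nonlinearity, the layer sizes, and the function name noisy_crossbar_layer are all assumptions made for the example. It only shows, empirically, how amplifying the programmed conductances (at the cost of higher currents, i.e. higher power) reduces the output mean squared error.

import numpy as np

rng = np.random.default_rng(0)

def noisy_crossbar_layer(x, W, sigma_g, gain, rng):
    """Simulate one linear-nonlinear DNN layer on a memristor crossbar.

    The target weights W are programmed as conductances scaled by `gain`;
    each stored conductance then deviates from its target by additive
    zero-mean noise of standard deviation `sigma_g` (an assumed Gaussian
    model, not the paper's). Dividing the crossbar output by `gain`
    restores the weight scale, so a larger gain shrinks the relative
    error at the cost of larger currents, i.e. more power.
    """
    G = gain * W + sigma_g * rng.standard_normal(W.shape)  # noisy programmed conductances
    pre_activation = (G @ x) / gain                        # read out and rescale
    return np.maximum(pre_activation, 0.0)                 # ReLU nonlinearity

# Toy layer: 64 inputs -> 32 outputs, unit-variance Gaussian weights and input
W = rng.standard_normal((32, 64)) / np.sqrt(64)
x = rng.standard_normal(64)
y_clean = np.maximum(W @ x, 0.0)

# Larger amplification gains lower the empirical output MSE
for gain in (1.0, 2.0, 4.0):
    errs = np.array([noisy_crossbar_layer(x, W, sigma_g=0.05, gain=gain, rng=rng) - y_clean
                     for _ in range(2000)])
    print(f"gain = {gain:.1f}   empirical output MSE = {np.mean(errs**2):.3e}")

In this toy model the output error variance falls roughly as 1/gain^2, which mirrors the power-versus-MSE trade-off that the paper optimizes via KKT conditions.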
Document type: Conference papers

https://hal-imt-atlantique.archives-ouvertes.fr/hal-03337122
Contributor: Elsa Dupraz
Submitted on: Tuesday, September 7, 2021 - 3:59:11 PM
Last modification on: Monday, October 11, 2021 - 2:24:03 PM

File

Dupraz21ITW.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03337122, version 1

Citation

Elsa Dupraz, Lav Varshney, François Leduc-Primeau. Power-Efficient Deep Neural Networks with Noisy Memristor Implementation. Information Theory Workshop (ITW) 2021, Oct 2021, Kanazawa, Japan. ⟨hal-03337122⟩
