Presentation
4 October 2022

Implementation of a binary neural network on a passive array of magnetic tunnel junctions (Conference Presentation)
Abstract
Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, nonvolatility, and scalability. In a hardware realization, however, device variations, write errors, and parasitic resistance will generally degrade performance. To quantify such effects, we perform experiments on a 2-layer perceptron constructed from a 15 × 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite these imperfections, we achieve classification accuracy of up to 95.3 % with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed-signal hardware.
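The error sources the abstract names can be illustrated with a toy simulation. The sketch below is not the authors' experimental setup: the conductance values, variation level, write-error rate, layer sizes, and bias-subtraction readout scheme are all assumptions chosen for illustration. It shows, in principle, how binary weights (±1) map onto two MTJ conductance states, how write errors and device-to-device variation perturb the programmed array, and how a crossbar performs the multiply-accumulate of a 2-layer binary perceptron via Ohm's law.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical device parameters (assumed, not from the paper):
G_LOW, G_HIGH = 1.0, 2.0   # two MTJ conductance states, arbitrary units
VARIATION = 0.05           # 5 % device-to-device variation (assumed)
P_WRITE_ERROR = 0.01       # 1 % chance a write lands in the wrong state (assumed)

def program_array(weights_binary):
    """Map ideal binary weights (+1/-1) onto a noisy conductance array."""
    target = np.where(weights_binary > 0, G_HIGH, G_LOW)
    # Write errors: some cells end up in the opposite state.
    flips = rng.random(target.shape) < P_WRITE_ERROR
    target = np.where(flips, G_LOW + G_HIGH - target, target)
    # Multiplicative device-to-device variation around the programmed state.
    return target * (1.0 + VARIATION * rng.standard_normal(target.shape))

def crossbar_mac(G, x):
    """Ohm's-law multiply-accumulate: input voltages x drive rows,
    currents sum along each column of conductance matrix G."""
    return x @ G

# Toy 2-layer binary perceptron with a 15 x 15 first layer,
# echoing the 15 x 15 array size in the abstract (output width assumed).
W1 = rng.choice([-1, 1], size=(15, 15))
W2 = rng.choice([-1, 1], size=(15, 4))
G1, G2 = program_array(W1), program_array(W2)

x = rng.random(15)
# Subtracting the midpoint current recovers the signed sum, since
# G - (G_LOW + G_HIGH)/2 = +/-(G_HIGH - G_LOW)/2 for ideal devices.
mid = (G_LOW + G_HIGH) / 2
h = np.sign(crossbar_mac(G1, x) - x.sum() * mid)   # binarized hidden layer
y = crossbar_mac(G2, h) - h.sum() * mid            # output currents
```

In this simplified picture, accuracy degrades as `VARIATION` and `P_WRITE_ERROR` grow; sweeping them against classification accuracy is one way to reproduce the kind of robustness study the abstract describes. (A physical passive array would also add parasitic line resistance and require differential encoding for negative hidden-layer inputs, both omitted here.)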
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jonathan M. Goodwill, Nitin Prasad, Brian D. Hoskins, Matthew W. Daniels, Advait Madhavan, Lei Wan, Tiffany S. Santos, Michael Tran, Jordan A. Katine, Patrick M. Braganca, Mark D. Stiles, and Jabez McClelland, "Implementation of a binary neural network on a passive array of magnetic tunnel junctions (Conference Presentation)", Proc. SPIE PC12205, Spintronics XV, PC122050L (4 October 2022); https://doi.org/10.1117/12.2632314
KEYWORDS
Neural networks, Binary data, Magnetism, Signal processing, Instrument modeling, Performance modeling, Resistance