The 64th JSAP Spring Meeting, 2017

Presentation information

Oral presentation


[15p-501-1~19] CS.6 10.1,10.2,10.3,10.4 Code-sharing session "Emerging control-methods of magnetization and related phenomena"

Wed. Mar 15, 2017, 1:15 PM - 6:30 PM, Room 501

Seiji Mitani (NIMS), Masamitsu Hayashi (Univ. of Tokyo)

4:30 PM - 4:45 PM

[15p-501-13] An Analogue Spin-Orbit Torque Device for an Artificial Neural Network

〇(M1) William Andrew Borders, Hisanao Akima, Shunsuke Fukami, Satoshi Moriya, Shouta Kurihara, Aleksandr Kurenkov, Yoshihiko Horio, Shigeo Sato, Hideo Ohno (Tohoku University)

Keywords: Spin-Orbit Torque, Neuromorphic Computing, Antiferromagnet-Ferromagnet

Development of nonvolatile memories for computers with the von Neumann architecture has been one of the mainstream directions of spintronics research over the last few decades. Meanwhile, non-von Neumann artificial intelligence (AI) technologies have attracted great attention in the field of information processing, as they complete complex tasks, with which conventional computers struggle, at high speed and low power consumption. In this work, we demonstrate a brain-like associative memory operation using a spin-orbit torque (SOT) induced switching device as an artificial synapse. The device comprises an antiferromagnet (AFM)-ferromagnet (FM) stack structure, which has been found to show analogue-like resistance switching [1]. The SOT device is composed of an AFM PtMn layer and an FM Co/Ni multilayer. The PtMn layer plays two roles: it acts as a spin source applying SOT to the FM above it, and it provides an exchange-bias-induced in-plane field that enables field-free switching [1]. An array of 36 fabricated devices is then implemented as synapses into a demonstration system that associates several 3x3 block patterns through learning. The system determines a synaptic weight matrix that describes the weight relating one block to the other blocks, then produces a "recalled" vector from the synaptic weight matrix and a "key" vector sent from the simulation. If the "recalled" and "memorized" vectors differ, an iterative learning process [2] is conducted. To test the system's learning ability when one block in a pattern is 'flipped', the direction cosine of each test, i.e., the agreement between the recalled and memorized vectors (1 being complete agreement), is determined. Over 100 tests, the neural network 'recovered' from a direction cosine of 0.601 before learning to a value of 0.852 after learning, demonstrating the SOT device's capability, as a synapse, to learn patterns for associative memory [3].
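The recall-and-compare loop described above can be sketched in software. The snippet below is a minimal, hypothetical illustration only: it substitutes a simple Hopfield-style outer-product rule for the iterative learning procedure of Ref. [2], and the pattern shapes, function names, and the hard recall threshold are assumptions for illustration, not the authors' implementation (in the actual experiment the weights are analogue SOT device resistances).

```python
import numpy as np

def train_weights(patterns):
    """Build a synaptic weight matrix from +/-1 patterns.

    Illustrative stand-in (Hopfield outer-product rule) for the
    iterative learning process of Ref. [2].
    """
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / len(patterns)

def recall(w, key):
    """Produce a 'recalled' vector from the weight matrix and a 'key' vector."""
    return np.sign(w @ key)

def direction_cosine(a, b):
    """Agreement between recalled and memorized vectors (1 = complete agreement)."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# One 3x3 block pattern, stored as a flattened +/-1 vector.
memorized = np.array([1, -1, 1, -1, 1, -1, 1, -1, 1])
w = train_weights(memorized[None, :])

# 'Flip' one block in the key pattern, then test recall.
key = memorized.copy()
key[0] *= -1
recalled = recall(w, key)
print(direction_cosine(recalled, memorized))  # 1.0: the flipped block is recovered
```

Here the direction cosine returns to 1.0 because the single stored pattern is recovered exactly; in the experiment, averaging this metric over 100 tests with analogue device weights gave the reported improvement from 0.601 to 0.852.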
A portion of this work was supported by R&D Project for ICT Key Technology of MEXT, and JSPS KAKENHI 15K18044.
[1] S. Fukami et al., Nature Mater., 15, 535 (2016).
[2] D. H. Ackley, G. E. Hinton, and T. J. Sejnowski, Cognitive Sci., 9, 147 (1985).
[3] W. A. Borders et al., Appl. Phys. Express 10, 013007 (2017).