
Detailed Record

Author (Chinese): 何耕維
Author (English): Ho, Keng-Wei
Title (Chinese): 具學習能力的隨機循環突波式神經網路之低功耗數位電路設計
Title (English): Low Power Digital Design of a Stochastic Recurrent Spiking Neural Network with On-Chip Learning Capability
Advisor (Chinese): 陳新
Advisor (English): Chen, Hsin
Committee members: 彭盛裕, 盧峙丞
Degree: Master's
University: National Tsing Hua University
Department: Department of Electrical Engineering
Student ID: 109061567
Publication year (ROC calendar): 111 (2022)
Graduation academic year: 111
Language: English
Pages: 44
Keywords (Chinese): 突波式神經網路; 電路設計; 隨機
Keywords (English): SNN; Digital; Stochastic; Recurrent
In recent years, machine learning has been widely applied in everyday life, for example in face recognition and autonomous driving systems. However, as neural network architectures grow more and more complex, the enormous amount of computation also raises power consumption in hardware, making it a challenge to implement low-power neural networks and learning algorithms on edge devices. Spiking neural networks are one promising research direction: they imitate the biological nervous system's use of spikes to transmit information and can achieve low-power computation with a simpler network structure and learning algorithm.
Research has found that noise aids the learning of spiking neural networks. The stochastic recurrent spiking neural network adds noise currents to its neurons and uses the mathematical theory of restricted Boltzmann machines to learn the probability distribution of the data, enabling unsupervised learning.
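To make the noise-injected neuron concrete, here is a minimal discrete-time sketch of a stochastic leaky integrate-and-fire (LIF) neuron in Python. All names and constants (`leak`, `noise_std`, `v_th`) are hypothetical illustration values, not the thesis's circuit parameters; the actual design uses a digital noise generator rather than Gaussian samples.

```python
import numpy as np

def stochastic_lif_step(v, i_syn, rng, leak=0.9, noise_std=0.1, v_th=1.0):
    """One time step of a leaky integrate-and-fire neuron with an
    additive noise current (constants are illustrative only)."""
    # leaky integration of synaptic input plus a random noise current
    v = leak * v + i_syn + noise_std * rng.standard_normal(v.shape)
    spikes = v >= v_th                 # fire when the threshold is crossed
    v = np.where(spikes, 0.0, v)       # reset membrane potential after a spike
    return v, spikes

rng = np.random.default_rng(0)
v = np.zeros(4)                        # four neurons, resting potential 0
for _ in range(100):
    v, s = stochastic_lif_step(v, i_syn=np.full(4, 0.05), rng=rng)
```

Because the noise current makes threshold crossings probabilistic, the neuron's firing behaves like a stochastic sampling unit, which is what allows the network to approximate Boltzmann-machine sampling.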
This thesis first simulates and analyzes the stochastic recurrent spiking neural network in software, and simplifies the LIF neuron model, the network architecture, and the learning algorithm. A digital chip implementing the stochastic recurrent spiking neural network with on-chip learning capability is then designed. Finally, MNIST handwritten-digit recognition and image reconstruction are trained and tested on the chip.
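For reference, the restricted-Boltzmann-machine learning rule that the abstract invokes can be sketched as a one-step contrastive-divergence (CD-1) update. This is a plain software version with hypothetical names and sizes; the thesis itself realizes a spike-based approximation of this rule in digital hardware, which this sketch does not attempt to reproduce.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, rng, lr=0.01):
    """One CD-1 weight update for a binary RBM (biases omitted for brevity).
    W: (n_visible, n_hidden) weights; v0: batch of binary data rows."""
    h0 = sigmoid(v0 @ W)                          # hidden probabilities given data
    h_samp = (rng.random(h0.shape) < h0) * 1.0    # stochastic hidden sample
    v1 = sigmoid(h_samp @ W.T)                    # one-step reconstruction
    h1 = sigmoid(v1 @ W)                          # hidden probs given reconstruction
    # positive phase (data) minus negative phase (reconstruction)
    return W + lr * (v0.T @ h0 - v1.T @ h1) / len(v0)

rng = np.random.default_rng(1)
W = rng.normal(0.0, 0.01, size=(6, 3))            # toy 6-visible, 3-hidden RBM
data = (rng.random((8, 6)) < 0.5) * 1.0           # random binary training batch
for _ in range(10):
    W = cd1_update(W, data, rng)
```

The update nudges the weights so that the model's reconstructions match the data distribution, which is the unsupervised-learning objective the network pursues.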
Chapter 1 Introduction----------------------------------1
Chapter 2 Theory Description and Literature Review------2
2.1 Integrate and Fire Neuron Model---------------------2
2.2 Spike-Timing-Dependent Plasticity-------------------3
2.3 Restricted Boltzmann Machine------------------------5
2.4 Recurrent Spiking Neural Network--------------------7
2.5 Previous Digital Implementation of RSNN-------------9
Chapter 3 Software Simulation of the Stochastic RSNN---10
3.1 Neural Network Architecture------------------------10
3.2 Algorithm Description------------------------------12
3.3 Simulation Result and Discussion-------------------15
3.3.1 Stochastic LIF Neuron----------------------------15
3.3.2 MNIST Classification-----------------------------16
3.3.3 Data Retrieval from Partial Data Input-----------19
Chapter 4 Stochastic RSNN Digital Design---------------21
4.1 System Architecture--------------------------------21
4.2 Stochastic IF Neuron Circuit-----------------------23
4.2.1 Noise Generator----------------------------------23
4.2.2 Neuron Core--------------------------------------24
4.3 Memory Storage-------------------------------------26
4.3.1 Register-----------------------------------------26
4.3.2 SRAM---------------------------------------------27
4.4 STDP Update Logic----------------------------------29
Chapter 5 Back-end Simulation--------------------------30
5.1 Chip Specification---------------------------------30
5.2 System-Level Simulation----------------------------32
5.2.1 Operation Modes----------------------------------32
5.2.2 Simulation Result and Power Analysis-------------36
5.3 Comparison with Analog Design----------------------39
5.4 Comparison with State-of-the-Art Digital Design----41
Chapter 6 Conclusion and Future Work-------------------42
References---------------------------------------------44