Author (Chinese): 張晏誌
Author (English): Chang, Yen-Chih
Title (Chinese): 單類遞歸神經網路之穩態
Title (English): Steady States of A Class of Recurrent Neural Networks
Advisor (Chinese): 葉維彰
Advisor (English): Yeh, Wei-Chang
Committee members (Chinese): 王俊堯, 呂宗澤, 張志鴻, 林建仲
Degree: Ph.D.
University: National Tsing Hua University (國立清華大學)
Department: Industrial Engineering and Engineering Management
Student ID: 101034801
Year of publication (ROC calendar): 110 (2021)
Academic year of graduation: 109
Language: English
Number of pages: 101
Keywords (Chinese): 類神經網路、遞歸型類神經網路、離散差分方程
Keywords (English): Artificial Neural Networks, Recurrent Neural Networks, Discrete Difference Equations
Advances in computer science, in both hardware and software, have driven the application of artificial intelligence in many fields, among which artificial neural networks, built on mathematical models, are the most prominent.
Many mathematical models of neural networks exist today. Based on these models, diverse input-output data can be matched through learning and training, leading to a wide range of applications.
However, many new mathematical models remain to be explored, especially ones that can simulate neural networks with threshold control. This dissertation studies a class of recurrent neural networks (Cellular Neural Networks, CeNNs) consisting of several neurons connected in a ring. The threshold control function of these networks is a discontinuous, switch-type (bang-bang) function, which better simulates behaviors of biological neural networks such as excitation and inhibition.
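The abstract does not state the network's update rule explicitly. As a minimal sketch, a ring of neurons driven by a discontinuous bang-bang activation might be simulated as follows; the function names, the synchronous update rule, and the weight are illustrative assumptions, not the thesis's actual equations:

```python
import numpy as np

def bang_bang(x, threshold=0.0):
    """Discontinuous switch-type (bang-bang) activation: +1 above the
    threshold (excitation), -1 at or below it (inhibition)."""
    return np.where(x > threshold, 1.0, -1.0)

def ring_step(state, weight=1.0):
    """One synchronous update of a ring of neurons: each neuron reads its
    left neighbour's output through the bang-bang activation."""
    return weight * bang_bang(np.roll(state, 1))

# Iterate a small ring; after one step the state lies in {-1, +1}^5 and
# the dynamics reduce to a rotation of that pattern around the ring.
state = np.array([0.3, -1.2, 0.7, -0.4, 1.5])
for _ in range(10):
    state = ring_step(state)
print(state.tolist())
```

In this toy version every orbit is eventually periodic (the ±1 pattern simply rotates), which is the kind of steady or periodic behavior the dissertation analyzes rigorously for its CeNN model.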
Further simulations show that, although these CeNNs are only prototypes, they exhibit rich static and dynamic behaviors, and can form extremely complex dynamical systems through composition or interconnection.
Although our understanding of the complex behaviors of CeNNs is still limited, the rich dynamic and static behaviors revealed by simulation make them worth analyzing and classifying with a variety of tools.
Based on known results, we can prove that CeNNs possess steady states of many types under particular combinations of parameters. These diverse steady states can serve as pattern memories (and can further be compared with the cognitive and memory functions of some biological systems), so clarifying the theoretical relation between parameter combinations and steady states is essential.
This dissertation collects research results completed over many years, establishes the theoretical relations between several parameter combinations and the resulting steady states, and proves them rigorously with a variety of mathematical tools. We transform the steady-state solutions of the CeNNs into sequence solutions of corresponding discontinuous second-order difference equations; under this transformation, every periodic sequence solution of a difference equation corresponds to a steady-state solution of the CeNNs. With the theory and techniques developed in this work, we prove that under the parameter combinations (1,1), (0,1), and (-1,1), all sequence solutions of the difference equation are periodic. On the other hand, for a parameter combination involving an irrational number, the corresponding difference equation has both periodic and aperiodic sequence solutions; the types of periodic solutions cannot be exhaustively enumerated, and we prove that at least two families of solutions are aperiodic. In the course of these proofs, we also obtained sufficient conditions for a sequence solution of the corresponding second-order difference equation to be aperiodic. We believe that the theory and techniques developed in this dissertation provide a strong foundation for further understanding CeNNs and, subsequently, more complex networks of the same type.
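The reduction to a difference equation can be illustrated numerically. The exact second-order equation appears in the thesis body, not in this abstract, so the recurrence below, x_{k+1} = a·x_k − b·f(x_{k−1}) with a bang-bang f, and the parameter pair (a, b) = (0, 1) are assumptions chosen only to show how periodicity of a sequence solution can be detected:

```python
def f(x):
    # assumed bang-bang nonlinearity: +1 for positive input, -1 otherwise
    return 1.0 if x > 0 else -1.0

def orbit_period(a, b, x0, x1, max_steps=1000):
    """Iterate x_{k+1} = a*x_k - b*f(x_{k-1}) and return the period of the
    orbit of consecutive pairs (x_k, x_{k+1}), or None if none is found."""
    seen = {}
    pair = (x0, x1)
    for step in range(max_steps):
        if pair in seen:
            return step - seen[pair]
        seen[pair] = step
        pair = (pair[1], a * pair[1] - b * f(pair[0]))
    return None

print(orbit_period(0.0, 1.0, 1.0, 1.0))  # this start point yields period 4
```

For the irrational parameter mentioned above, floating-point iteration cannot decide aperiodicity, which is precisely where the analytical arguments of the thesis are needed.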
Advances in computer science, in hardware as well as software, have boosted the development of Artificial Neural Networks (ANNs), the mathematical models underlying many powerful Artificial Intelligence approaches. Modern ANNs such as Deep-Learning Neural Networks (DNNs) and Recurrent Neural Networks (RNNs) are built by learning and training methods. Among state-of-the-art neural networks, specific models are applied to particular problem domains, depending on the type of input data; however, none of them performs well on contextual input data. DNNs handle contextual information poorly (the meaningful relations cannot be kept or processed well during training); RNNs, on the other hand, are designed to process contextual information, yet the complexity of their structure leads to many limitations. Recently, Spiking Neural Networks (SNNs), which are more biologically realistic than other ANNs, have been studied by many researchers; however, training SNNs remains a challenge.
In this dissertation, we study a class of cellular neural networks (CeNNs), a variant of RNNs, with bang-bang control, which can associate meaningful input and output pairs and thus offers a way for ANNs to process contextual information. We use discrete bang-bang control as the activation function of our neural networks so that their excitation and inhibition behaviors can simulate biological neurons.
Numerical simulation shows rich and useful static and dynamical behaviors of our RNNs. Although many of these behaviors cannot be explained yet, we can prove that some of these neural networks have abundant steady states, so that the relations among the input and output pairs can be kept during training. The steady states of these dynamical systems are closer to how biological neural systems in the brain store memories. Hence, our neural networks may not require the sophisticated mechanisms (compared with RNNs or SNNs) applied in multi-layer training to keep contextual information, such as external storage like the Differentiable Neural Computer.
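The claim that abundant steady states can act as memories parallels the classical associative-memory picture. The following Hopfield-style sketch (a standard textbook construction, not the CeNN model of this dissertation) shows ±1 patterns stored as fixed points of a sign/bang-bang update:

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian outer-product weights whose fixed points include the
    stored +/-1 patterns (classical associative-memory construction)."""
    P = np.array(patterns, dtype=float)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)   # no self-coupling
    return W

def recall(W, probe, steps=20):
    """Iterate the bang-bang (sign) update until the state stops changing."""
    s = np.array(probe, dtype=float)
    for _ in range(steps):
        nxt = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

stored = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]
W = train_hebbian(stored)
noisy = [1, 1, 1, -1, -1, 1]   # first pattern with its last bit flipped
print(recall(W, noisy).tolist())  # recovers the first stored pattern
```

A corrupted probe relaxes to the nearest stored steady state, which is the sense in which steady states "keep" associated input-output pairs without external storage.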
Abstract i
Abstract in Chinese ii
List of Figures iii
1 Introduction 1
1.1 Artificial Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Our Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2 Steady States 11
2.1 The three-term recurrence relation . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.1.1 Translations of real sequences and subsets in R2 . . . . . . . . . . . . . . 11
2.1.2 The error and companion sequences of solutions of (1.10) . . . . . . . . . 13
2.1.3 The periodic solutions ε of (2.11) with ε ∈ {1, 2} . . . . . . . . . . . . . 16
2.2 The orbits of error sequences on conic sections . . . . . . . . . . . . . . . . . . . 18
2.2.1 The conic sections and Tracking Procedure . . . . . . . . . . . . . . . 18
2.2.2 The asymptotic behaviors of solutions of (2.11) . . . . . . . . . . . . . . 22
3 The periodic solutions of (2.1) with λ = −1, 0, 1, or (1 + √5)/2 26
3.1 The solutions of (2.1) with λ = 1 . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.2 The solutions of (2.1) with λ = 0 . . . . . . . . . . . . . . . . . . . . . . . . . . 34
3.2.1 The solutions θ = {θ_k}_{k∈Z} of (3.42) with (θ_0, θ_1) ∈ T_0 . . . . . . . 38
3.3 The solutions of (2.1) with λ = −1 . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.3.1 The solutions φ = {φ_k}_{k∈Z} of (3.42) with (φ_0, φ_1) ∈ F_1 . . . . . . . 43
3.3.2 The solutions ψ = {ψ_k}_{k∈Z} of (3.42) with (ψ_0, ψ_1) ∈ F_2 . . . . . . . 51
3.3.3 The solutions θ = {θ_k}_{k∈Z} of (3.42) with (θ_0, θ_1) ∈ F_3 . . . . . . . 61
3.3.4 The solutions θ = {θ_k}_{k∈Z} of (3.42) with (θ_0, θ_1) ∈ F_4 . . . . . . . 68
3.4 The solutions of (2.1) with λ = (1 + √5)/2 . . . . . . . . . . . . . . . . . . . . 76
3.4.1 The non-exhaustive periodic solutions of (3.159) . . . . . . . . . . . . . . 77
3.4.2 Two collections of aperiodic solutions of (3.159) . . . . . . . . . . . . . . 94
4 Conclusion 96
4.1 Summary of our works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
4.2 Future studies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
References 100