Author: 周冠廷 (Chou, Kuan-Ting)
Title: 機器學習在動力學系統之應用 (Applications of Machine Learning in Dynamical Systems)
Advisor: 王道維 (Wang, Daw-Wei)
Committee members: 張明強, 洪在明, 陳柏中
Degree: Master's
Institution: National Tsing Hua University (國立清華大學)
Department: Department of Physics
Student ID: 107022534
Year of publication: 2020 (ROC 109)
Academic year of graduation: 108
Language: English
Pages: 53
Keywords: Machine Learning, Neuronal Polarity, Drosophila, Connectome, Many-Body Physics
This thesis presents applications of machine learning to two different dynamical systems: the identification of neuronal polarity in the Drosophila brain, and the 1D Ising model with an arbitrary time-dependent external field.
In the first part, we develop a machine learning algorithm, the Node-Based Polarity Identifier of Neurons (NPIN), to identify the directions of signal flows in neuronal networks, which is one of the keys to understanding the intricate information dynamics of a living brain. The proposed model is trained on nodal information only and includes both Soma Features (which encode spatial information from a given node to the soma) and Local Features (which encode morphological information of a given node). Using a dataset of 213 projection neurons distributed in different regions of a Drosophila brain and accounting for the spatial correlations between nodal polarities, NPIN achieves high accuracy (>96.0%) in classifying neuronal polarity, even for complex neurons containing more than two dendrite/axon clusters. Finally, we apply NPIN to classify the neuronal polarity of the blowfly, a species with far less neuronal data available. Our results demonstrate that NPIN is a powerful tool for identifying the neuronal polarity of insects and for mapping out the signal flows in the brain's neuronal networks. This part is associated with the publication: Identification of Neuronal Polarity by Node-Based Machine Learning, Chen-Zhi Su, Kuan-Ting Chou, Hsuan-Pei Huang, Chung-Chuan Lo, and Daw-Wei Wang. bioRxiv: https://biorxiv.org/cgi/content/short/2020.06.20.160564v1 (submitted to Neuroinformatics)
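The idea of node-based classification from Soma Features can be illustrated with a minimal sketch. Everything below is hypothetical: the data are synthetic (axonal terminal nodes placed farther from the soma than dendritic ones, which is only a toy assumption), the single distance-to-soma feature is a drastic simplification of the thesis's feature set, and the nearest-centroid rule stands in for the actual machine learning models used in NPIN.

```python
import numpy as np

rng = np.random.default_rng(0)

def soma_features(nodes, soma):
    """Per-node feature vector: Euclidean distance to the soma plus raw displacement."""
    d = nodes - soma
    return np.column_stack([np.linalg.norm(d, axis=1), d])

# Synthetic skeleton: axonal nodes drawn farther from the soma (toy assumption).
n = 400
soma = np.zeros(3)
labels = rng.integers(0, 2, n)                      # 0 = dendrite, 1 = axon
radii = np.where(labels == 1, 60.0, 20.0) + rng.normal(0, 8, n)
dirs = rng.normal(size=(n, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
nodes = soma + radii[:, None] * dirs

X, y = soma_features(nodes, soma), labels
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]

# Nearest-centroid rule on the distance feature, a stand-in for the real classifier.
c0 = X_tr[y_tr == 0, 0].mean()                      # mean dendrite distance
c1 = X_tr[y_tr == 1, 0].mean()                      # mean axon distance
pred = (np.abs(X_te[:, 0] - c1) < np.abs(X_te[:, 0] - c0)).astype(int)
acc = (pred == y_te).mean()
```

On data this cleanly separated the toy rule classifies nearly all held-out nodes correctly; the point is only the pipeline shape (per-node features in, per-node polarity labels out), not the accuracy figure.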
In the second part, we construct a modified Recurrent Neural Network (RNN) that can predict dynamical observables in different parameter regimes. The architecture consists of an encoder that maps the initial configuration of a given system to the initial hidden variables of the RNN, and a decoder that follows the successive recurrent layers, extracting the hidden variables to predict the targeted quantities. We apply this method to the 1D Ising model with a time-dependent transverse magnetic field. We train the model on short-time data within a certain parameter regime, yet it can predict physical quantities (such as correlation functions, transverse magnetization, and total energy) over much longer times in other parameter regimes. By varying the external magnetic field, we can also apply our approach across different phases. Our model demonstrates the potential of applying machine learning to many-body dynamics.
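The encoder/recurrent-core/decoder structure described above can be sketched as follows. This is only an architectural outline under stated assumptions: the weights are random and untrained, the tanh cell is a generic recurrent unit rather than the thesis's actual network, the field protocol is invented, and all sizes (`L`, `H`, `T`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
L, H, T = 8, 16, 5   # spins, hidden size, time steps (illustrative only)

# Encoder: initial spin configuration -> initial hidden variables h0.
W_enc = rng.normal(0, 0.1, (H, L))
# Recurrent core: h_{t+1} = tanh(W_h h_t + w_x * B(t)), driven by the external field.
W_h = rng.normal(0, 0.1, (H, H))
w_x = rng.normal(0, 0.1, H)
# Decoder: hidden variables -> one predicted observable per time step.
w_dec = rng.normal(0, 0.1, H)

def predict(sigma0, field):
    """Roll the RNN along a field protocol, decoding an observable at each step."""
    h = np.tanh(W_enc @ sigma0)          # encode the initial configuration
    out = []
    for B_t in field:
        h = np.tanh(W_h @ h + w_x * B_t)  # evolve hidden variables under B(t)
        out.append(w_dec @ h)             # decode the targeted quantity
    return np.array(out)

sigma0 = rng.choice([-1.0, 1.0], L)            # initial spin configuration
field = 0.5 + 0.1 * np.sin(np.arange(T))       # hypothetical time-dependent field
m_pred = predict(sigma0, field)                # one value per time step
```

Because the recurrence is unrolled step by step, the same trained core can in principle be iterated past the training horizon, which is how the long-time extrapolation described above would be carried out.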
Abstract (Chinese)
Abstract
Acknowledgements ------------------------------------------ i
Content ------------------------------------------ ii
I Identification of Neuronal Polarity ------------------------------------------ 1
1 Introduction ------------------------------------------ 2
2 Neuronal Morphology ------------------------------------------ 4
2.1 Dataset ------------------------------------------ 4
2.2 Feature distribution ------------------------------------------ 6
3 Method ------------------------------------------ 9
3.1 Standard Representation ------------------------------------------ 12
3.2 Nodal Polarity ------------------------------------------ 15
3.3 Feature Extraction ------------------------------------------ 17
3.4 Machine Learning Models ------------------------------------------ 17
3.5 Implementation and Spatial Correlation of Nodal Polarity ------------------------------------------ 18
4 Result ------------------------------------------ 19
4.1 Identification Results of Model I: Using Both Soma Features and Local Features ------------------------------------------ 20
4.2 Identification Results of Model II: Using Soma Features Only ------------------------------------------ 22
4.3 Comparison of Models I, II, and III for Complex Neurons ------------------------------------------ 23
4.4 Application: Transfer Learning ------------------------------------------ 26
5 Discussion ------------------------------------------ 29
5.1 Comparison ------------------------------------------ 29
5.2 Neurons with low accuracy ------------------------------------------ 30
5.3 Other types ------------------------------------------ 31
6 Conclusion ------------------------------------------ 32
II Toward Quantum Many-Body Dynamics ------------------------------------------ 33
7 Introduction ------------------------------------------ 34
8 Quantum Ising Model ------------------------------------------ 36
8.1 Transverse Field Ising Chain ------------------------------------------ 36
8.2 Time Dependent External Field ------------------------------------------ 38
8.3 Physical Observables ------------------------------------------ 39
9 Modified Recurrent Neural Network ------------------------------------------ 40
9.1 Overview ------------------------------------------ 40
9.2 Initial Configuration Encoder ------------------------------------------ 41
9.3 Evolution of Hidden Variables ------------------------------------------ 42
9.4 Physical Observable Decoder ------------------------------------------ 43
10 Result ------------------------------------------ 44
10.1 Prediction Result of Task I: Training Data in two different perturbative regimes ------------------------------------------ 44
10.2 Prediction Result of Task II: Training Data in a perturbative regime ------------------------------------------ 47
11 Discussion and Conclusion ------------------------------------------ 50
References ------------------------------------------ 51