
Detailed Record

Author (Chinese): 李芳妤
Author (English): Lee, Fang-Yu
Title (Chinese): 以神經網路進行函數逼近
Title (English): Function approximation with neural network - A direct construction
Advisors (Chinese): 林得勝, 李俊璋
Advisors (English): Lin, Te-Sheng; Lee, Chiun-Chang
Committee members (Chinese): 陳人豪, 曾昱豪
Committee members (English): Chen, Jen-Hao; Zeng, Yu-Hao
Degree: Master's
University: National Tsing Hua University
Department: Institute of Computational and Modeling Science
Student ID: 108026511
Year of publication (ROC calendar): 110 (2021)
Graduation academic year: 109
Language: English
Number of pages: 147
Keywords (Chinese): 神經網路, 人工智慧, 機器學習, 函數逼近
Keywords (English): Neural network, Artificial intelligence, Machine learning, Function approximation
Abstract (Chinese): 在本論文中,我們證明了任何連續函數和決策函數都可以通過只有一個隱藏層的神經網絡來近似。我們直接構建神經網絡,並以連續的 sigmoidal 函數作為激活函數,我們表明如果適當地調整神經網絡的參數,可以準確地逼近任何 N 維連續函數。同樣,任何 N 維連續函數也可以通過使用 ReLU 函數作為激活函數的神經網絡來近似。
Abstract (English): In this thesis we demonstrate that any continuous function, as well as any decision function, can be approximated by a neural network with only a single hidden layer. We construct the neural network directly. Taking a continuous sigmoidal function as the activation function, we show that if the parameters of the network are chosen appropriately, any N-dimensional continuous function can be approximated to arbitrary accuracy. Similarly, any N-dimensional continuous function can also be approximated by a neural network that uses the ReLU function as its activation function.
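The direct construction for a one-dimensional function can be sketched in code. This is a minimal illustration, not the thesis's exact construction: the names `build_bump_network`, `n_bumps`, and `steepness` are our own, as are the parameter values. Each hidden-layer "bump" is the difference of two steep sigmoids covering one subinterval of [0, 1], and the output weights are set directly to function values rather than learned.

```python
import numpy as np

def sigmoid(z):
    # clip the argument to avoid overflow warnings for very steep inputs
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

def build_bump_network(f, n_bumps=200, steepness=2000.0):
    """Single-hidden-layer sigmoid network whose parameters are set
    directly (no training): 2*n_bumps hidden units, paired into one
    'bump' per subinterval of [0, 1]."""
    edges = np.linspace(0.0, 1.0, n_bumps + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    weights = f(mids)  # output weight of each bump = f at its centre

    def network(x):
        x = np.asarray(x, dtype=float)[..., None]
        # each bump approximates the indicator of [edges[i], edges[i+1]]
        bumps = (sigmoid(steepness * (x - edges[:-1]))
                 - sigmoid(steepness * (x - edges[1:])))
        return bumps @ weights

    return network

f = lambda x: np.sin(2.0 * np.pi * x)
net = build_bump_network(f)
xs = np.linspace(0.0, 1.0, 1001)
max_err = float(np.max(np.abs(net(xs) - f(xs))))
```

Increasing `n_bumps` (with `steepness` scaled accordingly) drives the maximum error toward zero, which is the content of the universality statement; with the values above, `max_err` is on the order of 10⁻².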
1 Introduction p.1
2 Universal approximation - Theorem and experiments p.3
2.1 Theoretical results for function approximation p.3
2.2 Numerical experiments of universal approximation p.4
2.3 Theoretical results for classification p.8
2.4 Classification experiments p.9
2.4.1 Binary classification p.10
2.4.2 Multiple classification p.13
3 Direct construction of universal approximation p.17
3.1 Universality of neural network constructed by continuous sigmoidal function p.17
3.1.1 Approximate one-dimensional function p.17
3.1.2 Approximate two-dimensional functions p.85
3.1.3 Approximate three-dimensional functions p.97
3.1.4 Approximate multi-dimensional functions p.100
3.2 Universality of neural network constructed by the continuous ReLU function p.101
3.2.1 Approximate one-dimensional function p.101
4 Conclusion p.117

Appendices p.119
A.1 Numerical experiments of universal approximation p.120
A.2 Classification experiments of universal approximation p.127
A.2.1 Binary classification p.127
A.2.2 Multiple classification p.132
A.3 Experiments on direct construction of universal approximation p.139
A.3.1 Using an untrained neural network constructed from many one-dimensional bump functions and two step functions to approximate a one-dimensional function p.139
A.3.2 Using a trained neural network constructed from many one-dimensional bump functions and two step functions to approximate a one-dimensional function p.141