
Detailed Record

Author (Chinese): 許啟宏
Author (English): Hsu, Chi-Hung
Title (Chinese): 應用強化學習方法之多目標類神經網路架構探索
Title (English): MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning
Advisor (Chinese): 張世杰
Advisor (English): Chang, Shih-Chieh
Committee Members (Chinese): 周志遠, 彭文志
Committee Members (English): Chou, Chih-Yuan; Peng, Wen-Chih
Degree: Master's
Institution: National Tsing Hua University
Department: Department of Computer Science
Student ID: 105062568
Year of Publication (ROC): 107 (2018)
Academic Year of Graduation: 106
Language: English
Pages: 24
Keywords (Chinese): 類神經網路架構探索、強化學習、能源效率、卷積神經網路
Keywords (English): Neural Architecture Search, Reinforcement Learning, Energy Efficiency, Convolutional Neural Network
Statistics:
  • Recommendations: 0
  • Views: 1936
  • Rating: *****
  • Downloads: 62
  • Bookmarks: 0
Abstract (Chinese): 近年來對類神經架構搜索(Neural Architecture Search, NAS)的研究表明,自動設計的類神經網路已經能達到與人工設計一樣好的水準。然而,大多數現有的類神經架構搜索方法都是針對尋找優化預測精度的架構,這些方法可能會產生能耗過高的複雜架構,不適用於能源消耗預算有限的計算環境。本篇論文我們提出 MONAS(Multi-Objective Neural Architecture Search),一種多目標類神經架構搜索,具有新穎的獎勵函數,在探索類神經架構時既考慮預測精度又考慮能源消耗。MONAS 能有效地探索設計空間並找出滿足給定要求的架構。實驗結果表明,MONAS 發現的架構達到與現有技術模型相當或更好的精確度,同時具有更好的能源效率。
Abstract (English): Recent studies on neural architecture search have shown that automatically designed neural networks can perform as well as human-designed architectures. However, most existing work on neural architecture search aims at finding architectures that optimize prediction accuracy. These methods may generate complex architectures with excessively high energy consumption, which makes them unsuitable for computing environments with limited power budgets. We propose MONAS, a Multi-Objective Neural Architecture Search with novel reward functions that consider both prediction accuracy and power consumption when exploring neural architectures. MONAS effectively explores the design space and searches for architectures that satisfy the given requirements. The experimental results demonstrate that the architectures found by MONAS achieve accuracy comparable to or better than state-of-the-art models, while having better energy efficiency.
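The abstract's central idea, a reward that trades prediction accuracy off against energy use, can be sketched as follows. The thesis's actual reward functions are not reproduced in this record, so the weighted-sum form and the names `alpha` and `max_energy` below are illustrative assumptions, not the thesis's definition.

```python
def monas_style_reward(accuracy, energy, max_energy, alpha=0.5):
    """Scalar reward in [0, 1] that rewards high accuracy and
    penalizes energy consumption relative to a budget max_energy.

    accuracy   -- validation accuracy in [0, 1]
    energy     -- measured energy of the candidate architecture
    max_energy -- energy budget used to normalize the penalty
    alpha      -- hypothetical trade-off weight between objectives
    """
    # Normalized energy term: 1.0 when no energy is used,
    # 0.0 at or above the budget.
    energy_term = 1.0 - min(energy / max_energy, 1.0)
    return alpha * accuracy + (1.0 - alpha) * energy_term
```

A reinforcement-learning controller would receive this scalar after training and measuring each sampled architecture; raising `alpha` biases the search toward accuracy, lowering it biases the search toward energy efficiency.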
1 Introduction 1
2 Background 3
3 Methodology 5
3.1 Framework Overview 5
3.2 Implementation Details 6
3.3 Reinforcement Learning Process 7
4 Experiment Setup 11
4.1 Experimental Setup 11
4.2 Power and Energy Measurement 12
4.3 Dataset 12
4.4 Image Preprocessing 12
4.5 Training Details on AlexNet 13
4.6 Training Details on CondenseNet 13
5 Result and Discussion 14
5.1 Adaptability 14
5.2 Efficiency 16
5.3 Pareto Frontier 19
5.4 Discover Better Models 19
6 Conclusion 22
References 23