
Detailed Record

Author (Chinese): 范丞佑
Author (English): FAN, Cheng-You
Title (Chinese): 基於Transformer模型用於預測糾纏二分量子系統之近似
Title (English): Transformer-Based Model for Predicting the Approximation to the Entangled Bipartite Quantum System
Advisor (Chinese): 陳人豪
Advisor (English): Chen, Jen-Hao
Committee Members (Chinese): 李金龍, 陳仁純
Committee Members (English): Li, Chin-Lung; Chen, Ren-Chuen
Degree: Master's
University: National Tsing Hua University (國立清華大學)
Department: Institute of Computational and Modeling Science
Student ID: 111026513
Publication Year (ROC): 113 (2024)
Graduation Academic Year: 112
Language: Chinese
Pages: 25
Keywords (Chinese): 注意力機制 (attention mechanism); 人工智慧 (artificial intelligence); 量子糾纏 (quantum entanglement); 量子系統 (quantum system)
Keywords (English): Transformer; Artificial Intelligence; Quantum Entanglement; Quantum System
Abstract:
This study explores the application of the Transformer model to predicting the approximation of entangled bipartite quantum systems. Quantum entanglement is a fundamental and significant phenomenon in quantum mechanics, whose properties enable quantum computers to achieve efficient computation on certain specific problems. This research aims to decompose quantum density matrices by training a deep learning model, thereby reducing computation time. The results indicate that the Transformer model performs satisfactorily on the training set, but its performance on the test set still needs improvement. The findings and methods of this research hold significant implications for future applications in quantum computing.
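The table of contents below indicates that the thesis builds on scaled dot-product attention, the core operation of the Transformer [11]. As an illustration only (not the thesis's actual code), that operation, softmax(QKᵀ/√d_k)V, can be sketched in NumPy; all variable names here are hypothetical:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the attention of [11]."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # scaled similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy example: 3 queries, 3 keys/values, feature dimension 4.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

In the full model this operation is repeated in parallel over several learned projections (multi-head attention, Section 3.2 of the thesis) inside the encoder and decoder stacks.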
Table of Contents
Chapter 1 Introduction 1
1.1 Background 1
1.2 Objectives 1
Chapter 2 Introduction to Artificial Intelligence and Quantum Entanglement 3
2.1 Artificial Intelligence 3
2.2 Neural Networks 4
2.3 Convolutional Neural Networks 6
2.4 Large Language Models (LLMs) 8
2.5 Transformer 10
2.6 Bipartite Quantum Systems 12
2.7 Quantum Entanglement 13
Chapter 3 Methodology 15
3.1 Mathematical Formulation of Bipartite Quantum Entanglement 15
(1) Problem Statement 15
3.2 Transformer Computation 15
(1) Scaled Dot-Product Attention 15
(2) Multi-head Attention 16
(3) Encoder and Decoder 17
Chapter 4 Results and Conclusions 19
4.1 Dataset 19
4.2 Data Processing 19
4.3 Model Configuration 19
4.4 Results 20
References 23

[1] N. Zou, "Quantum Entanglement and Its Application in Quantum Communication", (2021).
[2] R. L. Burden and J. D. Faires, Numerical Analysis, 8th ed., Belmont, CA: Thomson Brooks/Cole, (2005).
[3] W. Ertel, Introduction to Artificial Intelligence, Springer, (2018).
[4] K. Gurney, An Introduction to Neural Networks, CRC Press, (1997).
[5] S. Khan, M. Naseer, M. Hayat, S. W. Zamir, F. S. Khan, and M. Shah, "Review of Deep Learning: Concepts, CNN Architectures, Challenges, Applications, Future Directions", Journal of Big Data, (2021).
[6] Z. Li, F. Liu, W. Yang, S. Peng, and J. Zhou, "A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects", IEEE, (2021).
[7] J. Lederer, et al., "Activation Functions in Artificial Neural Networks: A Systematic Overview", Computer Science, (2021).
[8] C. Nwankpa, W. Ijomah, A. Gachagan, and S. Marshall, "Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark", Computer Science, (2018).
[9] J. Yang, H. Jin, R. Tang, X. Han, Q. Feng, H. Jiang, B. Yin, and X. Hu, "Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond", Computer Science, (2023).
[10] L. Feng, et al., "Attention as an RNN", Computer Science, (2024).
[11] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, "Attention Is All You Need", Computer Science, (2017).
[12] Z. Zhang and C. You, "Enhancing Quantum Entanglement in Bipartite Systems: Leveraging Optimal Control and Physics-Informed Neural Networks", Quantum Physics, (2024).
[13] M. T. Chu and M. M. Lin, "A complex-valued gradient flow for the entangled bipartite low rank approximation", Computer Physics Communications, (2022).
[14] A. Radford, J. Wu, R. Child, D. Luan, D. Amodei, and I. Sutskever, "Language Models are Unsupervised Multitask Learners", Computer Science, (2019).
[15] M. M. Lin and M. T. Chu, "Low Rank Approximation of Entangled Bipartite Systems", (2022).
[16] Z. Zhang, C. You, et al., "Entanglement-Based Quantum Information Technology", Quantum Physics, (2023).
[17] R. Horodecki, P. Horodecki, M. Horodecki, and K. Horodecki, "Quantum entanglement", Quantum Physics, (2007).
[18] OpenAI, et al., "GPT-4 Technical Report", Computer Science, (2023).
[19] J. Finnie-Ansley, P. Denny, B. A. Becker, A. Luxton-Reilly, and J. Prather, "The Robots Are Coming: Exploring the Implications of OpenAI Codex on Introductory Programming", Australasian Computing Education Conference, (2022).
[20] A. Chowdhery, et al., "PaLM: Scaling Language Modeling with Pathways", Journal of Machine Learning Research, (2023).