[1] Mordecai Avriel. Nonlinear Programming: Analysis and Methods. Courier Corporation, 2003.
[2] Carl M. Bender and Stefan Boettcher. Real spectra in non-Hermitian Hamiltonians having PT symmetry. Physical Review Letters, 80(24):5243, 1998.
[3] Feliks Aleksandrovich Berezin and Mikhail Shubin. The Schrödinger Equation, volume 66. Springer Science & Business Media, 2012.
[4] Charles G. Broyden. A class of methods for solving nonlinear simultaneous equations. Mathematics of Computation, 19(92):577–593, 1965.
[5] Richard L. Burden, J. Douglas Faires, and Annette M. Burden. Numerical Analysis. Cengage Learning, 2015.
[6] Chih-Chung Chang and Chih-Jen Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST), 2(3):1–27, 2011.
[7] Chenyi Chen, Ari Seff, Alain Kornhauser, and Jianxiong Xiao. DeepDriving: Learning affordance for direct perception in autonomous driving. In Proceedings of the IEEE International Conference on Computer Vision, pages 2722–2730, 2015.
[8] Andrew R. Conn, Nicholas I. M. Gould, and Ph. L. Toint. Convergence of quasi-Newton matrices generated by the symmetric rank one update. Mathematical Programming, 50(1):177–195, 1991.
[9] William C. Davidon. Variable metric method for minimization. SIAM Journal on Optimization, 1(1):1–17, 1991.
[10] John E. Dennis, Jr. and Jorge J. Moré. Quasi-Newton methods, motivation and theory. SIAM Review, 19(1):46–89, 1977.
[11] Roger Fletcher. Practical Methods of Optimization. John Wiley & Sons, 2013.
[12] Jianlong Fu, Heliang Zheng, and Tao Mei. Look closer to see better: Recurrent attention convolutional neural network for fine-grained image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 4438–4446, 2017.
[13] Xavier Glorot and Yoshua Bengio. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pages 249–256. JMLR Workshop and Conference Proceedings, 2010.
[14] Ian Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. Generative adversarial nets. Advances in Neural Information Processing Systems, 27, 2014.
[15] Karol Gregor, Ivo Danihelka, Alex Graves, Danilo Rezende, and Daan Wierstra. DRAW: A recurrent neural network for image generation. In International Conference on Machine Learning, pages 1462–1471. PMLR, 2015.
[16] Jiequn Han, Arnulf Jentzen, and Weinan E. Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences, 115(34):8505–8510, 2018.
[17] Diederik P. Kingma and Jimmy Ba. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.
[18] Isaac E. Lagaris, Aristidis Likas, and Dimitrios I. Fotiadis. Artificial neural networks for solving ordinary and partial differential equations. IEEE Transactions on Neural Networks, 9(5):987–1000, 1998.
[19] Quoc V. Le, Navdeep Jaitly, and Geoffrey E. Hinton. A simple way to initialize recurrent networks of rectified linear units. arXiv preprint arXiv:1504.00941, 2015.
[20] G. Lévai, A. Baran, P. Salamon, and T. Vertse. Analytical solutions for the radial Scarf II potential. Physics Letters A, 381(23):1936–1942, 2017.
[21] Jiaheng Li and Biao Li. Solving forward and inverse problems of the nonlinear Schrödinger equation with the generalized PT-symmetric Scarf-II potential via PINN deep learning. Communications in Theoretical Physics, 73(12):125001, 2021.
[22] Tomas Mikolov and Geoffrey Zweig. Context dependent recurrent neural network language model. In 2012 IEEE Spoken Language Technology Workshop (SLT), pages 234–239. IEEE, 2012.
[23] Will Penny and David Frost. Neural networks in clinical medicine. Medical Decision Making, 16(4):386–398, 1996.
[24] Maziar Raissi, Paris Perdikaris, and George E. Karniadakis. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686–707, 2019.
[25] A.-P. N. Refenes, A. Neil Burgess, and Yves Bentz. Neural networks in financial engineering: A study in methodology. IEEE Transactions on Neural Networks, 8(6):1222–1267, 1997.
[26] Frank Rosenblatt. Principles of neurodynamics: Perceptrons and the theory of brain mechanisms. Technical report, Cornell Aeronautical Laboratory, Buffalo, NY, 1961.
[27] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 323(6088):533–536, 1986.
[28] Esteban Samaniego, Cosmin Anitescu, Somdatta Goswami, Vien Minh Nguyen-Thanh, Hongwei Guo, Khader Hamdia, X. Zhuang, and T. Rabczuk. An energy approach to the solution of partial differential equations in computational mechanics via machine learning: Concepts, implementation and applications. Computer Methods in Applied Mechanics and Engineering, 362:112790, 2020.
[29] Ingo Steinwart and Andreas Christmann. Support Vector Machines. Springer Science & Business Media, 2008.
[30] Tjalling J. Ypma. Historical development of the Newton–Raphson method. SIAM Review, 37(4):531–551, 1995.
[31] Heiga Ze, Andrew Senior, and Mike Schuster. Statistical parametric speech synthesis using deep neural networks. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, pages 7962–7966. IEEE, 2013.
[32] Dmitry A. Zezyulin and Vladimir V. Konotop. Nonlinear modes in the harmonic PT-symmetric potential. Physical Review A, 85(4):043840, 2012.
[33] Zijian Zhou and Zhenya Yan. Solving forward and inverse problems of the logarithmic nonlinear Schrödinger equation with PT-symmetric harmonic potential via deep learning. Physics Letters A, 387:127010, 2021.