[1] H. A. Tissenbaum, "Using C. elegans for aging research," Invertebrate Reproduction & Development, vol. 59, sup. 1, pp. 59-63, 2015. DOI: 10.1080/07924259.2014.940470.
[2] W. B. Zhang et al., "Extended Twilight among Isogenic C. elegans Causes a Disproportionate Scaling between Lifespan and Health," Cell Systems, vol. 3, no. 4, pp. 333-345.e4, 2016.
[3] A.-L. Hsu et al., "Identification by machine vision of the rate of motor activity decline as a lifespan predictor in C. elegans," Neurobiology of Aging, vol. 30, no. 9, pp. 1498-1503, 2009.
[4] J. Lin, W. Kuo, Y. Huang, T. Jong, A. Hsu, and W. Hsu, "Using Convolutional Neural Networks to Measure the Physiological Age of Caenorhabditis elegans," IEEE/ACM Transactions on Computational Biology and Bioinformatics.
[5] R. R. Selvaraju, M. Cogswell, A. Das, R. Vedantam, D. Parikh, and D. Batra, "Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization," 2017 IEEE International Conference on Computer Vision (ICCV), Venice, 2017, pp. 618-626.
[6] K. Simonyan and A. Zisserman, "Very Deep Convolutional Networks for Large-Scale Image Recognition," 2015.
[7] C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, "Going Deeper with Convolutions," 2014.
[8] C. Szegedy, S. Ioffe, V. Vanhoucke, and A. Alemi, "Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning," in AAAI Conference on Artificial Intelligence, 2017, pp. 4278-4284.
[9] K. He, X. Zhang, S. Ren, and J. Sun, "Deep Residual Learning for Image Recognition," 2015.
[10] A. G. Howard, M. Zhu, B. Chen, D. Kalenichenko, W. Wang, T. Weyand, M. Andreetto, and H. Adam, "MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications," 2017.
[11] F. Chollet, "Xception: Deep Learning with Depthwise Separable Convolutions," 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, 2017, pp. 1800-1807.
[12] V. Nasteski, "An overview of the supervised machine learning methods," Horizons B, vol. 4, pp. 51-62, 2017. DOI: 10.20544/HORIZONS.B.04.1.17.P05.
[13] T. Haifley, "Linear logistic regression: an introduction," IEEE International Integrated Reliability Workshop Final Report, Lake Tahoe, CA, USA, 2002, pp. 184-187.
[14] C. Yonghui, "Study of the Case of Learning Bayesian Network from Incomplete Data," 2009 International Conference on Information Management, Innovation Management and Industrial Engineering, Xi'an, 2009, pp. 66-69.
[15] C.-W. Hsu, C.-C. Chang, and C.-J. Lin, "A Practical Guide to Support Vector Classification," 2003.
[16] F. Pernkopf, "Bayesian network classifiers versus selective k-NN classifiers," Pattern Recognition, vol. 38, no. 1, pp. 1-10, 2005.
[17] J. Quinlan, "Induction of decision trees," Machine Learning, 1986.
[18] L. Breiman, "Random forests," Machine Learning, 2001.
[19] H. U. Dike, Y. Zhou, K. K. Deveerasetty, and Q. Wu, "Unsupervised Learning Based On Artificial Neural Network: A Review," 2018 IEEE International Conference on Cyborg and Bionic Systems (CBS), Shenzhen, 2018, pp. 322-327.
[20] W. Qiang and Z. Zhongli, "Reinforcement learning model, algorithms and its application," 2011 International Conference on Mechatronic Science, Electric Engineering and Computer (MEC), Jilin, 2011, pp. 1143-1146.
[21] K. O'Shea and R. Nash, "An Introduction to Convolutional Neural Networks," arXiv e-prints, 2015.
[22] AlphaGo [Online]. Available: https://deepmind.com/research/alphago/
[23] KEVIN (2016). Introduction to Machine Learning [Online]. Available: http://hadoopspark.blogspot.com/2016/02/blog-post.html
[24] Machine Learning Classification [Online]. Available: https://data-flair.training/blogs/machine-learning-classification-algorithms/
[25] Sigmoid function [Online]. Available: https://zh.wikipedia.org/wiki/S%E5%87%BD%E6%95%B0
[26] Softmax Classifier and Cross-Entropy [Online]. Available: https://mc.ai/notes-on-deep-learning%E2%80%8A-%E2%80%8Asoftmax-classifier/
[27] Introduction to Support Vector Machines [Online]. Available: https://docs.opencv.org/2.4/doc/tutorials/ml/introduction_to_svm/introduction_to_svm.html
[28] An Interpretation of the k-Nearest Neighbors Algorithm [Online]. Available: https://kknews.cc/zh-tw/news/gvx3jae.html
[29] What is Random Forest? [Online]. Available: https://medium.com/@ryotennis0503/random-forest-27e650072269
[30] A. Singh, N. Thakur, and A. Sharma, "A review of supervised machine learning algorithms," 2016 3rd International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, 2016, pp. 1310-1315.
[31] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proceedings of the IEEE, vol. 86, no. 11, pp. 2278-2324, Nov. 1998.
[32] A. Sherstinsky, "Fundamentals of Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) Network," Physica D: Nonlinear Phenomena, vol. 404, 132306, 2020.
[33] A. Fischer and C. Igel, "An Introduction to Restricted Boltzmann Machines," 2012, pp. 14-36. DOI: 10.1007/978-3-642-33275-3_2.
[34] K. Shiruru, "An Introduction to Artificial Neural Network," International Journal of Advance Research and Innovative Ideas in Education, vol. 1, pp. 27-30, 2016.
[35] Yeh James (2017). Data Analysis & Machine Learning, Lecture 5-1: Introduction to Convolutional Neural Networks [Online]. Available: https://medium.com/@yehjames/
[36] GGWithRabitLIFE (2018). [Machine Learning ML NOTE] Convolutional Neural Network [Online]. Available: https://medium.com/機機與兔兔的工程世界/機器學習-ml-note-convolution-neural-network-卷積神經網路-bfa8566744ep
[37] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proceedings of the IEEE, vol. 86, no. 11, pp. 2278-2324, Nov. 1998.
[38] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," Communications of the ACM, vol. 60, no. 6, pp. 84-90, May 2017.
[39] C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, "Rethinking the Inception Architecture for Computer Vision," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, 2016, pp. 2818-2826.
[40] S. Ioffe and C. Szegedy, "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift," 2015.
[41] L. Kaiser, A. Gomez, and F. Chollet, "Depthwise Separable Convolutions for Neural Machine Translation," 2017.
[42] Yin Guobing (2018). Separable Convolution [Online]. Available: https://blog.csdn.net/tintinetmilou/article/details/81607721
[43] C. Shorten and T. Khoshgoftaar, "A survey on Image Data Augmentation for Deep Learning," Journal of Big Data, vol. 6, 2019. DOI: 10.1186/s40537-019-0197-0.
[44] L. Prechelt, "Early Stopping - But When?," 2000. DOI: 10.1007/3-540-49430-8_3.
[45] A. Krogh and J. A. Hertz, "A Simple Weight Decay Can Improve Generalization," in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Francisco, CA: Morgan Kaufmann, 1992, pp. 950-957.
[46] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting," Journal of Machine Learning Research, vol. 15, pp. 1929-1958, 2014.
[47] 莉森揪 (2018). Regularization: Reducing Overfitting and Improving Model Generalization [Online]. Available: https://ithelp.ithome.com.tw/articles/10203371
[48] R. Vogl, "Deep Learning Methods for Drum Transcription and Drum Pattern Generation," 2018. DOI: 10.13140/RG.2.2.34638.51529.
[49] Microstrong (2019). An Analysis of the Dropout Principle in Deep Learning [Online]. Available: https://www.itread01.com/content/1547209261.html
[50] B. Zhou, A. Khosla, A. Lapedriza, A. Oliva, and A. Torralba, "Learning Deep Features for Discriminative Localization," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, 2016, pp. 2921-2929.
[51] L. Xu, J. Ren, C. Liu, and J. Jia, "Deep convolutional neural network for image deconvolution," Advances in Neural Information Processing Systems, 2014, pp. 1790-1798.
[52] J. Springenberg, A. Dosovitskiy, T. Brox, and M. Riedmiller, "Striving for Simplicity: The All Convolutional Net," 2014.
[53] T. Uygur Kabael, "Cognitive development of applying the chain rule through three worlds of mathematics," Australian Senior Mathematics Journal, vol. 24, no. 2, pp. 14-28, 2010.
[54] Python [Online]. Available: https://www.python.org/
[55] Keras [Online]. Available: https://keras.io/
[56] TensorFlow [Online]. Available: https://www.tensorflow.org/
[57] Theano [Online]. Available: http://deeplearning.net/software/theano/
[58] Keras, Theano and TensorFlow on Windows and Linux [Online]. Available: https://gettocode.com/2016/12/02/keras-on-theano-and-tensorflow-on-windows-and-linux/
[59] J.-L. Lin, Y.-S. Chen, Y.-H. Huang, A.-L. Hsu, T.-L. Jong, and W.-H. Hsu, "Approach to the Caenorhabditis elegans segmentation from its microscopic image," IEEE International Conference on Systems, Man, and Cybernetics, Oct. 2018.
[60] Wikipedia contributors, "Caenorhabditis elegans," Wikipedia, The Free Encyclopedia, 27 Jun. 2020. Accessed 29 Jun. 2020.
[61] S. Uppaluri and C. P. Brangwynne, "A size threshold governs Caenorhabditis elegans developmental progression," Proceedings. Biological Sciences, vol. 282, no. 1813, 20151283, 2015.
[62] Y. Zhou, X. Wang, M. Song, et al., "A secreted microRNA disrupts autophagy in distinct tissues of Caenorhabditis elegans upon ageing," Nature Communications, vol. 10, 4827, 2019.
[63] C. L. Pan, C. Y. Peng, C. H. Chen, and S. McIntire, "Genetic analysis of age-dependent defects of the Caenorhabditis elegans touch receptor neurons," Proceedings of the National Academy of Sciences USA, vol. 108, no. 22, pp. 9274-9279, 2011.