
Detailed Record

Author (Chinese): 劉原瑞
Title (Chinese): 基於流形學習與局部信息於非線性製程監測與故障診斷
Title (English): Nonlinear Process Monitoring and Fault Isolation using Manifold Learning and Localized Information
Advisor (Chinese): 姚遠
Committee Members (Chinese): 汪上曉, 陳榮輝, 姚遠
Degree: Master's
Institution: National Tsing Hua University
Department: Department of Chemical Engineering
Student ID: 100032537
Year of Publication (ROC calendar): 102 (2013)
Academic Year of Graduation: 101
Language: Chinese
Number of Pages: 63
Keywords (Chinese): nonlinear process monitoring; extended maximum variance unfolding; just-in-time learning; multiclass support vector machines
Abstract:
This study applies the concepts of manifold learning and localized information to the monitoring, analysis, and fault diagnosis of nonlinear process variables. When no fault information is available, the conventional nonlinear process monitoring method is kernel principal component analysis (KPCA). The manifold learning method maximum variance unfolding (MVU) builds a global model while preserving local information, remedying KPCA's poor preservation of the latent manifold; its drawback, however, is that the model applies only to the training samples. This work therefore uses the nonlinear regression method Gaussian process regression (GPR) to approximate the mapping between the input and output spaces of MVU, yielding an extended maximum variance unfolding (EMVU) method. A further advantage of GPR is that its parameters are learned automatically, whereas the selection of kernel parameters for KPCA still lacks a clear solution. For online monitoring, EMVU uses not only the conventional T² and SPE indices, which measure distance from and fit to the model, but also introduces the prediction variance of the output as a new index that more clearly exposes how a test sample in the input space differs from the training samples.

When fault data are available, classification methods are commonly used to build models, and the support vector machine has been the most widely used classifier in recent years. Based on the idea that a local model tends to be more nearly linear, or at least less nonlinear, than a global one, this study adopts just-in-time learning (JITL): whenever a new sample is collected, a local multiclass support vector machine (MSVM) model is built on the spot from neighboring points and used for immediate monitoring, hence the name JITL-MSVM. Each local model selects its parameters automatically with fast leave-one-out cross-validation (FLOO) to satisfy the "just-in-time" requirement. Finally, the Tennessee Eastman process is used to simulate a continuous nonlinear process for analysis and comparison with conventional methods.
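The two ideas in the abstract can be illustrated with short sketches. The first is a minimal sketch of the EMVU monitoring step, not the author's implementation: it assumes the MVU embedding of the training data (`Y_train`) has already been computed offline by a semidefinite-programming solver, and it uses scikit-learn's `GaussianProcessRegressor` in place of the thesis's GPR. Names such as `fit_emvu_gpr` and `monitor_emvu` are illustrative only.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_emvu_gpr(X_train, Y_train):
    """Learn the MVU input->embedding map with one GPR per output dimension.

    GPR hyperparameters are tuned automatically by maximizing the marginal
    likelihood, which is the 'automatic parameter learning' advantage over
    KPCA mentioned in the abstract.
    """
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    return [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
            .fit(X_train, Y_train[:, d])
            for d in range(Y_train.shape[1])]

def monitor_emvu(models, Y_train, x_new):
    """Return the T^2 statistic of the predicted embedding and the summed
    GPR predictive variance (the abstract's additional monitoring index)."""
    means, variances = [], []
    for m in models:
        mu, std = m.predict(x_new.reshape(1, -1), return_std=True)
        means.append(mu[0])
        variances.append(std[0] ** 2)
    y_hat = np.array(means)
    # MVU constrains the embedding to be zero-mean, so T^2 needs no centering.
    S_inv = np.linalg.pinv(np.cov(Y_train, rowvar=False))
    return float(y_hat @ S_inv @ y_hat), float(sum(variances))
```

The second sketch covers the JITL-MSVM idea. Here an ordinary leave-one-out grid search stands in for the thesis's fast closed-form LOO (FLOO) for least-squares SVMs, and scikit-learn's `SVC` (one-vs-one multiclass by default) substitutes for the thesis's MSVM formulation; the effect, automatic per-model parameter selection, is the same, only slower.

```python
from sklearn.model_selection import GridSearchCV, LeaveOneOut
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def jitl_msvm_predict(X_train, labels, x_new, k=50):
    """Just-in-time local multiclass SVM: build a fresh classifier from the
    k nearest neighbors of the query sample and classify it immediately.

    Assumes the k neighbors span at least two fault classes so the local
    SVM can be fitted.
    """
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    idx = nn.kneighbors(x_new.reshape(1, -1), return_distance=False)[0]
    grid = {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]}
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=LeaveOneOut())
    search.fit(X_train[idx], labels[idx])  # local model, built per query
    return search.best_estimator_.predict(x_new.reshape(1, -1))[0]
```

Building a fresh model per query keeps each local problem small and closer to linear, which is exactly the rationale the abstract gives for the JITL approach.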
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Preface
1.2 Literature Review
1.3 Research Motivation
1.3.1 Extended Maximum Variance Unfolding
1.3.2 Just-in-Time Learning Multiclass Support Vector Machine Method
Chapter 2 Methods
2.1 Kernel Principal Component Analysis
2.2 Extended Maximum Variance Unfolding
2.2.1 Maximum Variance Unfolding
2.2.2 Gaussian Process Regression
2.2.3 EMVU Monitoring Method
2.2.4 Fault Identification by Reconstruction
2.2.5 EMVU Modeling and Monitoring Procedure
2.3 Nonlinear Process Monitoring Using Localized Information and Just-in-Time Learning
2.3.1 Least-Squares SVM and One-vs-One Multiclass SVM
2.3.2 Multiclass SVM Solved in a Single Optimization
2.3.3 Fast Leave-One-Out Cross-Validation
2.3.4 Modeling and Monitoring Procedure Using Localized Information and Just-in-Time Learning
2.4 Tennessee Eastman Process
Chapter 3 Results
3.1 EMVU Results
3.1.1 Nonlinear Mapping
3.1.2 EMVU Demonstration on a Numerical Example
3.1.3 Modeling Stage and Parameter Selection
3.1.5 Fault Identification
3.2 Results of Nonlinear Process Monitoring Using Localized Information and Just-in-Time Learning
3.2.1 Modeling Stage and Parameter Selection
3.2.2 JITL-MSVM Results
Chapter 4 Conclusions
Chapter 5 References
(Full text is restricted to internal access.)