[1] J. Jackson and G. Mudholkar, "Control procedures for residuals associated with principal component analysis," Technometrics, Vol. 21, pp. 341-349, 1979.
[2] B. M. Wise, N. L. Ricker, D. F. Veltkamp and B. R. Kowalski, "A theoretical basis for the use of principal component models for monitoring multivariate processes," Process Control and Quality, Vol. 1, pp. 41-51, 1990.
[3] T. Kourti, P. Nomikos and J. F. MacGregor, "Analysis, monitoring and fault diagnosis of batch processes using multiblock and multiway PLS," Journal of Process Control, Vol. 5, pp. 277-284, 1995.
[4] A. Raich and A. Cinar, "Statistical process monitoring and disturbance diagnosis in multivariable continuous processes," AIChE Journal, Vol. 42, pp. 995-1009, 1996.
[5] E. Martin, A. J. Morris and J. Zhang, "Process performance monitoring using multivariate statistical process control," IEE Proceedings - Control Theory and Applications, Vol. 143, pp. 132-144, 1996.
[6] M. A. Kramer, "Nonlinear principal component analysis using autoassociative neural networks," AIChE Journal, Vol. 37, pp. 233-243, 1991.
[7] D. Dong and T. McAvoy, "Nonlinear principal component analysis—based on principal curves and neural networks," Computers and Chemical Engineering, Vol. 20, pp. 65-78, 1996.
[8] B. Schölkopf, A. Smola and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Computation, Vol. 10, pp. 1299-1319, 1998.
[9] J. M. Lee, C. K. Yoo, S. W. Choi, P. A. Vanrolleghem and I. B. Lee, "Nonlinear process monitoring using kernel principal component analysis," Chemical Engineering Science, Vol. 59, pp. 223-234, 2004.
[10] J. D. Shao and G. Rong, "Nonlinear process monitoring based on maximum variance unfolding projections," Expert Systems with Applications, Vol. 36, pp. 11332-11340, 2009.
[11] S. T. Roweis and L. K. Saul, "Nonlinear dimensionality reduction by locally linear embedding," Science, Vol. 290, pp. 2323-2326, 2000.
[12] J. B. Tenenbaum, V. de Silva and J. C. Langford, "A global geometric framework for nonlinear dimensionality reduction," Science, Vol. 290, pp. 2319-2323, 2000.
[13] M. Belkin and P. Niyogi, "Laplacian eigenmaps for dimensionality reduction and data representation," Neural Computation, Vol. 15, pp. 1373-1396, 2003.
[14] K. Q. Weinberger and L. K. Saul, "An introduction to nonlinear dimensionality reduction by maximum variance unfolding," Proceedings of the 21st National Conference on Artificial Intelligence, Boston, Massachusetts, Vol. 2, 2006.
[15] K. Q. Weinberger and L. K. Saul, "Unsupervised learning of image manifolds by semidefinite programming," International Journal of Computer Vision, Vol. 70, pp. 77-90, 2006.
[16] J. Ham, D. D. Lee and S. Mika, "A kernel view of the dimensionality reduction of manifolds," Proceedings of the 21st International Conference on Machine Learning, Banff, Canada, 2004.
[17] V. de Silva and J. B. Tenenbaum, "Global versus local methods in nonlinear dimensionality reduction," Advances in Neural Information Processing Systems, Vol. 15, pp. 705-712, 2002.
[18] C. E. Rasmussen and C. Williams, Gaussian Processes for Machine Learning. Cambridge, MA, USA: The MIT Press, 2006.
[19] K. Q. Weinberger, F. Sha and L. K. Saul, "Learning a kernel matrix for nonlinear dimensionality reduction," Proceedings of the 21st International Conference on Machine Learning, Banff, Canada, 2004.
[20] R. Bro, K. Kjeldahl, A. K. Smilde and H. A. Kiers, "Cross-validation of component models: a critical look at current methods," Analytical and Bioanalytical Chemistry, Vol. 390, pp. 1241-1251, 2008.
[21] J. Zhang, E. B. Martin and A. J. Morris, "Process monitoring using non-linear statistical techniques," Chemical Engineering Journal, Vol. 67, pp. 181-189, 1997.
[22] T. Chen, J. Morris and E. Martin, "Gaussian process regression for multivariate spectroscopic calibration," Chemometrics and Intelligent Laboratory Systems, Vol. 87, pp. 59-71, 2007.
[23] P. Boyle and M. Frean, "Dependent Gaussian processes," Advances in Neural Information Processing Systems, Vol. 17, pp. 217-224, 2005.
[24] A. W. Bowman and A. Azzalini, Applied Smoothing Techniques for Data Analysis. New York: Oxford University Press, 1997.
[25] P. Miller, R. E. Swanson and C. E. Heckler, "Contribution plots: a missing link in multivariate quality control," Applied Mathematics and Computer Science, Vol. 8, pp. 775-792, 1998.
[26] A. K. Conlin, E. B. Martin and A. J. Morris, "Confidence limits for contribution plots," Journal of Chemometrics, Vol. 14, pp. 725-736, 2000.
[27] S. J. Qin, "Statistical process monitoring: basics and beyond," Journal of Chemometrics, Vol. 17, pp. 480-502, 2003.
[28] H. H. Yue and S. J. Qin, "Reconstruction-based fault identification using a combined index," Industrial & Engineering Chemistry Research, Vol. 40, pp. 4403-4414, 2001.
[29] A. R. T. Donders, G. J. van der Heijden, T. Stijnen and K. G. Moons, "Review: a gentle introduction to imputation of missing values," Journal of Clinical Epidemiology, Vol. 59, pp. 1087-1091, 2006.
[30] B. N. I. Eskelson, H. Temesgen, V. LeMay, T. M. Barrett, N. L. Crookston and A. T. Hudak, "The roles of nearest neighbor methods in imputing missing data in forest inventory and monitoring databases," Scandinavian Journal of Forest Research, Vol. 24, pp. 235-246, 2009.
[31] V. Kariwala, P. E. Odiowei, Y. Cao and T. Chen, "A branch and bound method for isolation of faulty variables through missing variable analysis," Journal of Process Control, Vol. 20, pp. 1198-1206, 2010.
[32] J. Downs and E. Vogel, "A plant-wide industrial process control problem," Computers & Chemical Engineering, Vol. 17, pp. 245-255, 1993.
[33] L. Ricker, "Decentralized control of the Tennessee Eastman challenge process," Journal of Process Control, Vol. 6, pp. 205-221, 1996.
[34] C. G. Atkeson, A. W. Moore and S. Schaal, "Locally weighted learning," Artificial Intelligence Review, Vol. 11, pp. 11-73, 1997.
[35] A. E. Hoerl and R. W. Kennard, "Ridge regression: biased estimation for nonorthogonal problems," Technometrics, Vol. 12, pp. 55-67, 1970.
[36] D. Coomans and D. L. Massart, "Alternative k-nearest neighbour rules in supervised pattern recognition: Part 1. k-Nearest neighbour classification by using alternative voting rules," Analytica Chimica Acta, Vol. 136, pp. 15-27, 1982.
[37] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," Proceedings of the 14th International Joint Conference on Artificial Intelligence, Vol. 2, pp. 1137-1143, 1995.
[38] S. An, W. Liu and S. Venkatesh, "Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression," Pattern Recognition, Vol. 40, pp. 2154-2162, 2007.
[39] Y. Lee, Y. Lin and G. Wahba, "Multicategory support vector machines: theory and application to the classification of microarray data and satellite radiance data," Journal of the American Statistical Association, Vol. 99, pp. 67-81, 2004.
[40] P. C. Chen, K. Y. Lee, T. J. Lee, Y. J. Lee and S. Y. Huang, "Multiclass support vector classification via coding and regression," Neurocomputing, Vol. 73, pp. 1501-1512, 2010.
[41] Y. Liu, X. Zhang, K. Zhu, H. Wang and P. Li, "Adaptive recursive kernel learning and its industrial application to online quality prediction in rubber mixing," Control Theory & Applications, Vol. 27, pp. 609-614, 2010 (in Chinese).
[42] L. H. Chiang, R. D. Braatz and E. Russell, Fault Detection and Diagnosis in Industrial Systems. London: Springer-Verlag, 2001.
[43] M. Jia, F. Chu, F. Wang and W. Wang, "On-line batch process monitoring using batch dynamic kernel principal component analysis," Chemometrics and Intelligent Laboratory Systems, Vol. 101, pp. 110-122, 2010.
[44] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, Vol. 20, pp. 273-297, 1995.
[45] I. Wasito and B. Mirkin, "Nearest neighbour approach in the least-squares data imputation algorithms," Information Sciences, Vol. 169, pp. 1-25, 2005.
[46] G. C. Cawley, "Leave-one-out cross-validation based model selection criteria for weighted LS-SVMs," Proceedings of the 2006 International Joint Conference on Neural Networks, Vancouver, Canada, 2006.
[47] J. Suykens and J. Vandewalle, "Least squares support vector machine classifiers," Neural Processing Letters, Vol. 9, pp. 293-300, 1999.
[48] J. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor and J. Vandewalle, Least Squares Support Vector Machines. Singapore: World Scientific, 2002.
[49] C. W. Hsu and C. J. Lin, "A comparison of methods for multi-class support vector machines," IEEE Transactions on Neural Networks, Vol. 13, pp. 415-425, 2002.
[50] S. Yin, S. X. Ding, A. Haghani, H. Hao and P. Zhang, "A comparison study of basic data-driven fault diagnosis and process monitoring methods on the benchmark Tennessee Eastman process," Journal of Process Control, Vol. 22, pp. 1567-1581, 2012.