[1] J. Mohd Ali, M.A. Hussain, M.O. Tade, J. Zhang, Artificial intelligence techniques applied as estimator in chemical process systems – A literature survey, Expert Syst. Appl. 42 (2015) 5915–5931.
[2] C. Shang, F. Yang, D. Huang, W. Lyu, Data-driven soft sensor development based on deep learning technique, J. Process Control 24 (2014) 223–233.
[3] P. Kadlec, B. Gabrys, S. Strandt, Data-driven soft sensors in the process industry, Comput. Chem. Eng. 33 (2009) 795–814.
[4] B.A. Jensen, J. Abonyi, Neural networks for process modeling, in: B. Liptak (Ed.), Instrument Engineers' Handbook, Vol. 2: Process Control and Optimization, 4th ed., Taylor & Francis, 2006, pp. 253–264.
[5] L.N. Smith, A disciplined approach to neural network hyper-parameters: Part 1 – learning rate, batch size, momentum, and weight decay, US Naval Research Laboratory Technical Report (2018). http://arxiv.org/abs/1803.09820.
[6] S. Chakraborty, R. Tomsett, R. Raghavendra, D. Harborne, M. Alzantot, F. Cerutti, M. Srivastava, A. Preece, S. Julier, M.R. Raghuveer, T.D. Kelley, D. Braines, M. Sensoy, C.J. Willis, P. Gurram, Interpretability of deep learning models: A survey of results, in: 2017 IEEE SmartWorld, Ubiquitous Intell. Comput. Adv. Trust. Comput. Scalable Comput. Commun. Cloud Big Data Comput. Internet People Smart City Innov., 2017.
[7] D. Gunning, Explainable Artificial Intelligence (XAI), Def. Adv. Res. Proj. Agency (2017).
[8] A. Adadi, M. Berrada, Peeking inside the black-box: A survey on explainable artificial intelligence (XAI), IEEE Access 6 (2018) 52138–52160.
[9] W.J. Murdoch, C. Singh, K. Kumbier, R. Abbasi-Asl, B. Yu, Interpretable machine learning: definitions, methods, and applications, arXiv preprint (2019).
[10] L.H. Gilpin, D. Bau, B.Z. Yuan, A. Bajwa, M. Specter, L. Kagal, Explaining explanations: An overview of interpretability of machine learning, in: 2018 IEEE 5th Int. Conf. Data Sci. Adv. Anal. (DSAA), 2018, pp. 80–89.
[11] M.T. Ribeiro, S. Singh, C. Guestrin, "Why should I trust you?": Explaining the predictions of any classifier, in: Proc. 22nd ACM SIGKDD Int. Conf. Knowl. Discov. Data Min. (KDD '16), ACM Press, New York, NY, USA, 2016, pp. 1135–1144.
[12] S. Ingrassia, I. Morlini, Neural network modeling for small datasets, Technometrics 47 (2005) 297–311.
[13] L. Fortuna, S. Graziani, M.G. Xibilia, Comparison of soft-sensor design methods for industrial plants using small data sets, IEEE Trans. Instrum. Meas. 58 (2009) 2444–2451.
[14] X. Yuan, B. Huang, Y. Wang, C. Yang, W. Gui, Deep learning-based feature representation and its application for soft sensor modeling with variable-wise weighted SAE, IEEE Trans. Ind. Informatics 14 (2018) 3235–3243.
[15] S. Park, C. Han, A nonlinear soft sensor based on multivariate smoothing procedure for quality estimation in distillation columns, Comput. Chem. Eng. 24 (2000) 871–877.
[16] L. Wang, C. Shao, H. Wang, H. Wu, Radial basis function neural networks-based modeling of the membrane separation process: Hydrogen recovery from refinery gases, J. Nat. Gas Chem. 15 (2006) 230–234.
[17] R.D. De Veaux, J. Schumi, J. Schweinsberg, L.H. Ungar, Prediction intervals for neural networks via nonlinear regression, Technometrics 40 (1998) 273–282.
[18] L. Holmström, P. Koistinen, Using additive noise in back-propagation training, IEEE Trans. Neural Networks 3 (1992) 24–38.
[19] Y. Grandvalet, S. Canu, S. Boucheron, Noise injection: Theoretical prospects, Neural Comput. 9 (1997) 1093–1108.
[20] R.M. Zur, Y. Jiang, L.L. Pesce, K. Drukker, Noise injection for training artificial neural networks: A comparison with weight decay and early stopping, Med. Phys. 36 (2009) 4810–4818.
[21] J. Zhang, E.B. Martin, A.J. Morris, C. Kiparissides, Inferential estimation of polymer quality using stacked neural networks, Comput. Chem. Eng. 21 (1997) S1025–S1030.
[22] G.-J. Qi, J. Luo, Small data challenges in big data era: A survey of recent progress on unsupervised and semi-supervised methods, arXiv preprint (2019).
[23] S.J. Pan, Q. Yang, A survey on transfer learning, IEEE Trans. Knowl. Data Eng. 22 (2010) 1345–1359.
[24] Y. Guo, H. Shi, A. Kumar, K. Grauman, T. Rosing, R. Feris, SpotTune: Transfer learning through adaptive fine-tuning, Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (2018) 4805–4814.
[25] J. Yosinski, J. Clune, Y. Bengio, H. Lipson, How transferable are features in deep neural networks?, Adv. Neural Inf. Process. Syst. (2014).
[26] Y. Li, N. Wang, J. Shi, J. Liu, X. Hou, Revisiting batch normalization for practical domain adaptation, arXiv preprint (2016).
[27] T. Streich, H. Kömpel, J. Geng, M. Renger, Process engineering and optimization – Secure the best benefits from C4 hydrocarbon processing, Part 1: Separation sequences, n.d.
[28] S.P. Bressa, J.A. Alves, N.J. Mariani, O.M. Martínez, G.F. Barreto, Analysis of operating variables on the performance of a reactor for total hydrogenation of olefins in a C3–C4 stream, Chem. Eng. J. (2003).
[29] R. Gani, C.A. Ruiz, I.T. Cameron, A generalized model for distillation columns – I. Model description and applications, Comput. Chem. Eng. 10 (1986) 181–198.
[30] R.K. Wood, M.W. Berry, Terminal composition control of a binary distillation column, Chem. Eng. Sci. 28 (1973) 1707–1724.
[31] Aspen Dynamics™ 12.1 Reference Guide, Aspen Technology, 2003. http://www.aspentech.com/.
[32] W.L. Luyben, Distillation Design and Control Using Aspen Simulation, John Wiley & Sons, 2013.
[33] H. Kaneko, K. Funatsu, Moving window and just-in-time soft sensor model based on time differences considering a small number of measurements, Ind. Eng. Chem. Res. 54 (2015) 700–704.
[34] J. Brownlee, Deep Learning with Python: Develop Deep Learning Models on Theano and TensorFlow Using Keras, v1.2, Machine Learning Mastery, 2016.
[35] Z. Lu, H. Pu, F. Wang, Z. Hu, L. Wang, The expressive power of neural networks: A view from the width, Adv. Neural Inf. Process. Syst. (2017) 6231–6239.
[36] P.L. Bartlett, The sample complexity of pattern classification with neural networks: The size of the weights is more important than the size of the network, IEEE Trans. Inf. Theory 44 (1998) 525.
[37] D.E. Seborg, T.F. Edgar, D.A. Mellichamp, F.J. Doyle, Process Dynamics and Control, 4th ed., John Wiley & Sons, 2016.