Variable selection technique in fault classification problems applied in industrial process using the MOEADD genetic algorithm
DOI: https://doi.org/10.35699/2447-6218.2019.15357

Keywords: Industry, KNN, Genetic Operators, Computational intelligence

Abstract
In this work we propose a variable selection method called MOEADD-KNN-M, built on the MOEADD genetic algorithm (Evolutionary Many-Objective Optimization Algorithm Based on Dominance and Decomposition), the KNN (k-nearest neighbors) classification algorithm, and adapted genetic operators. The proposed algorithm takes a bi-objective approach: one objective minimizes the number of variables in a solution, and the other minimizes the fault classification error rate. Experiments with the proposed method were performed on fault classification data from the Tennessee Eastman process, a benchmark based on a real petrochemical industrial process, and the results were compared with those of other algorithms. The proposed method yields solutions with both low classification error and a small number of sensors, the two quantities to be minimized. This approach is therefore promising for variable selection in fault classification problems in industrial processes.
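As a hedged sketch (not the authors' MOEADD-KNN-M implementation), the bi-objective wrapper evaluation described above could look as follows: a candidate solution is a binary mask over the process variables, and its fitness is the pair (number of selected variables, KNN error rate), compared under Pareto dominance. The synthetic data, function names, and parameter values are illustrative assumptions.

```python
import math
import random
from collections import Counter

def knn_error(train_X, train_y, test_X, test_y, mask, k=3):
    """Error rate of a k-NN classifier restricted to the masked variables."""
    idx = [i for i, keep in enumerate(mask) if keep]
    if not idx:  # no variables selected: worst possible error
        return 1.0
    def dist(a, b):
        return math.sqrt(sum((a[i] - b[i]) ** 2 for i in idx))
    errors = 0
    for x, y in zip(test_X, test_y):
        nearest = sorted(range(len(train_X)), key=lambda j: dist(x, train_X[j]))[:k]
        predicted = Counter(train_y[j] for j in nearest).most_common(1)[0][0]
        errors += predicted != y
    return errors / len(test_X)

def objectives(mask, train_X, train_y, test_X, test_y):
    """Bi-objective fitness: both entries of the returned pair are minimized."""
    return (sum(mask), knn_error(train_X, train_y, test_X, test_y, mask))

def dominates(f, g):
    """Pareto dominance: f is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

# Illustrative data: variable 0 separates the two classes, variable 1 is noise.
random.seed(0)
train_y = [i % 2 for i in range(40)]
train_X = [[random.gauss(3 * y, 0.5), random.gauss(0, 1)] for y in train_y]
test_y = [i % 2 for i in range(20)]
test_X = [[random.gauss(3 * y, 0.5), random.gauss(0, 1)] for y in test_y]

f_informative = objectives([1, 0], train_X, train_y, test_X, test_y)
f_noise = objectives([0, 1], train_X, train_y, test_X, test_y)
```

With equal subset sizes, the mask keeping only the informative variable yields a lower error, so its objective vector Pareto-dominates the noise-only mask; in a dominance-and-decomposition algorithm such as MOEADD, vectors like these would drive the selection step.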
References
Ahmad, I. (2015). Feature selection using particle swarm optimization in intrusion detection. International Journal of Distributed Sensor Networks, 11(10):806954.
Al-Ani, A., Alsukker, A., and Khushaba, R. N. (2013). Feature subset selection using differential evolution and a wheel based search strategy. Swarm and Evolutionary Computation, 9:15–26.
Allegrini, F. and Olivieri, A. C. (2011). A new and efficient variable selection algorithm based on ant colony optimization: applications to near infrared spectroscopy/partial least-squares analysis. Analytica Chimica Acta, 699(1):18–25.
Brusco, M. J. (2014). A comparison of simulated annealing algorithms for variable selection in principal component analysis and discriminant analysis. Computational Statistics & Data Analysis, 77:38–53.
Chandrashekar, G. and Sahin, F. (2014). A survey on feature selection methods. Computers & Electrical Engineering, 40(1):16–28.
Foster, D., Karloff, H., and Thaler, J. (2015). Variable selection is hard. In Conference on Learning Theory, pages 696–709.
Guyon, I. and Elisseeff, A. (2003). An introduction to variable and feature selection. Journal of Machine Learning Research, 3(Mar):1157–1182.
Kohavi, R. and John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1-2):273–324.
Li, K., Deb, K., Zhang, Q., and Kwong, S. (2014). An evolutionary many-objective optimization algorithm based on dominance and decomposition. IEEE Transactions on Evolutionary Computation, 19(5):694–716.
Marill, T. and Green, D. (1963). On the effectiveness of receptors in recognition systems. IEEE Transactions on Information Theory, 9(1):11–17.
Mukhopadhyay, A., Maulik, U., Bandyopadhyay, S., and Coello, C. A. C. (2013). A survey of multiobjective evolutionary algorithms for data mining: Part I. IEEE Transactions on Evolutionary Computation, 18(1):4–19.
Silva, F. M. S., Ferreira, J. F., Palhares, R. M., and D'Angelo, M. F. S. V. (2017). Uma abordagem evolutiva multiobjetivo baseada em ponto de atração para seleção de variáveis em problemas de classificação de falhas [A multiobjective evolutionary approach based on an attraction point for variable selection in fault classification problems]. XLIX Simpósio Brasileiro de Pesquisa Operacional.
Venkatasubramanian, V., Rengaswamy, R., and Kavuri, S. N. (2003a). A review of process fault detection and diagnosis: Part II: Qualitative models and search strategies. Computers & Chemical Engineering, 27(3):313–326.
Venkatasubramanian, V., Rengaswamy, R., Kavuri, S. N., and Yin, K. (2003b). A review of process fault detection and diagnosis: Part III: Process history based methods. Computers & Chemical Engineering, 27(3):327–346.
Venkatasubramanian, V., Rengaswamy, R., Yin, K., and Kavuri, S. N. (2003c). A review of process fault detection and diagnosis: Part I: Quantitative model-based methods. Computers & Chemical Engineering, 27(3):293–311.
Whitney, A. W. (1971). A direct method of nonparametric measurement selection. IEEE Transactions on Computers, C-20(9):1100–1103.
Yu, L. and Liu, H. (2004). Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, 5(Oct):1205–1224.