How to cite this paper
Al-Shalabi, L. (2024). Hybrid feature selection based ScC and forward selection methods. International Journal of Data and Network Science, 8(2), 1117-1128.
References
Alhenawi, E., Al-Sayyed, R., Hudaib, A., & Mirjalili, S. (2023). Improved intelligent water drop-based hybrid feature selection method for microarray data processing. Computational Biology and Chemistry, 103, 107809. https://doi.org/10.1016/j.compbiolchem.2022.107809
Al-Shalabi, L., Shaaban, Z., & Kasasbeh, B. (2006). Data mining: A Preprocessing Engine. Journal of Computer Science, 2(9), 735-739.
Al-Shalabi, L. (2016). Data Mining Application: Predicting Students’ Performance of ITC program in the Arab Open University in Kuwait – The Blended Learning. International Journal of Computer Science and Information Security (IJCSIS), 14(12), 827-833.
Al-Shalabi, L. (2017). Perceptions of Crime Behavior and Relationships: Rough Set Based Approach. International Journal of Computer Science and Information Security (IJCSIS), 15(3), 413-420.
Al-Shalabi, L. (2019). Rough Set-Based Reduction of Incomplete Medical Datasets by Reducing the Number of Missing Values. The International Arab Journal of Information Technology (IAJIT), 16(2).
Al-Shalabi, L., & Tahhan, Y. (2020). A WGFS-Based Approach to Extract Factors Influencing the Marketing of Korean Language in GCC. Journal of Computing and Information Technology, 28(3), 165–181.
Al-Shalabi, L. (2022). New Feature Selection Algorithm Based on Feature Stability and Correlation. IEEE Access, 10, 4699-4713. doi: 10.1109/ACCESS.2022.3140209.
Al-Shalabi, L. (2022). Evaluation of COVID-19 Vaccine Refusal among AOU Students in Kuwait and their Families and their Expected Inclination Towards the Acceptance or Refusal of the Vaccine. International Journal of Statistics in Medical Research, 11, 147–161.
Aphinyanaphongs, Y., Fu, L.D., Li, Z., Peskin, E.R., Efstathiadis, E., Aliferis, C.F., & Statnikov, A. (2014). A comprehensive empirical comparison of modern supervised classification and feature selection methods for text categorization. Journal of the Association for Information Science and Technology, 65(10), 1964–1987.
Asghari, S., Nematzadeh, H., Akbari, & Motameni, H. (2023). Mutual information-based filter hybrid feature selection method for medical datasets using feature clustering. Multimedia Tools and Applications. https://doi.org/10.1007/s11042-023-15143-0
Bolón-Canedo, V., Sánchez-Maroño, N., & Alonso-Betanzos, A. (2013). A review of feature selection methods on synthetic data. Knowledge and Information Systems, 34(3), 483–519.
Bolón-Canedo, V., Sánchez-Maroño, N., Alonso-Betanzos, A., Benítez, J.M., & Herrera, F. (2014). A review of microarray datasets and applied feature selection methods. Information Sciences, 282, 111–135.
Bommert, A., Sun, X., Bischl, B., Rahnenführer, J., & Lang, M. (2020). Benchmark for filter methods for feature selection in high-dimensional classification data. Computational Statistics and Data Analysis, 143, 106839.
Breiman, L. (2001). Random Forests. Machine Learning, 45, 5–32.
Cai, J., Luo, J., Wang, S., & Yang, S. (2018). Feature selection in machine learning: A new perspective. Neurocomputing, 300, 70–79.
Chaudhary, A., & Kolhe, S. (2013). Performance Evaluation of feature selection methods for Mobile devices. International Journal of Engineering Research and Applications, 3(6), 587-594.
Cherrington, M., Thabtah, F., Lu, J., & Xu, Q. (2019). Feature Selection: Filter Methods Performance Challenges. 2019 International Conference on Computer and Information Sciences (ICCIS), pp. 1-4. doi: 10.1109/ICCISci.2019.8716478.
Cormen, T.H., Leiserson, C.E., Rivest, R.L., & Stein, C. (2009). Introduction to Algorithms (3rd ed.). MIT Press.
Darshan, S.S., & Jaidhar, C. (2018). Performance evaluation of filter-based feature selection techniques in classifying portable executable files. Procedia Computer Science, 125, 346–356.
Das, A.K., Sengupta, S., & Bhattacharyya, S. (2018). A group incremental feature selection for classification using a rough set theory-based genetic algorithm. Applied Soft Computing, 65, 400–411.
Kamalov, F., Sulieman, H., Moussa, S., Reyes, J.A., & Safaraliev, M. (2023). Nested ensemble selection: An effective hybrid feature selection method. Heliyon, 9(9), e19686. https://doi.org/10.1016/j.heliyon.2023.e19686
Fleuret, F. (2004). Fast binary feature selection with conditional mutual information. Journal of Machine Learning Research, 5, 1531–1555.
Gong, L., Xie, S., Zhang, Y., Wang, M., & Wang, X. (2022). Hybrid feature selection method based on feature subset and factor analysis. IEEE Access, 10, 120792-120803.
Guyon, I., & Elisseeff, A. (2003). An introduction to variable and feature selection. Journal of Machine Learning Research, 3, 1157-1182.
Haq, A.U., Zeb, A., Lei, Z., & Zhang, D. (2021). Forecasting daily stock trend using multi-filter feature selection and deep learning. Expert Systems with Applications, 168, 114444.
Hoque, N., Singh, M., & Bhattacharyya, D.K. (2018). EFS-MI: An ensemble feature selection method for classification. Complex & Intelligent Systems, 4(2), 105–118.
Hu, M., Tsang, E.C.C., Guo, Y., & Xu, W. (2021). Fast and Robust Attribute Reduction Based on the Separability in Fuzzy Decision Systems. IEEE Transactions on Cybernetics. doi: 10.1109/TCYB.2020.3040803.
Inza, I., Larrañaga, P., Blanco, R., & Cerrolaza, A.J. (2004). Filter versus wrapper gene selection approaches in DNA microarray domains. Artificial Intelligence in Medicine, 31(2), 91–103.
Kang, I-A., Njimbouom, SN., & Kim, J-D. (2023). Optimal Feature Selection-Based Dental Caries Prediction Model Using Machine Learning for Decision Support System. Bioengineering, 10(2), 245. https://doi.org/10.3390/bioengineering10020245
Kothari, C.R. (2007). Quantitative Techniques. New Delhi: UBS Publishers Ltd.
Kim, M., Bae, J., Wang, B., Ko, H., & Lim, JS. (2022). Feature Selection Method Using Multi-Agent Reinforcement Learning Based on Guide Agents. Sensors (Basel), 23(1), 98. doi: 10.3390/s23010098.
Kohavi, R., & John, G.H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1–2), 273–324. http://www.robotics.stanford.edu/~gjohn.
Lazar, C., Taminau, J., Meganck, S., Steenhoff, D., Coletta, A., Molter, C., de Schaetzen, V., Duque, R., Bersini, H., & Nowe, A. (2012). A survey on filter techniques for feature selection in gene expression microarray analysis. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 9(4), 1106–1119.
Liu, Y. (2004). A comparative study on feature selection methods for drug discovery. Journal of Chemical Information and Computer Sciences, 44(5), 1823–1828.
Luekiangkhamla, A., Panagant, N., Bureerat, S., & Pholdee, N. (2023). Application of various machine learning models for fault detection in the refrigeration system of a brewing company. Engineering and Applied Science Research, 50(2), 149–154. https://ph01.tci-thaijo.org/index.php/easr/article/view/251060
Ma, B., & Xia, Y. (2017). A tribe competition-based genetic algorithm for feature selection in pattern classification. Applied Soft Computing, 58, 328–338.
Mienye, I.D., & Sun, Y. (2023). A Machine Learning Method with Hybrid Feature Selection for Improved Credit Card Fraud Detection. Applied Sciences, 13(12):7254. https://doi.org/10.3390/app13127254
Mohtashami, M., & Eftekhari, M. (2019). A hybrid filter-based feature selection method via hesitant fuzzy and rough sets concepts. Iranian Journal of Fuzzy Systems, 16(2), 165–182.
Nayak, SK., Pradhan, BK., Banerjee, I., & Pal, K. (2020). Analysis of heart rate variability to understand the effect of cannabis consumption on Indian male paddy-field workers. Biomedical Signal Processing and Control, 62, 102072.
Niu, T., Wang, J., Lu, H., Yang, W., & Du, P. (2020). Developing a deep learning framework with two-stage feature selection for multivariate financial time series forecasting. Expert Systems with Applications, 148, 113237.
Pabuccu, H., & Barbu, A. (2023). Feature Selection for Forecasting. arXiv preprint arXiv:2303.02223.
Peng, C.Y.J., Lee, K.L., & Ingersoll, G.M. (2002). An introduction to logistic regression analysis and reporting. The Journal of Educational Research, 96(1), 3-14. http://dx.doi.org/10.1080/00220670209598786
Peng, H., Long, F., & Ding, C. (2005). Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8), 1226–1238.
Phyu, T.Z., & Oo, N.N. (2016). Performance Comparison of Feature Selection Methods. MATEC Web of Conferences (The 3rd International Conference on Control, Mechatronics and Automation, ICCMA 2015), 42, 06002. doi: 10.1051/matecconf/20164206002
Pourpanah, F., Shi, Y., Lim, C., Hao, Q., & Tan, C.J. (2019). Feature selection based on brainstorm optimization for data classification. Applied Soft Computing Journal, 80, 761–775.
Quinlan, J.R. (1989). Unknown Attribute Values in Induction. In Proceedings of the Sixth International Workshop on Machine Learning, 164-168.
Reunanen, J. (2006). Search Strategies. In Feature Extraction: Foundations and Applications (Studies in Fuzziness and Soft Computing, vol. 207, pp. 119–136). Springer.
Ringsquandl, M., Lamparter, S., Brandt, S., Hubauer, T., & Lepratti, R. (2015). Semantic-guided feature selection for industrial automation systems. The Semantic Web - ISWC 2015, pp. 225-240. http://dx.doi.org/10.1007/978-3-319-25010-6_13
Sinayobye, J.O., Kaawaase, K.S., Kiwanuka, F.N., & Musabe, R. (2019). Hybrid Model of Correlation Based Filter Feature Selection and Machine Learning Classifiers Applied on Smart Meter Data Set. IEEE/ACM Symposium on Software Engineering in Africa (SEiA), pp. 1-10. doi: 10.1109/SEiA.2019.00009.
Soheili, M., & Moghadam, A.E. (2020). DQPFS: Distributed quadratic programming based feature selection for big data. Journal of Parallel and Distributed Computing, 138, 1-14.
Suresh, S., Newton, DT., Everett, TH., Lin, G., & Duerstock, BS. (2022). Feature Selection Techniques for a Machine Learning Model to Detect Autonomic Dysreflexia. Frontiers in Neuroinformatics, 16, 901428. doi: 10.3389/fninf.2022.901428.
Tianyi, Z., Yingzhe, Z., & Zhe, W. (2023). Feature selection-based machine learning modeling for distributed model predictive control of nonlinear processes. Computers & Chemical Engineering, 169, 108074. https://doi.org/10.1016/j.compchemeng.2022.108074
Velayutham, C., & Thangavel, K. (2011). Unsupervised Quick Reduct Algorithm Using Rough Set Theory. Journal of Electronic Science and Technology, 9(3).
Velusamy, K., & Manavalan, R. (2012). Performance Analysis of Unsupervised Classification based on Optimization. International Journal of Computer Applications, 42(19), 22-27.
Wah, Y.B., Ibrahim, N., Hamid, H.A., Abdul-Rahman, S., & Fong, S. (2018). Feature selection methods: the case of filter and wrapper approaches for maximizing classification accuracy. Pertanika Journal of Science and Technology, 26(1), 329–340.
Xiao, W., Ji, P., & Hu, J. (2021). RnkHEU: A Hybrid Feature Selection Method for Predicting Students’ Performance. Scientific Programming, 2021, 1-16. https://doi.org/10.1155/2021/1670593
Xue, B., Zhang, M., & Browne, W.N. (2015). A comprehensive comparison of evolutionary feature selection approaches to classification. International Journal of Computational Intelligence and Applications, 14(2).
Yin, Y., Jang-Jaccard, J., Xu, W., Singh, A., Zhu, J., Sabrina, F., & Kwak, J. (2023). IGRF-RFE: a hybrid feature selection method for MLP-based network intrusion detection on UNSW-NB15 dataset. Journal of Big Data, 10. https://doi.org/10.1186/s40537-023-00694-8
Yongbin, Z., Wenshan, L., & Tao, L. (2023). Hybrid artificial immune optimization for high-dimensional feature selection. Knowledge-Based Systems, 260, 110111. https://doi.org/10.1016/j.knosys.2022.110111
Zhang, Y., Gong, D., Hu, Y., & Zhang, W. (2015). Feature selection algorithm based on bare bones particle swarm optimization. Neurocomputing, 148, 150–157.
Zhu, Z., Ong, Y.S., & Dash, M. (2007). Wrapper-filter feature selection algorithm using a memetic framework. IEEE Transactions on Systems, Man, and Cybernetics, 37(1), 70–76.