Integrated ECOD-KNN Algorithm for Missing Values Imputation in Datasets: Outlier Removal


Authors : Tsitsi Jester Mugejo; Weston Govere

Volume/Issue : Volume 9 - 2024, Issue 7 - July


Google Scholar : https://tinyurl.com/yc6633az

Scribd : https://tinyurl.com/ya8hps4d

DOI : https://doi.org/10.38124/ijisrt/IJISRT24JUL1459



Abstract : Missing data render data sets incomplete and can degrade model performance, which in turn can lead to poor decisions, even when the best handling methods are used. When outliers are present in the data, using the KNN algorithm for missing-value imputation produces less accurate results. Outliers are anomalous observations, and removing them is one of the most important pre-processing steps in any data analysis model. KNN algorithms can be adapted to missing-value imputation, but they are sensitive to outliers, which can degrade the quality of the imputation results. KNN is widely used among machine learning algorithms because it is simple to implement and has relatively high accuracy. In the literature, various studies have explored the application of KNN in different domains, yet they fail to address its sensitivity to outliers. In the proposed model, outliers are identified using a combination of Empirical-Cumulative-distribution-based Outlier Detection (ECOD), the Local Outlier Factor (LOF) and the Isolation Forest (IForest). The outliers are substituted with the median of the non-outlier data, and missing values are then imputed with the k-nearest neighbors algorithm. The model was evaluated with several metrics, namely the Root Mean Square Error (RMSE), the Mean Squared Error (MSE), the coefficient of determination (R²) and the Mean Absolute Error (MAE). The results clearly indicate that dealing with outliers before imputing missing values produces better imputation results than the traditional KNN technique alone, which is sensitive to outliers.

Keywords : Imputation; Outlier; Missing Values; Incomplete; Algorithm.
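A minimal Python sketch of the pipeline described in the abstract is given below. It is illustrative rather than the authors' implementation: it assumes the PyOD library for the ECOD, LOF and IForest detectors, scikit-learn's KNNImputer for the imputation step, and a two-of-three voting rule for combining the detectors, which the abstract does not specify.

import numpy as np
import pandas as pd
from pyod.models.ecod import ECOD
from pyod.models.iforest import IForest
from pyod.models.lof import LOF
from sklearn.impute import KNNImputer


def clean_and_impute(df: pd.DataFrame, n_neighbors: int = 5) -> pd.DataFrame:
    """Replace detected outliers with the median of the non-outlier data,
    then impute missing values with k-nearest neighbors."""
    X = df.copy()

    # The detectors cannot handle NaNs, so fit them on the complete rows only.
    complete = X.dropna()
    detectors = [ECOD(), LOF(), IForest(random_state=0)]
    votes = np.zeros(len(complete), dtype=int)
    for det in detectors:
        det.fit(complete.values)
        votes += det.labels_  # PyOD convention: 1 = outlier, 0 = inlier

    # Assumed combination rule: a row is an outlier if at least 2 of 3 detectors flag it.
    outlier_idx = complete.index[votes >= 2]

    # Substitute outlier cells with the per-column median of the non-outlier data.
    medians = X.drop(index=outlier_idx).median()
    for col in X.columns:
        X.loc[outlier_idx, col] = medians[col]

    # Impute the remaining missing values with the KNN imputer.
    imputer = KNNImputer(n_neighbors=n_neighbors)
    return pd.DataFrame(imputer.fit_transform(X), columns=X.columns, index=X.index)

The evaluation described in the abstract compares imputed values against artificially masked ground truth; a small helper for the named metrics, using scikit-learn, might look as follows.

from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score


def imputation_scores(y_true: np.ndarray, y_imputed: np.ndarray) -> dict:
    """RMSE, MSE, R² and MAE between the true and the imputed values."""
    mse = mean_squared_error(y_true, y_imputed)
    return {
        "MSE": mse,
        "RMSE": float(np.sqrt(mse)),
        "R2": r2_score(y_true, y_imputed),
        "MAE": mean_absolute_error(y_true, y_imputed),
    }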


