COMPARING THE PERFORMANCE OF KNN, SVM, AND ANN FOR PREDICTING OBESITY LEVELS

Georgia Sugisandhea
Teny Handhayani

Abstract

This study compares the performance of the K-Nearest Neighbors (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM) classification methods to find the method best suited to classifying a person into an obesity-level group according to their eating habits and physical condition. The experiment uses the “Estimation of Obesity Levels based on Eating Habits and Physical Condition” dataset. The primary focus is on achieving a high accuracy score and handling complex decision boundaries, even at the cost of long training times, since misclassification in the medical field can have fatal consequences. The results show that the SVM classifier with a linear kernel provides the best overall performance for classifying obesity level, with an average accuracy of 0.944, precision of 0.944, recall of 0.942, and F1-score of 0.942. Notably, with the kernel regularization parameter C set to 200, the model reaches near-perfect evaluation scores of 0.99 in accuracy, precision, recall, and F1-score.
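The best-performing configuration reported above (an SVM with a linear kernel and C = 200, scored on accuracy, precision, recall, and F1) can be sketched as follows. This is a minimal illustration, not the authors' code: the real experiment uses the UCI "Estimation of Obesity Levels based on Eating Habits and Physical Condition" dataset, so a synthetic 7-class, 16-feature stand-in is generated here purely to keep the sketch self-contained and runnable.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic stand-in for the obesity dataset: 7 obesity-level classes,
# 16 features (matching the shape of the original data, not its content).
X, y = make_classification(n_samples=600, n_features=16, n_informative=10,
                           n_classes=7, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)

# Linear-kernel SVM with the large regularization constant reported (C=200);
# features are standardized first, as is usual for SVMs.
model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=200))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

# Macro-averaged scores, matching the multi-class evaluation in the abstract.
print(f"accuracy : {accuracy_score(y_te, pred):.3f}")
print(f"precision: {precision_score(y_te, pred, average='macro', zero_division=0):.3f}")
print(f"recall   : {recall_score(y_te, pred, average='macro'):.3f}")
print(f"f1-score : {f1_score(y_te, pred, average='macro'):.3f}")
```

On the synthetic data the scores are meaningless; the point is only the pipeline shape (scaling, linear-kernel SVC with a large C, four macro-averaged metrics).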


