COMPARISON OF SGD, RMSProp, AND ADAM OPTIMIZATION IN ANIMAL CLASSIFICATION USING CNNs

Authors

  • Desi Irfan Universitas Potensi Utama, Medan
  • Teddy Surya Gunawan Universitas Potensi Utama, Medan
  • Wanayumini Wanayumini Universitas Potensi Utama, Medan

DOI:

https://doi.org/10.35842/icostec.v2i1.32

Keywords:

Optimization Function, SGD, Adam, RMSProp

Abstract

Many measures have been taken to protect endangered species, including the widespread use of "camera trap" technology in technology-based nature-conservation field research. In this study, a machine-learning approach is presented for identifying images of endangered wildlife, using a dataset of 5,000 images collected from Kaggle and other sources. Gradient-descent optimization methods are commonly used to train Artificial Neural Networks (ANNs); they search for the weight values that yield the best output. Three optimization methods, namely Stochastic Gradient Descent (SGD), RMSProp, and Adam, were implemented in a Convolutional Neural Network (CNN) for animal image classification. The studies reviewed report mixed results for SGD and Adam: in some, SGD is superior, while in others Adam is superior given an appropriate learning rate. The results of this study show that the CNN with the Adam optimization function produces the highest accuracy compared with the SGD and RMSProp optimization methods. The model trained with Adam achieved an accuracy of 89.81% on the test set, demonstrating the feasibility of the approach.
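The three update rules compared in the study can be sketched on a toy problem. This is a minimal illustration, not the paper's code: it minimizes a one-dimensional quadratic rather than training a CNN, and the hyperparameter values (learning rate, decay factors) are assumptions, not the values used in the paper.

```python
# Sketch of the SGD, RMSProp, and Adam update rules on a toy objective
# f(w) = (w - 3)^2, whose minimum is at w = 3. Hyperparameters are
# illustrative defaults, not the settings used in the study.
import math

def grad(w):
    # derivative of f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def sgd(steps=200, lr=0.1):
    w = 0.0
    for _ in range(steps):
        w -= lr * grad(w)                      # plain gradient step
    return w

def rmsprop(steps=200, lr=0.1, rho=0.9, eps=1e-8):
    w, v = 0.0, 0.0
    for _ in range(steps):
        g = grad(w)
        v = rho * v + (1 - rho) * g * g        # running average of squared gradients
        w -= lr * g / (math.sqrt(v) + eps)     # step scaled by RMS of gradients
    return w

def adam(steps=200, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    w, m, v = 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g              # first moment (momentum)
        v = b2 * v + (1 - b2) * g * g          # second moment
        m_hat = m / (1 - b1 ** t)              # bias correction
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

for name, opt in [("SGD", sgd), ("RMSProp", rmsprop), ("Adam", adam)]:
    print(f"{name}: w = {opt():.4f}")  # each should approach the minimum w = 3
```

Adam combines RMSProp's per-parameter step scaling with momentum and bias correction, which is one common explanation for its strong performance across learning rates, consistent with the comparison reported in the abstract.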

Published

2023-02-28