Development of Stochastic Gradient Descent with the Addition of a Fixed Variable

Adimas Tristan Nagara Hartono
Hindriyanto Dwi Purnomo

Abstract

Stochastic Gradient Descent (SGD) is one of the most commonly used optimizers in deep learning. In this work, we modify SGD by adding a fixed variable to its update rule and compare the behavior of standard SGD with that of the modified version. The study was carried out in five phases: (1) analysis of the optimizer, (2) design of the modification, (3) implementation of the modification, (4) testing of the modification, and (5) reporting. The results are intended to show the impact of the added fixed variable on the performance of SGD.
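To make the comparison concrete, the following is a minimal Python sketch (not the authors' implementation) of a single parameter optimized with standard SGD and with a modified update that adds a fixed constant term. The exact role of the fixed variable is assumed here for illustration: a constant c added to each step in the direction of the gradient. The toy objective, the constant c, and the names grad_f, sgd_step, and sgd_fixed_step are hypothetical and do not come from the article.

    import numpy as np

    # Toy objective: f(w) = (w - 3)^2, with gradient f'(w) = 2 * (w - 3).
    def grad_f(w):
        return 2.0 * (w - 3.0)

    def sgd_step(w, g, lr=0.1):
        # Standard SGD update: w <- w - lr * g
        return w - lr * g

    def sgd_fixed_step(w, g, lr=0.1, c=0.01):
        # Assumed modified update: the fixed variable c is added as a
        # constant extra step in the direction of the gradient.
        # The article's actual formulation may differ.
        return w - lr * g - c * np.sign(g)

    w_std, w_mod = 0.0, 0.0
    for _ in range(50):
        w_std = sgd_step(w_std, grad_f(w_std))
        w_mod = sgd_fixed_step(w_mod, grad_f(w_mod))

    print(f"standard SGD:     w = {w_std:.4f}")
    print(f"SGD + fixed term: w = {w_mod:.4f}")

On this toy problem both variants converge toward w = 3; the difference appears near the minimum, where the modified update keeps taking steps of at least c and therefore oscillates within roughly c of the optimum instead of settling exactly on it.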

Article Details

How to Cite
Hartono, A. T. N., & Purnomo, H. D. (2023). Pengembangan Stochastic Gradient Descent dengan Penambahan Variabel Tetap. Jurnal JTIK (Jurnal Teknologi Informasi Dan Komunikasi), 7(3), 359–367. https://doi.org/10.35870/jtik.v7i3.840
Section
Computer & Communication Science
Author Biographies

Adimas Tristan Nagara Hartono, Universitas Kristen Satya Wacana

Informatics Engineering Study Program, Faculty of Information Technology, Universitas Kristen Satya Wacana, Salatiga, Central Java, Indonesia

Hindriyanto Dwi Purnomo, Universitas Kristen Satya Wacana

Informatics Engineering Study Program, Faculty of Information Technology, Universitas Kristen Satya Wacana, Salatiga, Central Java, Indonesia
