Development of Stochastic Gradient Descent with an Added Fixed Variable
Abstract
Stochastic Gradient Descent (SGD) is one of the most commonly used optimizers in deep learning. In this work, we modify SGD by adding a fixed variable and then compare the behavior of standard SGD against the modified version. The study proceeded in five phases: (1) optimization analysis, (2) modification design, (3) modification implementation, (4) modification testing, and (5) reporting. The results aim to show the impact of the added fixed variable on the performance of SGD.
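The abstract does not specify where the fixed variable enters the update rule, so the following is only a minimal sketch of the idea. It assumes the constant is added to the gradient term of the standard update w ← w − lr · ∇L(w); the function names `sgd_step`, `sgd_fixed_step`, and the constant `c` are hypothetical and not taken from the paper.

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    """Standard SGD update: w <- w - lr * grad."""
    return w - lr * grad

def sgd_fixed_step(w, grad, lr=0.01, c=1e-4):
    """Hypothetical modified update with a fixed variable c.

    Here c is assumed to augment the gradient before the step is
    taken; other placements (e.g. added after the step) are equally
    plausible readings of the abstract.
    """
    return w - lr * (grad + c)

# Example: one update on a scalar weight.
w = np.float64(1.0)
g = np.float64(0.5)
w_std = sgd_step(w, g, lr=0.1)        # 1.0 - 0.1 * 0.5        = 0.95
w_mod = sgd_fixed_step(w, g, lr=0.1)  # 1.0 - 0.1 * (0.5 + c)
```

With a small positive `c`, the modified step is slightly larger than the standard one whenever the gradient is positive, which is one simple way an extra fixed term could change convergence behavior.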
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Authors submitting a manuscript do so on the understanding that, if it is accepted for publication, copyright of the article shall be assigned to JTIK journal and the Research Division, KITA Institute, as the publisher of the journal. Copyright encompasses the rights to reproduce and deliver the article in all forms and media, including reprints, photographs, microfilms, and any other similar reproductions, as well as translations.
JTIK journal and the Research Division, KITA Institute, and the Editors make every effort to ensure that no wrong or misleading data, opinions, or statements are published in the journal. The contents of the articles and advertisements published in JTIK journal are the sole and exclusive responsibility of their respective authors and advertisers.
The Copyright Transfer Form can be downloaded here: [Copyright Transfer Form JTIK]. The copyright form should be signed in the original and sent to the Editorial Office as original mail, a scanned document, or a fax:
Muhammad Wali (Editor-in-Chief)
Editorial Office of Jurnal JTIK (Jurnal Teknologi Informasi dan Komunikasi)
Research Division, KITA Institute
Teuku Nyak Arief Street No. 7b, Lamnyong, Lamgugop, Banda Aceh
Telp./Fax: 0651-8070141
Email: jtik@lembagakita.org - journal@lembagakita.org