United States | Computer Science and Information Technology | Volume 8 Issue 3, March 2019 | Pages: 1940-1949
Optimizing Neural Networks: A Comparative Analysis of Activation Functions in Deep Learning
Abstract: This paper presents a comprehensive analysis of activation functions in multilayer neural networks. The study examines the role of different activation functions and their impact when combined with various loss functions. It focuses on how an activation function determines a neuron's output from its weighted inputs and bias, thereby introducing nonlinearity into the network. The research also highlights how activation functions facilitate back-propagation in neural networks. This study aims to offer clear guidance on selecting an appropriate activation function for a given problem and set of parameters.
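As a minimal illustration of the mechanism the abstract describes, the sketch below (not taken from the paper; the example inputs, weights, and the particular activation functions shown are assumptions for illustration) computes a single neuron's weighted sum plus bias and passes it through three common activation functions to show how each shapes the output nonlinearly:

```python
# Hypothetical sketch: a single neuron's pre-activation z = w.x + b,
# passed through a few common activation functions to show how each
# introduces nonlinearity into the neuron's output.
import numpy as np

def sigmoid(z):
    # Squashes z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes z into (-1, 1)
    return np.tanh(z)

def relu(z):
    # Zeroes out negative pre-activations
    return np.maximum(0.0, z)

x = np.array([0.5, -1.2, 3.0])   # example inputs (assumed values)
w = np.array([0.4, 0.7, -0.2])   # example weights (assumed values)
b = 0.1                          # example bias (assumed value)

z = np.dot(w, x) + b             # weighted sum plus bias
for name, f in [("sigmoid", sigmoid), ("tanh", tanh), ("relu", relu)]:
    print(f"{name}: {f(z):.4f}")
```

Running the sketch prints one activated output per function for the same pre-activation, which is the comparison the paper carries out at the scale of full networks and loss functions.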
Keywords: Neural Networks, Activation Functions, Deep Learning, Back-Propagation, Loss Functions
How to Cite: Mainak Mitra, Soumit Roy, Vikram Maghnani, "Optimizing Neural Networks: A Comparative Analysis of Activation Functions in Deep Learning", Volume 8 Issue 3, March 2019, International Journal of Science and Research (IJSR), Pages: 1940-1949, https://www.ijsr.net/getabstract.php?paperid=SR231205140623, DOI: https://dx.doi.org/10.21275/SR231205140623