Activation Functions for Neural Networks: Application and Performance-based Comparison


Authors : Ajay Kumar; Dr. Nilesh Ware

Volume/Issue : Volume 9 - 2024, Issue 4 - April


Google Scholar : https://tinyurl.com/bdzjkm4t

Scribd : https://tinyurl.com/3jeub6rr

DOI : https://doi.org/10.38124/ijisrt/IJISRT24APR934



Abstract : The past decade has seen explosive growth in Deep Learning (DL) algorithms based on Artificial Neural Networks (ANNs) and their application in a wide range of emerging domains to solve complex real-world problems. A DL architecture uses Activation Functions (AFs) to model the relationship between the input features and the output. AFs are essential building blocks of any ANN, introducing the required non-linearity into the output of the network's layers. The layers of an ANN are combinations of linear transformations and nonlinear AFs. The most extensively used AFs include Sigmoid, Hyperbolic Tangent (Tanh), and Rectified Linear Unit (ReLU). Choosing an AF for a particular application depends on various factors, such as the nature of the application, the design of the ANN, the optimizer used in the network, and the complexity of the data. This paper presents a survey of the most widely used AFs along with the important considerations in selecting an AF for a specific problem domain. Broad guidelines for selecting an AF, based on the literature survey, are presented to help researchers employ a suitable AF in their problem domain.

Keywords : Artificial Neural Network, Activation Functions, RNN.
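
As a quick illustration of the three AFs named in the abstract, the following minimal Python sketch (assuming NumPy is available; it is not taken from the paper itself) shows how each function maps inputs and where its non-linearity comes from:

import numpy as np

def sigmoid(x):
    # Sigmoid squashes inputs into (0, 1); saturates (vanishing gradient) for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh squashes inputs into (-1, 1) and is zero-centered, unlike Sigmoid.
    return np.tanh(x)

def relu(x):
    # ReLU passes positive inputs unchanged and zeroes out negatives.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # approx. [0.119 0.378 0.5   0.622 0.881]
print(tanh(x))     # approx. [-0.964 -0.462 0.    0.462 0.964]
print(relu(x))     # [0.  0.  0.  0.5 2. ]

The saturation behaviour visible at the extremes of Sigmoid and Tanh, versus the unbounded positive range of ReLU, is one of the selection considerations the survey discusses.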

