Activation Functions

Sigmoid Activation

The sigmoid (logistic) activation is a classic activation function for neural networks:

$$f(x) = \frac{1}{1 + \exp(-x)}$$

Drawbacks of this activation noted in the literature include sharply damped (vanishing) gradients during backpropagation from deeper hidden layers back to the inputs, gradient saturation for large-magnitude inputs, and slow convergence.
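The saturation behavior follows directly from the derivative, $f'(x) = f(x)\,(1 - f(x))$, which peaks at 0.25 and decays toward zero for large $|x|$. A minimal sketch (function and helper names are illustrative, not from the original page):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid f(x) = 1 / (1 + exp(-x)), written in a
    numerically stable form for large negative inputs."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    # For very negative x, exp(-x) would overflow; use the
    # algebraically equivalent form exp(x) / (1 + exp(x)).
    z = math.exp(x)
    return z / (1.0 + z)

def sigmoid_grad(x: float) -> float:
    """Derivative f'(x) = f(x) * (1 - f(x)).
    Maximum is 0.25 at x = 0; near zero for large |x| (saturation)."""
    s = sigmoid(x)
    return s * (1.0 - s)
```

Because the gradient never exceeds 0.25, chaining many sigmoid layers multiplies these small factors together, which is one way to see the vanishing-gradient problem described above.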


Tasks


Task Papers Share
Language Modelling 21 2.92%
Decoder 20 2.78%
Classification 20 2.78%
Image-to-Image Translation 16 2.22%
Sentence 15 2.08%
Image Classification 14 1.94%
Time Series Forecasting 14 1.94%
Sentiment Analysis 13 1.81%
Management 13 1.81%
