Activation Functions

Tanh Activation

Tanh (hyperbolic tangent) is a zero-centered activation function used in neural networks, mapping inputs to the range (-1, 1):

$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$

Historically, the tanh function became preferred over the sigmoid function because its zero-centered outputs gave better performance in multi-layer neural networks. However, it does not solve the vanishing gradient problem that sigmoids suffer from: for large-magnitude inputs the gradient still approaches zero. That problem was tackled more effectively by the introduction of ReLU activations.
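As a minimal sketch, the definition above and its derivative, $f'(x) = 1 - \tanh^2(x)$, can be computed directly; evaluating the gradient at a large input illustrates the saturation behind the vanishing gradient problem:

```python
import math

def tanh(x):
    # Tanh from its definition: (e^x - e^-x) / (e^x + e^-x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def tanh_grad(x):
    # Derivative of tanh: f'(x) = 1 - tanh(x)^2
    return 1.0 - tanh(x) ** 2

# Outputs are zero-centered in (-1, 1), unlike the sigmoid's (0, 1)
print(tanh(0.0))       # 0.0
print(tanh(2.0))       # close to 1 (saturating)

# The gradient peaks at x = 0 and decays toward zero for large |x|,
# which is the saturation that causes vanishing gradients
print(tanh_grad(0.0))  # 1.0 (maximum gradient)
print(tanh_grad(5.0))  # nearly zero
```

In practice one would call a library routine such as `math.tanh` or the tanh of a deep-learning framework, but the hand-rolled version shows why gradients shrink as activations saturate.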


Tasks


Task                        Papers  Share
Language Modelling          25      3.58%
Decoder                     22      3.15%
Time Series Forecasting     19      2.72%
Sentence                    17      2.43%
Management                  16      2.29%
Decision Making             15      2.15%
Sentiment Analysis          14      2.00%
Image-to-Image Translation  13      1.86%
Classification              13      1.86%
