A trainable activation function, constructed as a sigmoid-based generalization of ReLU, Swish and SiLU.
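The unifying idea can be sketched as a parametric form `f(x) = beta * x * sigmoid(gamma * x)` with trainable `beta` and `gamma`. This exact parameterization is an assumption for illustration, not necessarily the paper's definition: with `beta = gamma = 1` it reduces to SiLU, with `beta = 1` and a learned `gamma` it matches Swish, and as `gamma` grows large it approaches ReLU.

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def hybrid_activation(x, beta=1.0, gamma=1.0):
    """Hypothetical sketch of a trainable sigmoid-based activation:
    f(x) = beta * x * sigmoid(gamma * x).

    In a network, beta and gamma would be learnable parameters
    (e.g. one pair per neuron or per channel) updated by backprop.
    """
    return beta * x * sigmoid(gamma * x)

# beta = gamma = 1 recovers SiLU: x * sigmoid(x)
silu_value = hybrid_activation(2.0)

# Large gamma makes sigmoid(gamma * x) a near-step function,
# so the output approximates ReLU: max(0, x)
relu_like_pos = hybrid_activation(2.0, gamma=100.0)   # close to 2.0
relu_like_neg = hybrid_activation(-2.0, gamma=100.0)  # close to 0.0
```

Because the sigmoid gate is smooth in both `x` and the parameters, gradients flow to `beta` and `gamma`, which is what makes the shape of the activation itself trainable.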
Source: Adaptive hybrid activation function for deep neural networks
| Task | Papers | Share |
|---|---|---|
| Activation Function Synthesis | 2 | 33.33% |
| Image Classification | 2 | 33.33% |
| Learning Theory | 2 | 33.33% |
| Component | Type |
|---|---|
| Sigmoid Activation | Activation Functions |
| SiLU | Activation Functions |
| Swish | Activation Functions |