Xavier Initialization, also known as Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0 and the weights $W_{ij}$ at each layer are initialized as:
$$ W_{ij} \sim U\left[-\frac{\sqrt{6}}{\sqrt{fan_{in} + fan_{out}}}, \frac{\sqrt{6}}{\sqrt{fan_{in} + fan_{out}}}\right] $$
where $U$ is a uniform distribution, $fan_{in}$ is the size of the previous layer (the number of columns in $W$), and $fan_{out}$ is the size of the current layer (the number of rows in $W$).
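A minimal NumPy sketch of this scheme (the function name `xavier_uniform` and the layer sizes below are illustrative, not from the source):

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_out, fan_in) weight matrix from the Xavier/Glorot
    uniform distribution U[-limit, limit], limit = sqrt(6 / (fan_in + fan_out))."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    # fan_in is the number of columns, matching a layer that computes W @ x
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

# Hypothetical layer mapping 256 inputs to 128 outputs
W = xavier_uniform(256, 128)
b = np.zeros(128)  # biases initialized to 0
```

Every sampled weight lies inside $[-\sqrt{6}/\sqrt{fan_{in}+fan_{out}}, \sqrt{6}/\sqrt{fan_{in}+fan_{out}}]$, which keeps the variance of activations and gradients roughly constant across layers.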
Task | Papers | Share |
---|---|---|
General Classification | 15 | 9.62% |
Object Detection | 14 | 8.97% |
Image Classification | 14 | 8.97% |
Classification | 10 | 6.41% |
Semantic Segmentation | 6 | 3.85% |
Quantization | 5 | 3.21% |
Face Recognition | 4 | 2.56% |
Network Pruning | 3 | 1.92% |
Face Verification | 3 | 1.92% |