Fraternal Dropout is a regularization method for recurrent neural networks that trains two identical copies of an RNN (sharing the same parameters) with different dropout masks, while minimizing the difference between their (pre-softmax) predictions. This encourages the RNN's representations to be invariant to the dropout mask, and hence more robust.
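A minimal NumPy sketch of the idea described above. The toy linear "RNN step", the shapes, the dropout rate, and the regularization weight `kappa` are illustrative assumptions, not the paper's experimental setup; the point is that both passes share the same weights `W` but use different dropout masks, and a penalty is placed on the squared distance between their pre-softmax outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, rng):
    # Inverted dropout: zero units with probability p, rescale survivors.
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

# Toy stand-in for an RNN step: one shared weight matrix.
# Both forward passes use the SAME W (shared parameters).
W = rng.standard_normal((4, 3))
h = rng.standard_normal((2, 4))  # hidden states for a batch of 2

p = 0.5
# Two forward passes with DIFFERENT dropout masks on the hidden state.
z1 = dropout(h, p, rng) @ W  # pre-softmax predictions, pass 1
z2 = dropout(h, p, rng) @ W  # pre-softmax predictions, pass 2

# Fraternal Dropout regularizer: squared L2 distance between the two
# pre-softmax predictions, averaged over the batch. kappa is a
# hypothetical regularization weight.
kappa = 0.1
fraternal_penalty = kappa * np.mean(np.sum((z1 - z2) ** 2, axis=1))

# In training, the total loss would be the task loss (e.g. cross-entropy,
# averaged over the two passes) plus this penalty term.
print(float(fraternal_penalty))
```

Because the penalty shrinks as the two masked passes agree, minimizing it pushes the shared network toward predictions that do not depend on which units were dropped.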
Source: Fraternal Dropout
Task | Papers | Share |
---|---|---|
Decoder | 1 | 25.00% |
Text Generation | 1 | 25.00% |
Image Captioning | 1 | 25.00% |
Language Modelling | 1 | 25.00% |