Training Techniques | NPID++, NPID, Weight Decay, SGD with Momentum, Cosine Annealing |
---|---|
Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
ID | rn50_in1k_npid_oss |
Training Techniques | NPID++, NPID, Weight Decay, SGD with Momentum, Cosine Annealing |
---|---|
Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
Epochs | 800 |
NPID++ (Non-Parametric Instance Discrimination) is a self-supervised learning method based on non-parametric instance-level classification: every image is treated as its own class, and classification is performed against a memory bank of instance embeddings rather than learned classifier weights. It improves upon NPID by using more negative samples and training for more epochs.
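The idea above can be sketched as a non-parametric softmax: the query embedding is scored against its own memory-bank entry (positive) and a set of randomly sampled bank entries (negatives). This is a minimal NumPy illustration under stated assumptions, not VISSL's actual implementation; the function name, sampling scheme, and default temperature are illustrative.

```python
import numpy as np

def npid_loss(features, indices, memory_bank,
              temperature=0.07, num_negatives=4096, rng=None):
    """Sketch of the NPID non-parametric instance-discrimination loss.

    Each image is its own class; the "classifier weights" are the
    L2-normalised embeddings stored in the memory bank. NPID++ differs
    mainly in scale: more negatives (e.g. 32k) and longer training.
    """
    rng = rng or np.random.default_rng(0)
    losses = []
    for f, i in zip(features, indices):
        f = f / np.linalg.norm(f)                     # normalise the query
        pos = memory_bank[i] @ f / temperature        # positive: own bank entry
        neg_idx = rng.choice(len(memory_bank), size=num_negatives, replace=True)
        neg = memory_bank[neg_idx] @ f / temperature  # sampled negatives
        logits = np.concatenate([[pos], neg])
        logits -= logits.max()                        # numerical stability
        # cross-entropy with the positive as the target class
        losses.append(-logits[0] + np.log(np.exp(logits).sum()))
    return float(np.mean(losses))
```

In the real method the memory bank is updated with a momentum average of the embeddings after each step; that update is omitted here for brevity.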
Get started with VISSL by trying one of the Colab tutorial notebooks.
@article{DBLP:journals/corr/abs-1912-01991,
author = {Ishan Misra and
Laurens van der Maaten},
title = {Self-Supervised Learning of Pretext-Invariant Representations},
journal = {CoRR},
volume = {abs/1912.01991},
year = {2019},
url = {http://arxiv.org/abs/1912.01991},
archivePrefix = {arXiv},
eprint = {1912.01991},
timestamp = {Thu, 02 Jan 2020 18:08:18 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1912-01991.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
@misc{goyal2021vissl,
author = {Priya Goyal and Benjamin Lefaudeux and Mannat Singh and Jeremy Reizenstein and Vinicius Reis and
Min Xu and Matthew Leavitt and Mathilde Caron and Piotr Bojanowski and Armand Joulin and
Ishan Misra},
title = {VISSL},
howpublished = {\url{https://github.com/facebookresearch/vissl}},
year = {2021}
}
BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
---|---|---|---|---|
ImageNet | NPID++ ResNet-50-w2 (32k negatives, 800 epochs, cosine LR) | Top 1 Accuracy | 62.73% | # 301 |
ImageNet | NPID++ ResNet-50 (32k negatives, 800 epochs, cosine LR) | Top 1 Accuracy | 56.68% | # 305 |