Review: PR-260: Momentum Contrast for Unsupervised Visual Representation Learning
- The field of self-supervised contrastive learning has changed rapidly in recent years (SimCLR, MoCo, BYOL, …).
- Within SSL-contrastive learning, SimCLR has large memory requirements because it relies on very large batch sizes, while memory-bank instance discrimination suffers from inconsistency because the stored representations come from outdated encoders.
- MoCo resolves both issues: it maintains a queue of encoded keys, decoupling the dictionary size from the batch size, and keeps those keys consistent by encoding them with a momentum encoder, an exponential moving average of the query encoder.
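
The two mechanisms above can be sketched in a few lines. This is a toy illustration, not MoCo's actual PyTorch implementation: plain NumPy arrays stand in for encoder weights, the momentum value and queue size are illustrative, and `momentum_update` is a hypothetical helper name.

```python
from collections import deque

import numpy as np

def momentum_update(theta_k, theta_q, m=0.999):
    """MoCo's key-encoder update: theta_k <- m * theta_k + (1 - m) * theta_q.

    No gradient flows into theta_k; it only tracks a slow moving
    average of the query encoder's parameters.
    """
    return [m * k + (1.0 - m) * q for k, q in zip(theta_k, theta_q)]

# Queue of encoded keys: enqueue the newest batch, drop the oldest.
# Toy capacity of 4 batches; MoCo uses a much larger queue (e.g. 65536 keys).
queue = deque(maxlen=4)
for step in range(6):
    keys = np.random.randn(2, 8)  # a batch of 2 key features, dim 8
    queue.append(keys)

# Toy "weights" to show the EMA step (m=0.5 chosen so the effect is visible).
theta_q = [np.full(3, 2.0)]   # query encoder parameters
theta_k = [np.zeros(3)]       # key encoder parameters
theta_k = momentum_update(theta_k, theta_q, m=0.5)
print(theta_k[0])   # -> [1. 1. 1.]
print(len(queue))   # -> 4 (the two oldest batches were dequeued)
```

With a realistic momentum like m=0.999, the key encoder changes very slowly, so every key in the queue was produced by nearly the same encoder. That is what restores the consistency the memory bank lacks.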