Review: PR-260 - Momentum Contrast for Unsupervised Visual Representation Learning

Joonsu Oh
Mar 21, 2022
  1. The field of self-supervised contrastive learning has changed drastically in recent years (SimCLR, MoCo, BYOL, ...).
  2. Among these methods, SimCLR requires a lot of memory because it relies on very large batch sizes, while memory-bank-based instance discrimination suffers from inconsistency because the stored representations become outdated.
  3. MoCo resolves both issues by keeping a queue of encoded keys and encoding them with a momentum encoder, an exponential moving average of the query encoder, so the keys stay consistent without large batches (a minimal sketch of the update follows below).
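
A minimal sketch of the momentum update described in item 3, assuming PyTorch. The toy encoder architecture, the momentum value `m = 0.999`, and the function name `momentum_update` are illustrative choices, not the authors' code.

```python
# Minimal sketch (assumed PyTorch, not the official MoCo code) of the momentum update.
import copy
import torch
import torch.nn as nn

m = 0.999  # momentum coefficient; the paper reports that a large value works best

# Toy encoders for illustration; in MoCo these would be ResNet backbones.
encoder_q = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 8))
encoder_k = copy.deepcopy(encoder_q)      # key (momentum) encoder starts as a copy
for p in encoder_k.parameters():
    p.requires_grad = False               # key encoder is never updated by gradients

@torch.no_grad()
def momentum_update():
    # theta_k <- m * theta_k + (1 - m) * theta_q  (exponential moving average)
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1 - m)

# Call this after each gradient step on encoder_q: the key encoder drifts slowly,
# which keeps the keys stored in the queue consistent with each other.
momentum_update()
```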
