Self-Supervised Transfer
SimCLR
A simple contrastive learning framework that maximizes agreement between two augmented views of the same sample in a latent space, after passing them through an encoder and a small projection head. Agreement is measured with a normalized temperature-scaled cross-entropy (NT-Xent) loss, where the two views of a sample form a positive pair and all other samples in the batch serve as negatives. The paper shows that the composition of data augmentations and the use of large batch sizes (more negatives) are crucial for performance.
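A minimal NumPy sketch of the NT-Xent loss described above (the function name and shapes are illustrative, not SimCLR's reference implementation): each row `i` of `z1` and `z2` is a positive pair, and every other row in the batch acts as a negative.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent loss sketch. z1, z2: (N, d) projections of two
    augmented views of the same N samples."""
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # scaled cosine similarities
    n = z1.shape[0]
    # Exclude self-similarity so a sample is never its own negative.
    np.fill_diagonal(sim, -np.inf)
    # Row i's positive sits at i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Cross-entropy per row: -log softmax at the positive index.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)
    return loss.mean()
```

Matching views (light augmentation noise) should yield a lower loss than unrelated ones, since the positive pair's similarity then dominates each row's softmax.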