Knowledge Distillation
Pretext task
An auxiliary self-supervised task designed to force the model to learn useful representations, for example predicting masked parts of the input or reconstructing corrupted inputs.
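A minimal sketch of how a masked-prediction pretext task generates its own labels: the original sequence is corrupted, and the hidden tokens become the targets the model must recover. The function name, mask rate, and mask token below are illustrative choices, not part of any specific library.

```python
import random

def make_pretext_pairs(sequence, mask_rate=0.3, mask_token="[MASK]", seed=0):
    """Build a (corrupted_input, targets) pair for a masked-prediction
    pretext task: the data labels itself, no human annotation needed."""
    rng = random.Random(seed)
    corrupted = []
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(sequence):
        if rng.random() < mask_rate:
            corrupted.append(mask_token)  # hide this token from the model
            targets[i] = tok              # the original token is the label
        else:
            corrupted.append(tok)
    return corrupted, targets

tokens = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, targets = make_pretext_pairs(tokens)
```

Training a model to fill in `targets` from `corrupted` forces it to learn representations of the input's structure, which can then be transferred to downstream tasks.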