BERT (Bidirectional Encoder Representations from Transformers)
DistilBERT (Distilled BERT)
A lightweight version of BERT created through knowledge distillation; it retains about 97% of BERT-base's language-understanding performance with 40% fewer parameters, enabling roughly 60% faster inference.
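DistilBERT's actual training objective combines a masked-language-modeling loss, a cosine embedding loss, and a soft-target distillation loss. As a rough illustration of the distillation idea only, here is a minimal pure-Python sketch of a temperature-scaled soft-target loss (the function names and the toy logits are illustrative, not from the DistilBERT codebase):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between teacher and student soft targets,
    # scaled by T^2 so gradients keep a comparable magnitude
    # (as in Hinton et al.'s distillation formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Toy example: a student whose logits are close to the teacher's
# incurs a small distillation loss.
teacher = [3.0, 1.0, 0.2]
student = [2.8, 1.1, 0.3]
print(distillation_loss(teacher, student))
```

Minimizing this term pushes the smaller student network to match the teacher's full output distribution, not just its top prediction, which is how the compressed model preserves most of the teacher's behavior.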