AI Glossary
A complete glossary of artificial intelligence
Pre-trained Model
Neural model already trained on a large corpus of generic data, serving as a starting point for specific tasks thanks to its ability to extract relevant features.
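The glossary names no framework, so as an illustration here is a minimal PyTorch sketch of the idea: a tiny MLP stands in for a model "pre-trained" on generic data, and its saved weights become the starting point for a new task via a fresh head (all module names and sizes are hypothetical).

```python
import torch
import torch.nn as nn

# Hypothetical "pre-trained" backbone; in practice this would come from a
# model zoo. Its state_dict holds the weights learned on generic data.
backbone = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
)
state = backbone.state_dict()

# Starting point for a specific task: same architecture, reloaded weights,
# plus a new task-specific head.
model = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
)
model.load_state_dict(state)   # reuse the pre-learned feature extractor
head = nn.Linear(32, 2)        # new classifier for the target task

features = model(torch.randn(4, 16))  # extracted features
logits = head(features)
print(logits.shape)                   # torch.Size([4, 2])
```

The backbone provides the feature extraction; only the head starts from scratch.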
Freeze Layers
Technique that disables weight updates for selected network layers during training, preserving pre-learned knowledge and reducing computational cost.
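A minimal PyTorch sketch of layer freezing (framework choice assumed): setting `requires_grad = False` on a layer's parameters excludes them from gradient computation, so they keep their pre-learned values.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))

# Freeze the first linear layer: its weights receive no gradients.
for p in model[0].parameters():
    p.requires_grad = False

loss = model(torch.randn(4, 8)).sum()
loss.backward()

print(model[0].weight.grad)              # None — frozen, no gradient
print(model[2].weight.grad is not None)  # True — still trainable
```

An optimizer built only over trainable parameters then never touches the frozen layer.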
Learning Rate
Crucial hyperparameter controlling the magnitude of model weight updates at each iteration, generally requiring a lower value during fine-tuning to preserve pre-learned knowledge.
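To make the role of the learning rate concrete, here is a hedged sketch of plain gradient descent on a toy quadratic (function and values are illustrative, not from the glossary): the learning rate scales each weight update.

```python
import torch

def descend(lr, steps=50):
    """Minimize f(w) = (w - 3)^2 with vanilla gradient descent."""
    w = torch.tensor(0.0, requires_grad=True)
    for _ in range(steps):
        loss = (w - 3.0) ** 2
        loss.backward()
        with torch.no_grad():
            w -= lr * w.grad  # the learning rate scales the step size
        w.grad.zero_()
    return w.item()

# A moderate rate converges; fine-tuning typically uses a much lower
# rate so pre-learned weights are only nudged, not overwritten.
print(round(descend(lr=0.1), 2))  # 3.0
```

Too large a rate overshoots the minimum; too small a rate barely moves the weights.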
Progressive Unfreezing
Fine-tuning strategy where model layers are progressively unfrozen, starting with the upper layers and gradually moving down to the lower layers.
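The staging described above can be sketched in PyTorch (layer stack and sizes are hypothetical): start with everything frozen, then unfreeze one layer per stage, beginning with the layer closest to the output.

```python
import torch.nn as nn

# Stack of layers; the last index is the "top" (closest to the output).
layers = nn.ModuleList([nn.Linear(8, 8) for _ in range(3)])

# Start fully frozen.
for layer in layers:
    for p in layer.parameters():
        p.requires_grad = False

def trainable(mods):
    return sum(p.numel() for m in mods for p in m.parameters()
               if p.requires_grad)

# Unfreeze one layer per stage, top layer first.
counts = []
for stage in range(len(layers)):
    for p in layers[len(layers) - 1 - stage].parameters():
        p.requires_grad = True
    counts.append(trainable(layers))

print(counts)  # [72, 144, 216] — trainable parameters grow each stage
```

In a real run, a few epochs of training would happen between the unfreezing stages.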
Discriminative Learning Rates
Approach applying different learning rates to different model layers, typically higher rates for upper layers and lower rates for lower layers during fine-tuning.
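In PyTorch (assumed here as the framework) this maps directly onto optimizer parameter groups, each with its own learning rate — a small sketch with illustrative values:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))

# Lower rate for the early (generic) layer, higher rate for the
# later (task-specific) layer.
optimizer = torch.optim.SGD(
    [
        {"params": model[0].parameters(), "lr": 1e-4},
        {"params": model[2].parameters(), "lr": 1e-3},
    ],
    lr=1e-3,  # default for any group without an explicit rate
)

print([g["lr"] for g in optimizer.param_groups])  # [0.0001, 0.001]
```

Each `optimizer.step()` then applies the group's own rate to its parameters.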
Adapter Modules
Small neural networks inserted between the layers of a pre-trained model, allowing adaptation to new tasks while keeping the original weights frozen.
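A hedged sketch of a bottleneck adapter in PyTorch (the `Adapter` class, dimensions, and bottleneck size are illustrative): the pre-trained layer stays frozen while only the small adapter is trained.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project,
    with a residual connection around the whole block."""
    def __init__(self, dim, bottleneck=4):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

frozen = nn.Linear(32, 32)  # stands in for a pre-trained layer
for p in frozen.parameters():
    p.requires_grad = False

adapted = nn.Sequential(frozen, Adapter(32))

trainable = sum(p.numel() for p in adapted.parameters() if p.requires_grad)
total = sum(p.numel() for p in adapted.parameters())
print(trainable, total)  # 292 1348 — only the small adapter is trained
```

The residual connection lets the adapter start near the identity, so the frozen model's behavior is preserved at the beginning of fine-tuning.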
Weight Initialization
Process determining the initial values of model weights before training, crucial in fine-tuning to maintain the benefits of pre-training while allowing effective adaptation.
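A short PyTorch sketch of the two cases that matter in fine-tuning (initializer choices are illustrative): pre-trained layers keep their learned weights, while a newly added head is explicitly initialized from scratch.

```python
import torch
import torch.nn as nn

# A layer trained from scratch would be initialized explicitly, e.g.
# Xavier/Glorot, which keeps activation variance roughly stable.
layer = nn.Linear(64, 64)
nn.init.xavier_uniform_(layer.weight)
nn.init.zeros_(layer.bias)

# In fine-tuning, the pre-trained body is NOT re-initialized; only the
# new task head starts fresh, here with small random weights.
head = nn.Linear(64, 10)
nn.init.normal_(head.weight, mean=0.0, std=0.01)
nn.init.zeros_(head.bias)

print(layer.weight.shape)  # torch.Size([64, 64])
```

Small initial head weights keep early gradients modest, which helps protect the pre-trained body during the first fine-tuning steps.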