LoRA (Low-Rank Adaptation)
PEFT (Parameter-Efficient Fine-Tuning)
A category of model adaptation techniques that fine-tune only a small fraction of a model's parameters while keeping the rest frozen. LoRA is one of the most popular PEFT approaches, along with Adapters, Prefix-Tuning, and soft prompts.
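The core idea of LoRA can be sketched in a few lines: instead of updating a full weight matrix W, it learns a low-rank update B·A (rank r much smaller than the matrix dimensions) that is added to the frozen weights. The snippet below is a minimal illustration in NumPy, not any specific library's API; the names (`lora_forward`, `alpha`) and sizes are chosen for the example.

```python
import numpy as np

d_in, d_out, r = 8, 8, 2          # toy dimensions; in practice r << d_in, d_out
rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                # trainable, zero-initialized
alpha = 1.0                             # scaling hyperparameter

def lora_forward(x):
    # Adapted layer: y = W x + (alpha / r) * B A x; only A and B are trained.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B initialized to zero, the adapted model starts identical to the base model.
assert np.allclose(lora_forward(x), W @ x)

full_params = W.size            # parameters a full fine-tune would update
lora_params = A.size + B.size   # r * (d_in + d_out), far fewer for large layers
```

Because B starts at zero, training begins from the pretrained model's behavior, and after training the update B·A can be merged into W so inference incurs no extra cost.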