RAG Fine-tuning
Knowledge Distillation for RAG
A technique for transferring knowledge from a large RAG model (the teacher) to a lighter model (the student) while preserving its retrieval-grounded reasoning capabilities.
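A minimal sketch of the core distillation objective, assuming the standard temperature-scaled KL divergence between teacher and student output distributions (Hinton-style distillation). The logit values and the idea that both models score answer tokens given a retrieval-augmented input are illustrative assumptions, not part of the original definition:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T gives softer distributions,
    # exposing more of the teacher's "dark knowledge".
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # rescaled by T^2 so gradients keep a comparable magnitude.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Hypothetical logits over candidate answer tokens; in a RAG setting both
# models would score the same query with the retrieved passages prepended.
teacher_logits = [3.2, 1.1, 0.3]
student_logits = [2.5, 1.4, 0.6]
loss = distillation_loss(teacher_logits, student_logits)
```

In practice this soft-label loss is typically mixed with the ordinary cross-entropy on gold answers, and the retrieved context is shared between teacher and student so the student learns to condition on evidence the same way the teacher does.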