Parameter Regularization

Orthogonal Gradient Descent (OGD)

A continual-learning method that projects the gradient of the new task onto the subspace orthogonal to gradient directions stored from previous tasks. To first order, this projection prevents updates for the new task from moving the model along directions that would change its behavior on past tasks, thereby mitigating catastrophic forgetting.
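A minimal NumPy sketch of the projection step (an illustration, not the paper's implementation, which stores gradients of the model's outputs on past examples): stored directions are kept orthonormal via Gram-Schmidt, and each new-task gradient is projected onto their orthogonal complement before the update.

```python
import numpy as np

def project_orthogonal(grad, basis):
    """Remove from `grad` every component along the stored directions.

    `basis` is a list of orthonormal vectors (gradient directions
    retained from previous tasks). The result is orthogonal to all of them.
    """
    for u in basis:
        grad = grad - np.dot(grad, u) * u  # subtract component along u
    return grad

def store_direction(basis, grad, eps=1e-10):
    """Gram-Schmidt step: orthonormalize `grad` against `basis` and append it."""
    g = project_orthogonal(np.array(grad, dtype=float), basis)
    norm = np.linalg.norm(g)
    if norm > eps:  # skip directions already spanned by the basis
        basis.append(g / norm)
    return basis

# Toy usage: one direction stored from a "previous task",
# then a new-task gradient is projected before the weight update.
basis = store_direction([], np.array([1.0, 0.0, 0.0]))
new_grad = np.array([2.0, 3.0, 0.0])
g_proj = project_orthogonal(new_grad, basis)
# g_proj is now orthogonal to the stored direction
```

In practice the basis grows with each task, so implementations cap its size or keep a low-rank approximation of the past-task gradient subspace.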
