
AI Glossary

The complete dictionary of artificial intelligence

162 categories · 2,032 subcategories · 23,060 terms

Multi-task CNN

Convolutional neural network architecture designed to simultaneously perform multiple computer vision tasks by sharing learned representations between different outputs.

Parameter sharing

Technique where network weights are reused across different tasks to reduce computational complexity and improve generalization through joint learning.

Multiple output branches

Architectural structure featuring multiple specialized classification or regression heads, each dedicated to a specific task while sharing a common feature extraction backbone.
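The shared-backbone-with-heads pattern described above can be sketched in a few lines. This is a minimal illustration in plain NumPy (not a framework implementation); all dimensions, weight matrices, and the choice of one classification head and one regression head are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
d_in, d_shared, n_classes, d_reg = 8, 16, 3, 1

# Shared feature-extraction backbone (parameters reused by every task).
W_shared = rng.normal(size=(d_in, d_shared))

# Two task-specific output branches ("heads").
W_cls = rng.normal(size=(d_shared, n_classes))  # classification head
W_reg = rng.normal(size=(d_shared, d_reg))      # regression head

def forward(x):
    """One input, multiple outputs: shared features feed every head."""
    h = np.maximum(0, x @ W_shared)  # shared ReLU features
    logits = h @ W_cls               # classification branch
    value = h @ W_reg                # regression branch
    return logits, value

x = rng.normal(size=(4, d_in))       # a batch of 4 inputs
logits, value = forward(x)
print(logits.shape, value.shape)     # (4, 3) (4, 1)
```

Both heads read the same `h`, which is exactly the parameter sharing the previous entry describes: the backbone's weights receive gradient signal from every task.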

Multi-task learning

Learning paradigm where a model is trained simultaneously on multiple tasks to benefit from positive transfer effects and improve overall performance.

Shared architecture

Network design where lower layers extract common features shared across tasks, while upper layers specialize for specific predictions.

Multi-task encoder-decoder

Encoder-decoder structure in which a shared encoder extracts common features and multiple decoders generate specialized predictions for different vision tasks.

Feature Pyramid Network (FPN)

Hierarchical architecture generating multi-scale feature maps shared across tasks, enabling capture of both fine details and global semantic context.

Multi-task attention

Task-conditioned attention mechanism that dynamically modifies feature representations to optimize their relevance for each specific output.
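One simple way to realize task-conditioned attention is a per-task gate over feature channels. The sketch below reduces the idea to a learned sigmoid gate per task (the gate vectors and task names are invented for illustration; real attention mechanisms are usually richer than a static channel gate).

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6
features = rng.normal(size=(5, d))   # a batch of shared features

# Hypothetical per-task gate parameters (would be learned in practice).
task_gates = {
    "segmentation": rng.normal(size=d),
    "depth": rng.normal(size=d),
}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attend(feats, task):
    """Re-weight shared features according to the requesting task."""
    gate = sigmoid(task_gates[task])  # one value in (0, 1) per channel
    return feats * gate               # broadcast over the batch

seg_feats = attend(features, "segmentation")
depth_feats = attend(features, "depth")
print(seg_feats.shape)               # (5, 6)
```

The same shared features yield different task-specific representations, because each task amplifies the channels relevant to it and suppresses the rest.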

Multi-task weighted loss

Loss function combining multiple loss terms with adaptive or fixed weights to balance the relative influence of each task during training.
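The simplest version of such a combined loss is a fixed weighted sum, as sketched below. The loss values and weights are made up for illustration; adaptive schemes would update the weights during training instead of fixing them.

```python
def multitask_loss(losses, weights):
    """Weighted sum of per-task losses; weights balance task influence."""
    assert losses.keys() == weights.keys()
    return sum(weights[t] * losses[t] for t in losses)

# Hypothetical per-task loss values after one forward pass.
losses = {"classification": 0.9, "depth": 4.0}
# Fixed weights, down-weighting the task with the larger raw loss scale.
weights = {"classification": 1.0, "depth": 0.25}

total = multitask_loss(losses, weights)
print(total)  # 1.0 * 0.9 + 0.25 * 4.0 = 1.9
```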

Auxiliary tasks

Secondary tasks added to improve the learning of the main task by providing additional regularization signals and more robust representations.

Multi-task knowledge transfer

Phenomenon where the joint learning of multiple tasks enables beneficial knowledge transfer, improving individual performance compared to isolated learning.

Feature fusion

Process of combining features from different layers or modalities to create enriched representations suitable for the simultaneous execution of multiple tasks.
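The most common fusion operation is channel-wise concatenation followed by a learned projection. A minimal sketch, with all shapes and weights hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical features from two different layers of a network.
low_level = rng.normal(size=(4, 8))    # fine spatial detail
high_level = rng.normal(size=(4, 16))  # semantic context

# Fuse by concatenating channels, then projecting to a shared width.
W_fuse = rng.normal(size=(8 + 16, 12))
fused = np.concatenate([low_level, high_level], axis=1) @ W_fuse

print(fused.shape)  # (4, 12): one enriched representation for all tasks
```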

Task decoupling

Architectural approach that explicitly separates task-specific parameters while maintaining a shared backbone to optimize individual performance.

Dynamic routing

Adaptive mechanism that conditionally selects computational paths in the network based on the specific requirements of each task to optimize efficiency.

Universal representations

Generic features learned on multiple tasks and transferable to new tasks with minimal adaptation, promoting robust generalization.

Multi-task fine-tuning

Process of progressively adapting a pre-trained model on multiple tasks simultaneously to optimize performance on a target set of applications.

Gradient balancing

Optimization technique that dynamically adjusts each task's contribution to the gradient updates, preventing any single task from dominating the others.
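A crude form of this idea rescales each task's gradient to a common norm before the shared update. The sketch below equalizes gradient norms to their mean; it is a simplified illustration, not any specific published balancing method, and the gradient values are invented.

```python
import numpy as np

def balance_gradients(task_grads, eps=1e-8):
    """Rescale every task's gradient to the mean norm across tasks,
    so no single task dominates the shared-parameter update."""
    norms = {t: np.linalg.norm(g) for t, g in task_grads.items()}
    target = np.mean(list(norms.values()))
    return {t: g * (target / (norms[t] + eps))
            for t, g in task_grads.items()}

# Hypothetical gradients w.r.t. shared weights from two tasks.
grads = {"a": np.array([3.0, 4.0]),    # norm 5.0
         "b": np.array([0.3, 0.4])}    # norm 0.5

balanced = balance_gradients(grads)
# After balancing, both gradients have norm ~2.75 (the mean of 5.0 and 0.5).
print({t: round(float(np.linalg.norm(g)), 3) for t, g in balanced.items()})
```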

Multi-task normalization

Normalization strategies adapted to multi-task architectures, handling distribution differences between tasks to stabilize joint training.

Cross-stitch Networks

Multi-task architecture using cross-stitch units to linearly combine activations from specialized networks, enabling flexible collaborative learning.
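The core of a cross-stitch unit is a small learned mixing matrix applied to the activations of the two task streams. A minimal sketch (the 2×2 matrix values below are illustrative; in the actual architecture they are learned jointly with the network):

```python
import numpy as np

rng = np.random.default_rng(3)

# Activations at the same layer of two task-specific networks.
x_a = rng.normal(size=(4, 6))
x_b = rng.normal(size=(4, 6))

# Cross-stitch unit: a 2x2 mixing matrix that linearly combines
# the two streams. Diagonal terms keep a stream's own activations;
# off-diagonal terms let information flow between tasks.
alpha = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

out_a = alpha[0, 0] * x_a + alpha[0, 1] * x_b
out_b = alpha[1, 0] * x_a + alpha[1, 1] * x_b
print(out_a.shape, out_b.shape)  # (4, 6) (4, 6)
```

When the off-diagonal entries are near zero the networks stay independent; larger values move the pair toward fully shared representations, which is the "flexible collaborative learning" the entry refers to.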
