AI Glossary

The complete dictionary of artificial intelligence

162 categories · 2,032 subcategories · 23,060 terms

📖 terms

LeNet-5

Pioneering CNN architecture introduced by Yann LeCun in 1998 for handwritten digit recognition; its 7 layers combine convolutional, pooling and fully connected layers.

AlexNet

Revolutionary convolutional neural network that won the 2012 ImageNet competition, popularizing ReLU activations, dropout and large-scale data augmentation with its 60 million parameters.

VGGNet

CNN architecture characterized by its exclusive use of stacked 3x3 filters, demonstrating that increased depth (up to 19 layers) improves performance.

Convolution Layer

Fundamental layer applying learnable filters on input to detect hierarchical spatial patterns through sliding convolution operations.
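The sliding operation can be sketched in plain Python. This is a minimal valid-padding (no border handling) convolution with a single hypothetical vertical-edge kernel; real CNN layers add multiple channels, stride, padding and a bias term.

```python
def conv2d(image, kernel):
    # Valid 2D convolution: slide the kernel over the image and take the
    # element-wise dot product at each position, producing a feature map.
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# Toy image: left half dark, right half bright. The kernel responds
# wherever intensity increases from left to right (a vertical edge).
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 0, 1]] * 3
print(conv2d(image, kernel))  # → [[3, 3], [3, 3]]
```

Every position of the 4x4 input overlaps the edge, so all entries of the 2x2 output respond equally strongly.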

ReLU Activation

Non-linear activation function f(x) = max(0, x) that accelerates convergence and mitigates the vanishing-gradient problem associated with sigmoid activations.
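The definition translates directly into one line of Python; a minimal sketch of the scalar function:

```python
def relu(x):
    # f(x) = max(0, x): negatives are clipped to zero, positives pass through
    return max(0.0, x)

print([relu(v) for v in [-2.0, 0.0, 1.5]])  # → [0.0, 0.0, 1.5]
```

Because the positive branch has constant slope 1, gradients flowing back through active units are neither squashed nor amplified.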

Dropout

Regularization technique randomly deactivating neurons during training to prevent overfitting and improve generalization.
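A minimal sketch of the common "inverted dropout" variant (the scaling-by-1/(1-p) choice is an assumption; the glossary entry does not specify a variant):

```python
import random

def dropout(activations, p, training=True, rng=random):
    # During training, zero each activation with probability p and scale
    # survivors by 1/(1-p) so the expected activation matches inference,
    # where the layer is simply the identity.
    if not training:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```

At inference time (`training=False`) no units are dropped and no rescaling is needed, which is exactly why the inverted formulation is convenient.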

Feature Maps

Three-dimensional outputs of convolution layers representing the presence of specific features at different spatial positions of the input.

Filters/Kernels

Learnable weight matrices sliding over input to detect specific patterns like edges, textures and complex shapes.

Fully Connected Layer

Final layer where each neuron is connected to all neurons of the previous layer, performing the final classification based on the extracted features.
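The "every neuron sees every input" structure is just a matrix-vector product plus a bias; a minimal sketch with hypothetical toy weights:

```python
def fully_connected(x, weights, biases):
    # Each output neuron computes a weighted sum over ALL inputs plus a bias:
    # y_j = sum_i W[j][i] * x_i + b_j
    return [sum(xi * wi for xi, wi in zip(x, w_row)) + b
            for w_row, b in zip(weights, biases)]

x = [1.0, 2.0]
W = [[0.5, -1.0],   # weights of output neuron 0
     [1.0,  1.0]]   # weights of output neuron 1
b = [0.0, 0.5]
print(fully_connected(x, W, b))  # → [-1.5, 3.5]
```

In a CNN classifier the input `x` would be the flattened feature maps, and the outputs would feed a softmax over class scores.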

Transfer Learning

Technique reusing the weights of a model pre-trained on large datasets to accelerate learning on specific tasks with less data.
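The freeze-the-backbone, train-a-new-head pattern can be sketched in pure Python. Everything here is a toy stand-in: `PRETRAINED_W` plays the role of weights learned on a large dataset, and the "backbone" is a fixed linear map rather than a real convolutional network.

```python
# Pretend these weights come from a model pre-trained on a large dataset;
# we freeze them and train only a new head on our small task.
PRETRAINED_W = [0.8, -0.3]

def extract_features(x):
    # Frozen feature extractor: never updated during fine-tuning.
    return [x * w for w in PRETRAINED_W]

def train_head(data, lr=0.1, epochs=50):
    # Train only the new head's weights with plain gradient descent on
    # squared error; the backbone weights above stay fixed.
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x, y in data:
            f = extract_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, f))
            err = pred - y
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
    return w

head = train_head([(1.0, 1.0), (2.0, 2.0)])  # tiny task with little data
```

Only the two head weights are learned, which is why transfer learning works with far fewer labeled examples than training from scratch.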

Batch Normalization

Normalization of activations between layers stabilizing training, allowing higher learning rates and reducing sensitivity to initialization.
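A minimal sketch of the normalization step for a single feature across one batch (the learnable scale `gamma` and shift `beta`, and the small `eps` for numerical stability, follow the standard formulation):

```python
def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize one feature across the batch to zero mean / unit variance,
    # then rescale with learnable gamma and shift with beta.
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in batch]

out = batch_norm([1.0, 2.0, 3.0, 4.0])
print(out)  # centered around 0 with unit variance
```

Because each layer then receives inputs on a stable scale regardless of how earlier weights drift, larger learning rates become usable.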

Data Augmentation

Synthetic generation of training data by transformations (rotations, flips, zooms) to increase dataset diversity and improve robustness.
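Two of the named transformations, sketched on a tiny 2x2 "image" represented as nested lists (real pipelines operate on tensors and compose many such transforms at random during training):

```python
def hflip(img):
    # Horizontal flip: reverse each row.
    return [row[::-1] for row in img]

def rotate90(img):
    # 90° clockwise rotation: read the rows bottom-up as columns.
    return [list(row) for row in zip(*img[::-1])]

img = [[1, 2],
       [3, 4]]
print(hflip(img))     # → [[2, 1], [4, 3]]
print(rotate90(img))  # → [[3, 1], [4, 2]]
```

Each transform yields a new labeled example for free, since a flipped or rotated cat is still a cat.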

Local Response Normalization

Normalization introduced in AlexNet creating competition between adjacent neurons to improve generalization and reduce excessive activations.
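The cross-channel competition can be sketched as follows; the default hyperparameters (`k`, `alpha`, `beta`, window size `n`) follow the values reported for AlexNet, and the input is a 1-D list of activations at one spatial position across channels:

```python
def local_response_norm(channels, k=2.0, alpha=1e-4, beta=0.75, n=5):
    # Each activation is divided by a term that grows with the squared
    # activations of its n neighbouring channels, so strong neighbours
    # suppress it ("lateral inhibition").
    out = []
    num = len(channels)
    half = n // 2
    for i, a in enumerate(channels):
        lo, hi = max(0, i - half), min(num - 1, i + half)
        ssq = sum(channels[j] ** 2 for j in range(lo, hi + 1))
        out.append(a / (k + alpha * ssq) ** beta)
    return out
```

With `alpha = 0` the denominator reduces to `k ** beta`, so the layer becomes a plain rescaling; the suppression only kicks in when neighbouring activations are large.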
