
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

LeNet-5

Pioneering CNN architecture introduced by Yann LeCun in 1998 for handwritten digit recognition, with seven layers combining convolution, subsampling (pooling), and fully connected layers.

AlexNet

Convolutional neural network that won the 2012 ImageNet challenge (ILSVRC) by a wide margin, popularizing ReLU activations, dropout, and large-scale data augmentation with about 60 million parameters.

VGGNet

CNN architecture built almost exclusively from stacked 3×3 convolution filters, demonstrating that increased depth (up to 19 weight layers) improves performance.

Convolution Layer

Fundamental layer that slides learnable filters over the input to detect spatial patterns, building a hierarchy of features from simple edges to complex shapes.
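
The sliding operation can be sketched in plain Python (a minimal "valid"-padding, stride-1 example; the function name is illustrative, not from any library):

```python
def conv2d(image, kernel):
    """Slide a 2D kernel over a 2D input ("valid" padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Multiply the kernel element-wise with the patch under it, then sum
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A 3x3 vertical-edge detector applied to a 4x4 input
image = [[1, 1, 0, 0]] * 4
kernel = [[1, 0, -1]] * 3
print(conv2d(image, kernel))  # [[3, 3], [3, 3]] — a 2x2 feature map
```

Real layers add padding, strides, and many filters per layer, but the core idea is exactly this sliding sum of products.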

ReLU Activation

Non-linear activation function f(x)=max(0,x) that accelerates convergence and mitigates the vanishing gradient problem that affects sigmoid and tanh activations.
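
The definition above is one line of code; a minimal sketch:

```python
def relu(x):
    # f(x) = max(0, x): negative inputs are zeroed, positive inputs pass unchanged
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # [0.0, 0.0, 0.0, 1.5]
```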

Dropout

Regularization technique that randomly deactivates neurons during training to prevent overfitting and improve generalization.
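
A minimal sketch of the common "inverted dropout" variant, which scales surviving activations so no rescaling is needed at inference (names are illustrative):

```python
import random

def dropout(activations, p=0.5, training=True):
    """Zero each unit with probability p during training, scaling the
    survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training:
        return list(activations)  # no-op at inference time
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0 for a in activations]

random.seed(0)
print(dropout([1.0, 2.0, 3.0, 4.0], p=0.5))  # each value is either 0.0 or doubled
```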

Feature Maps

Outputs of convolution layers (one 2D map per filter, stacked into a 3D volume) indicating the presence of specific features at each spatial position of the input.

Filters/Kernels

Learnable weight matrices that slide over the input to detect specific patterns such as edges, textures, and more complex shapes.

Fully Connected Layer

Layer, typically at the end of the network, in which each neuron is connected to all neurons of the previous layer, performing the final classification from the extracted features.
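
"Connected to all neurons of the previous layer" is just a weighted sum per output neuron, i.e. a matrix-vector product plus a bias. A minimal sketch (names illustrative):

```python
def fully_connected(inputs, weights, biases):
    # Each output neuron is a weighted sum over ALL inputs, plus its bias
    return [
        sum(w * x for w, x in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ]

# 3 inputs mapped to 2 output neurons
out = fully_connected([1.0, 2.0, 3.0],
                      [[1, 0, 2], [0, 1, -1]],  # one weight row per output neuron
                      [1, 0])
print(out)  # [8.0, -1.0]
```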

Transfer Learning

Technique that reuses the weights of a model pre-trained on large datasets to accelerate learning on a specific task with less data.

Batch Normalization

Normalization of activations between layers that stabilizes training, allows higher learning rates, and reduces sensitivity to weight initialization.
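
The core computation, sketched for a single unit over one batch (gamma and beta are the learnable scale and shift; a real layer also tracks running statistics for inference):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of activations to zero mean / unit variance,
    then apply the learnable scale (gamma) and shift (beta)."""
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

normed = batch_norm([2.0, 4.0, 6.0, 8.0])
print([round(v, 3) for v in normed])  # [-1.342, -0.447, 0.447, 1.342]
```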

Data Augmentation

Synthetic expansion of the training set via label-preserving transformations (rotations, flips, zooms) to increase dataset diversity and improve robustness.
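
Two of the listed transformations sketched on a tiny image represented as nested lists (function names are illustrative):

```python
def horizontal_flip(image):
    # Mirror each row left-to-right; the label stays the same
    return [row[::-1] for row in image]

def rotate_90(image):
    # Rotate 90 degrees clockwise: reverse the rows, then transpose
    return [list(row) for row in zip(*image[::-1])]

img = [[1, 2],
       [3, 4]]
print(horizontal_flip(img))  # [[2, 1], [4, 3]]
print(rotate_90(img))        # [[3, 1], [4, 2]]
```

In practice these transforms are applied randomly on the fly during training, so the model rarely sees the exact same input twice.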

Local Response Normalization

Normalization scheme introduced in AlexNet that creates competition between adjacent neurons (channels) to improve generalization and damp excessive activations.
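
A sketch of the across-channel form with the AlexNet paper's hyperparameters (k=2, n=5, α=1e-4, β=0.75): each activation is divided by a term that grows with the squared activations of its nearest channel neighbours.

```python
def lrn(channel_acts, k=2.0, n=5, alpha=1e-4, beta=0.75):
    """Local response normalization across channels: strong neighbours
    suppress a unit's output, creating local competition."""
    N = len(channel_acts)
    out = []
    for i, a in enumerate(channel_acts):
        # Sum squared activations over the n-wide window centred on channel i
        lo, hi = max(0, i - n // 2), min(N - 1, i + n // 2)
        s = sum(channel_acts[j] ** 2 for j in range(lo, hi + 1))
        out.append(a / (k + alpha * s) ** beta)
    return out

print(lrn([1.0, 2.0, 3.0, 4.0]))  # each value slightly damped by its neighbours
```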
