AI Glossary
A complete artificial intelligence glossary
LeNet-5
Pioneering CNN architecture introduced by Yann LeCun in 1998 for handwritten digit recognition, with seven trainable layers combining convolutional, pooling (subsampling), and fully connected layers.
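A minimal LeNet-5-style sketch in PyTorch; the layer sizes follow the 1998 paper, and the tanh activations and average pooling match the original design (modern reimplementations often substitute ReLU and max pooling):

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # C1: 1x32x32 -> 6x28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # S2: -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5),  # C3: -> 16x10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # S4: -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),       # C5
            nn.Tanh(),
            nn.Linear(120, 84),               # F6
            nn.Tanh(),
            nn.Linear(84, num_classes),       # output layer
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = LeNet5()
print(model(torch.randn(1, 1, 32, 32)).shape)  # torch.Size([1, 10])
```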
AlexNet
Convolutional neural network with roughly 60 million parameters that won the 2012 ImageNet competition, popularizing ReLU activations, dropout, and large-scale data augmentation.
VGGNet
CNN architecture characterized by its exclusive use of stacked 3x3 convolution filters, demonstrating with configurations of up to 19 weight layers that depth improves performance.
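A minimal sketch of a VGG-style block (the `vgg_block` helper and channel counts are illustrative): two stacked 3x3 convolutions cover the same 5x5 receptive field as a single 5x5 convolution, with fewer parameters and an extra non-linearity.

```python
import torch
import torch.nn as nn

def vgg_block(in_ch: int, out_ch: int, num_convs: int = 2) -> nn.Sequential:
    """Stack of 3x3 convolutions followed by 2x2 max pooling, as in VGG."""
    layers = []
    for i in range(num_convs):
        layers.append(nn.Conv2d(in_ch if i == 0 else out_ch, out_ch,
                                kernel_size=3, padding=1))
        layers.append(nn.ReLU(inplace=True))
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

block = vgg_block(3, 64)
print(block(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 64, 16, 16])
```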
Convolution Layer
Fundamental layer applying learnable filters to the input to detect hierarchical spatial patterns through sliding convolution operations.
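A minimal sketch in PyTorch (channel counts and image size are illustrative):

```python
import torch
import torch.nn as nn

# 16 learnable 3x3 filters slide over a 3-channel input.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
x = torch.randn(1, 3, 32, 32)  # (batch, channels, height, width)
print(conv(x).shape)           # torch.Size([1, 16, 32, 32])
```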
ReLU Activation
Non-linear activation function f(x) = max(0, x) that accelerates convergence and mitigates the vanishing gradient problem compared to sigmoid activations.
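As a minimal illustration, the function applied element-wise in PyTorch:

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(torch.relu(x))  # tensor([0.0000, 0.0000, 0.0000, 1.5000])
```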
Dropout
Regularization technique randomly deactivating neurons during training to prevent overfitting and improve generalization.
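A minimal sketch in PyTorch (p=0.5 is illustrative); note the inverted-dropout rescaling by 1/(1-p) during training and the no-op behavior at inference:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(8)
drop.train()
print(drop(x))  # random mask, e.g. tensor([2., 0., 2., 2., 0., 2., 0., 2.])
drop.eval()
print(drop(x))  # tensor([1., 1., 1., 1., 1., 1., 1., 1.]) -- identity at inference
```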
Feature Maps
Three-dimensional outputs of convolution layers (one two-dimensional map per filter) representing the presence of specific features at different spatial positions of the input.
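A minimal sketch (sizes are illustrative): each output channel is one feature map.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
maps = conv(torch.randn(1, 3, 64, 64))
print(maps.shape)        # torch.Size([1, 8, 64, 64]) -- 8 feature maps of 64x64
print(maps[0, 0].shape)  # torch.Size([64, 64]) -- a single 2D feature map
```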
Filters/Kernels
Learnable weight matrices sliding over the input to detect specific patterns such as edges, textures, and complex shapes.
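As a minimal illustration with a hand-crafted (not learned) Sobel kernel for vertical edges; in a CNN, such weights are learned from data:

```python
import torch
import torch.nn.functional as F

sobel_x = torch.tensor([[-1., 0., 1.],
                        [-2., 0., 2.],
                        [-1., 0., 1.]]).reshape(1, 1, 3, 3)
image = torch.zeros(1, 1, 5, 5)
image[..., 3:] = 1.0  # right half bright -> vertical edge
print(F.conv2d(image, sobel_x, padding=1)[0, 0])  # strong response at the edge
```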
Fully Connected Layer
Layer in which each neuron is connected to every neuron of the previous layer, typically placed at the end of the network to perform the final classification from the extracted features.
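A minimal sketch (dimensions are illustrative): flattened feature maps feed a fully connected layer that produces class scores.

```python
import torch
import torch.nn as nn

head = nn.Sequential(
    nn.Flatten(),               # (N, 16, 5, 5) -> (N, 400)
    nn.Linear(16 * 5 * 5, 10),  # 400 features -> 10 class scores
)
print(head(torch.randn(2, 16, 5, 5)).shape)  # torch.Size([2, 10])
```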
Transfer Learning
Technique reusing the weights of a model pre-trained on large datasets to accelerate learning on specific tasks with less data.
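A minimal sketch with torchvision (assumes torchvision >= 0.13 for the `weights` argument; the 5-class head is illustrative): reuse ImageNet weights, freeze the backbone, and train only a new classifier head.

```python
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # ImageNet weights
for param in model.parameters():
    param.requires_grad = False                # freeze the pre-trained backbone
model.fc = nn.Linear(model.fc.in_features, 5)  # new head; only this layer trains
```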
Batch Normalization
Normalization of activations between layers stabilizing training, allowing higher learning rates and reducing sensitivity to initialization.
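A minimal sketch (sizes are illustrative): in training mode, each channel is normalized to roughly zero mean and unit variance over the batch, then scaled and shifted by learnable parameters.

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=16)
x = torch.randn(8, 16, 32, 32) * 5 + 3  # activations with skewed statistics
y = bn(x)
print(round(y.mean().item(), 3), round(y.std().item(), 3))  # ~0.0 and ~1.0
```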
Data Augmentation
Expansion of the training set by applying label-preserving transformations (rotations, flips, zooms) to existing examples, increasing dataset diversity and improving robustness.
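A minimal sketch of an augmentation pipeline with torchvision transforms (the specific transformations and parameters are illustrative):

```python
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomRotation(degrees=15),
    transforms.RandomResizedCrop(size=224, scale=(0.8, 1.0)),  # random zoom/crop
    transforms.ToTensor(),
])
# Each call augment(pil_image) yields a differently transformed training tensor.
```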
Local Response Normalization
Normalization scheme introduced in AlexNet, inspired by lateral inhibition, creating competition between neighboring channel activations to improve generalization and reduce excessive activations.
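A minimal sketch using PyTorch's built-in layer with AlexNet's published hyperparameters (size=5, alpha=1e-4, beta=0.75, k=2); the tensor shape follows AlexNet's first convolutional stage:

```python
import torch
import torch.nn as nn

# Each activation is divided by a power of the summed squares of its
# 5 neighboring channels at the same spatial position.
lrn = nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0)
print(lrn(torch.randn(1, 96, 27, 27)).shape)  # torch.Size([1, 96, 27, 27])
```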