AI Glossary

A complete artificial intelligence glossary

162 categories · 2 032 subcategories · 23 060 terms
LeNet-5

Pioneering CNN architecture introduced by Yann LeCun in 1998, designed for handwritten digit recognition, with seven layers combining convolutional, pooling (subsampling), and fully connected layers.

AlexNet

Revolutionary convolutional neural network that won the 2012 ImageNet competition, introducing ReLU activations, dropout, and large-scale data augmentation, with about 60 million parameters.

VGGNet

CNN architecture built exclusively from stacked 3x3 filters, demonstrating with configurations of up to 19 layers that increased depth improves performance.

Convolution Layer

Fundamental layer applying learnable filters to the input to detect hierarchical spatial patterns through sliding convolution operations.
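
The sliding operation can be sketched in plain Python (valid padding, stride 1, computed as cross-correlation the way deep-learning frameworks do; the hand-crafted edge kernel below is illustrative — real layers learn their filter weights):

```python
def conv2d(image, kernel):
    """Valid 2D convolution (stride 1): slide the kernel over the image
    and take a dot product at each position."""
    kh, kw = len(kernel), len(kernel[0])
    rows = len(image) - kh + 1
    cols = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(cols)]
            for i in range(rows)]

# A hand-crafted vertical-edge detector (illustrative; CNNs learn these)
image = [[0, 0, 1, 1]] * 4          # sharp vertical edge in the middle
kernel = [[1, -1], [1, -1]]
print(conv2d(image, kernel))  # → [[0, -2, 0], [0, -2, 0], [0, -2, 0]]
```

The strong response in the middle column is exactly the "pattern detected at a spatial position" idea the definition describes.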

ReLU Activation

Non-linear activation function f(x) = max(0, x) accelerating convergence and mitigating the vanishing gradient problem seen with sigmoid activations.
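
The definition is a one-liner in plain Python:

```python
def relu(x):
    """Rectified Linear Unit: pass positives through, clamp negatives to zero."""
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # → [0.0, 0.0, 0.0, 1.5]
```

For positive inputs the gradient is a constant 1, which is why ReLU avoids the saturation that shrinks gradients in sigmoid units.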

Dropout

Regularization technique randomly deactivating neurons during training to prevent overfitting and improve generalization.
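
A minimal sketch of inverted dropout (the variant most modern frameworks use, where survivors are rescaled at training time so inference needs no correction) — plain Python, illustrative only:

```python
import random

def dropout(activations, p, training=True):
    """Inverted dropout: zero each unit with probability p during training
    and rescale survivors by 1/(1-p), keeping the expected activation
    unchanged; identity at inference time."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0 for a in activations]

out = dropout([1.0] * 8, p=0.5)  # roughly half zeroed, survivors doubled
print(out)
```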

Feature Maps

Three-dimensional outputs of convolution layers representing the presence of specific features at different spatial positions of the input.

Filters/Kernels

Learnable weight matrices sliding over input to detect specific patterns like edges, textures and complex shapes.

Fully Connected Layer

Final layer where each neuron is connected to all neurons of the previous layer, performing the final classification based on the extracted features.
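
A toy sketch in plain Python — every output neuron combines all inputs, as the definition says (the weights below are made up for illustration, not trained):

```python
def dense(x, weights, biases):
    """Fully connected layer: each output neuron is a weighted sum of
    every input value plus a bias."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, biases)]

# 3 extracted features -> 2 class scores (hypothetical weights)
print(dense([1.0, 2.0, 3.0], [[1, 0, 2], [0, 1, 1]], [1, 0]))  # → [8.0, 5.0]
```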

Transfer Learning

Technique reusing the weights of a model pre-trained on large datasets to accelerate learning on specific tasks with less data.
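
A deliberately tiny sketch of the idea in plain Python: a frozen "pretrained" feature extractor (a stand-in for a real backbone) plus a new linear head fitted on a small dataset. All names and numbers here are hypothetical:

```python
def pretrained_features(x):
    """Frozen 'pretrained' extractor -- stands in for a real backbone whose
    weights are reused and not updated."""
    return [x, x * x]

def fit_head(data, lr=0.01, epochs=2000):
    """Fit only the new linear head by per-sample gradient descent;
    the feature extractor stays frozen throughout."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            err = sum(wi * fi for wi, fi in zip(w, f)) - y
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
    return w

# Target y = 2x + x^2 is exactly representable over the frozen features,
# so the fitted head should approach w ≈ [2, 1].
data = [(x * 0.5, 2 * (x * 0.5) + (x * 0.5) ** 2) for x in range(-4, 5)]
print(fit_head(data))
```

Only two parameters are trained here; in real transfer learning the same split (frozen backbone, small trainable head) is what makes learning feasible with little data.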

Batch Normalization

Normalization of activations between layers stabilizing training, allowing higher learning rates and reducing sensitivity to initialization.
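
The per-feature computation can be sketched as follows (plain Python, one feature across a mini-batch; `gamma` and `beta` are the learnable scale and shift):

```python
def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize one feature across a mini-batch to zero mean and unit
    variance, then apply the learnable scale (gamma) and shift (beta)."""
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in batch]

print(batch_norm([10.0, 20.0, 30.0, 40.0]))  # roughly zero-mean, unit-variance
```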

Data Augmentation

Synthetic expansion of the training set through transformations (rotations, flips, zooms) increasing dataset diversity and improving robustness.
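
Two of the listed transformations, sketched on a toy 2D "image" (plain Python lists standing in for pixel arrays):

```python
def horizontal_flip(image):
    """Mirror each row left-to-right."""
    return [row[::-1] for row in image]

def rotate_90(image):
    """Rotate a 2D image 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

img = [[1, 2], [3, 4]]
print(horizontal_flip(img))  # → [[2, 1], [4, 3]]
print(rotate_90(img))        # → [[3, 1], [4, 2]]
```

Each transformed copy is a new training example with the same label, which is how augmentation multiplies dataset diversity for free.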

Local Response Normalization

Normalization introduced in AlexNet creating competition between adjacent neurons to improve generalization and reduce excessive activations.
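
The cross-channel formula from the AlexNet paper, b_i = a_i / (k + α Σ a_j²)^β summed over neighboring channels, can be sketched in plain Python with the paper's constants (a 1D activation vector stands in for one spatial position across channels):

```python
def lrn(activations, k=2.0, n=5, alpha=1e-4, beta=0.75):
    """Local response normalization across adjacent channels, using the
    AlexNet constants: each activation is damped by the summed squares of
    its n nearest channel neighbors."""
    out = []
    size = len(activations)
    for i, a in enumerate(activations):
        lo = max(0, i - n // 2)
        hi = min(size - 1, i + n // 2)
        scale = k + alpha * sum(activations[j] ** 2 for j in range(lo, hi + 1))
        out.append(a / scale ** beta)
    return out

print(lrn([1.0, 2.0, 3.0]))  # every activation shrinks; large neighbors shrink it more
```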
