AI Glossary

The complete AI glossary

162 categories · 2,032 subcategories · 23,060 terms
LeNet-5

Pioneering CNN architecture introduced by Yann LeCun in 1998 for handwritten digit recognition, with seven layers combining convolutions, pooling, and fully connected layers.

AlexNet

Convolutional neural network that won the 2012 ImageNet competition (ILSVRC), popularizing ReLU activations, dropout, and large-scale data augmentation, with about 60 million parameters.

VGGNet

CNN architecture characterized by its exclusive use of stacked 3×3 filters, demonstrating that depth improves performance, with variants of up to 19 layers.

Convolution Layer

Fundamental layer that applies learnable filters to its input, detecting hierarchical spatial patterns through sliding convolution operations.
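To illustrate how a convolution layer, its filters, and the resulting feature map fit together, here is a minimal NumPy sketch (stride 1, no padding; the image and the hand-built edge-detecting kernel are illustrative):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel across the image
    and take a dot product at each position (stride 1, no padding)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A simple vertical-edge kernel: responds where intensity rises left-to-right.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
edge_kernel = np.array([[-1, 1],
                        [-1, 1]], dtype=float)
fmap = conv2d(image, edge_kernel)  # peaks along the 0 -> 1 boundary
```

Note that, like most deep-learning frameworks, this computes cross-correlation (the kernel is not flipped); for learned filters the distinction is immaterial.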

ReLU Activation

Non-linear activation function f(x) = max(0, x) that accelerates convergence and mitigates the vanishing-gradient problem of sigmoid activations.
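The definition translates directly into a one-line NumPy sketch (the input vector is illustrative):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied element-wise: negatives become 0,
    # positives pass through unchanged (so their gradient is 1).
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = relu(x)
```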

Dropout

Regularization technique randomly deactivating neurons during training to prevent overfitting and improve generalization.
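A minimal NumPy sketch of inverted dropout, the variant used by modern frameworks (the drop probability of 0.5 and the input are illustrative):

```python
import numpy as np

def dropout(x, p_drop, rng, training=True):
    """Inverted dropout: zero each activation with probability p_drop and
    scale the survivors by 1/(1 - p_drop) so the expected activation is
    unchanged.  At inference time (training=False) it is the identity."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
x = np.ones(10000)
y = dropout(x, 0.5, rng)  # roughly half zeros, survivors scaled to 2.0
```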

Feature Maps

Three-dimensional outputs of convolution layers representing the presence of specific features at different spatial positions of the input.

Filters/Kernels

Learnable weight matrices sliding over input to detect specific patterns like edges, textures and complex shapes.

Fully Connected Layer

Final layer where each neuron is connected to all neurons of the previous layer, performing the final classification based on the extracted features.
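A minimal NumPy sketch of a fully connected classification head: a matrix-vector product connecting every input feature to every output neuron, followed by a softmax to turn scores into class probabilities (the feature vector and weights are illustrative):

```python
import numpy as np

def fully_connected(features, W, b):
    """Dense layer: every output neuron sees every input feature
    (matrix-vector product plus bias), then softmax for probabilities."""
    logits = W @ features + b
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    return exp / exp.sum()

# Hypothetical 4-dimensional feature vector classified into 3 classes.
features = np.array([0.2, 1.0, -0.5, 0.3])
W = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
b = np.zeros(3)
probs = fully_connected(features, W, b)  # sums to 1; class 1 scores highest
```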

Transfer Learning

Technique reusing the weights of a model pre-trained on large datasets to accelerate learning on specific tasks with less data.
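A toy NumPy sketch of the idea, with a frozen random projection standing in for a real pretrained backbone (all data, shapes, and hyperparameters are illustrative): the "pretrained" layer is never updated, and only a small linear head is trained on the new task.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical frozen "pretrained" feature extractor (stands in for a backbone).
W_frozen = rng.normal(size=(2, 8))

def extract(x):
    return np.maximum(0, x @ W_frozen)  # frozen layer + ReLU, never updated

# Small task-specific dataset: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Train only the new head (w, b) with logistic-regression gradient steps.
F = extract(X)
w, b = np.zeros(F.shape[1]), 0.0
for _ in range(200):
    p = 1 / (1 + np.exp(-(F @ w + b)))   # predicted probabilities
    grad = p - y                          # gradient of the log loss
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((1 / (1 + np.exp(-(F @ w + b))) > 0.5) == y).mean()
```

Because the backbone is frozen, only 9 parameters are trained, which is why transfer learning works with little task-specific data.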

Batch Normalization

Normalization of activations between layers stabilizing training, allowing higher learning rates and reducing sensitivity to initialization.
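A minimal NumPy sketch of the normalization step for one batch of activations (gamma and beta are the learnable scale and shift; the running statistics used at inference time are omitted):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the batch to zero mean and unit
    variance, then rescale by gamma and shift by beta (both learnable)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))  # shifted, scaled activations
y = batch_norm(x)  # per-feature mean ~0, std ~1
```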

Data Augmentation

Synthetic generation of training data by transformations (rotations, flips, zooms) to increase dataset diversity and improve robustness.
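A minimal NumPy sketch that generates a few label-preserving variants of one image (the specific transforms are illustrative; np.roll is a circular shift standing in for a small translation):

```python
import numpy as np

def augment(image, rng):
    """Return the original image plus randomly transformed copies:
    a horizontal flip, a random 90-degree rotation, and a small
    horizontal shift (circular, via np.roll)."""
    variants = [image]
    variants.append(np.fliplr(image))                       # horizontal flip
    variants.append(np.rot90(image, k=rng.integers(1, 4)))  # random rotation
    shift = int(rng.integers(-2, 3))
    variants.append(np.roll(image, shift, axis=1))          # horizontal shift
    return variants

rng = np.random.default_rng(0)
image = np.arange(16).reshape(4, 4)
batch = augment(image, rng)  # one original + three transformed copies
```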

Local Response Normalization

Normalization introduced in AlexNet that creates competition between adjacent channels, improving generalization and damping excessive activations.
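A minimal NumPy sketch of the channel-wise formula, b_c = a_c / (k + (α/n) Σ a_j²)^β, where the sum runs over the n channels adjacent to c; the hyperparameters below are AlexNet's published values, and the input tensor is illustrative:

```python
import numpy as np

def lrn(a, k=2.0, n=5, alpha=1e-4, beta=0.75):
    """Local Response Normalization over channels: each activation is
    divided by a term that grows with the squared activations of its n
    neighboring channels, so strong neighbors suppress it.
    `a` has shape (channels, height, width)."""
    C = a.shape[0]
    out = np.empty_like(a, dtype=float)
    for c in range(C):
        lo, hi = max(0, c - n // 2), min(C, c + n // 2 + 1)
        denom = (k + (alpha / n) * np.sum(a[lo:hi] ** 2, axis=0)) ** beta
        out[c] = a[c] / denom
    return out

rng = np.random.default_rng(0)
a = rng.normal(size=(8, 4, 4))
b = lrn(a)  # same shape, each activation damped by its channel neighborhood
```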
