
AI Glossary

A complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

LeNet-5

Pioneering CNN architecture introduced by Yann LeCun in 1998, designed for handwritten digit recognition, with seven layers of convolutions, pooling, and fully connected layers.

AlexNet

Revolutionary convolutional neural network that won the 2012 ImageNet competition, introducing ReLU activations, dropout, and large-scale data augmentation, with roughly 60 million parameters.

VGGNet

CNN architecture characterized by its exclusive use of stacked 3x3 filters, demonstrating that depth improves performance, with up to 19 layers.

Convolution Layer

Fundamental layer applying learnable filters to the input to detect hierarchical spatial patterns through sliding convolution operations.
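A minimal NumPy sketch of the sliding-window operation described above. The function name `conv2d` and the edge-detector kernel are illustrative, not taken from any particular library; real frameworks vectorize this and add padding, stride, and channel handling.

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" cross-correlation, as deep-learning convolution layers compute it:
    # slide the kernel over the image, summing elementwise products at each position.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hypothetical 3x3 vertical-edge detector (Sobel-like weights).
edge_kernel = np.array([[1, 0, -1],
                        [2, 0, -2],
                        [1, 0, -1]], dtype=float)
```

Each output pixel measures how strongly the local patch matches the kernel's pattern; a learned layer holds many such kernels, producing one feature map per filter.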

ReLU Activation

Non-linear activation function f(x) = max(0, x) that accelerates convergence and mitigates the vanishing-gradient problem compared to sigmoid activations.
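The definition above is one line of NumPy; this sketch makes the piecewise behavior explicit:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs are zeroed, positives pass unchanged.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

Because the gradient is exactly 1 for positive inputs, gradients do not shrink layer after layer the way saturating sigmoids cause them to.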

Dropout

Regularization technique randomly deactivating neurons during training to prevent overfitting and improve generalization.
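A sketch of the common "inverted dropout" variant, assuming drop probability `p` (the function and parameter names are illustrative):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    # Inverted dropout: zero each unit with probability p and scale the
    # survivors by 1/(1-p), keeping the expected activation unchanged.
    # At inference (training=False) the layer is an identity.
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

Randomly removing units prevents co-adaptation: no neuron can rely on a specific partner always being present.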

Feature Maps

Three-dimensional outputs of convolution layers representing the presence of specific features at different spatial positions of the input.

Filters/Kernels

Learnable weight matrices sliding over input to detect specific patterns like edges, textures and complex shapes.

Fully Connected Layer

Final layer where each neuron is connected to all neurons of the previous layer, performing the final classification based on the extracted features.
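In code, "each neuron connected to all neurons of the previous layer" is a single matrix product; a softmax then turns the class scores into probabilities (a minimal sketch, names illustrative):

```python
import numpy as np

def fully_connected(x, W, b):
    # Every output neuron sees every input feature: one matrix-vector product.
    return W @ x + b

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()       # scores -> probabilities summing to 1
```

In a CNN classifier, `x` is the flattened output of the last convolution/pooling stage and `W` has one row per class.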

Transfer Learning

Technique reusing the weights of a model pre-trained on large datasets to accelerate learning on specific tasks with less data.
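A toy sketch of the frozen-backbone pattern: the "pretrained" feature extractor is kept fixed while only a small task-specific head is trained. All names and the random "pretrained" weights are stand-ins for illustration; in practice the backbone would come from a model trained on a large dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for pretrained weights: frozen, never updated during fine-tuning.
W_frozen = rng.standard_normal((8, 4))

def features(x):
    return np.maximum(0.0, W_frozen @ x)      # frozen ReLU feature extractor

# Only this small head is trained on the new task's data.
w_head = np.zeros(8)

def head(f):
    return 1.0 / (1.0 + np.exp(-(w_head @ f)))  # logistic output

def train_step(x, y, lr=0.1):
    # One SGD step on the logistic loss, updating the head only.
    global w_head
    f = features(x)
    err = head(f) - y
    w_head -= lr * err * f
```

Because gradients never touch `W_frozen`, far fewer parameters are fit, which is why transfer learning works with small task-specific datasets.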

Batch Normalization

Normalization of activations between layers that stabilizes training, allows higher learning rates, and reduces sensitivity to weight initialization.
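A minimal training-time sketch for a batch of feature vectors (rows = examples, columns = features); `gamma` and `beta` are the learnable scale and shift, and inference-time running statistics are omitted:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature across the batch axis to zero mean and unit
    # variance, then rescale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

Keeping each layer's input distribution roughly fixed is what permits the larger learning rates the definition mentions.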

Data Augmentation

Synthetic generation of training data by transformations (rotations, flips, zooms) to increase dataset diversity and improve robustness.
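A sketch of two label-preserving transforms from the definition, applied randomly per sample (the function name is illustrative; libraries such as torchvision offer richer pipelines):

```python
import numpy as np

def augment(image, rng):
    # Random horizontal flip and a random number of 90-degree rotations:
    # each call yields a different but equally valid view of the same image.
    if rng.random() < 0.5:
        image = np.fliplr(image)
    k = int(rng.integers(0, 4))   # 0-3 quarter turns
    return np.rot90(image, k)
```

Because the transforms only rearrange pixels, the label stays valid while the network sees a much more varied training distribution.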

Local Response Normalization

Normalization introduced in AlexNet creating competition between adjacent neurons to improve generalization and reduce excessive activations.
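A sketch of the cross-channel form used in AlexNet, for activations shaped `(channels, height, width)`; the default hyperparameters `k`, `alpha`, `beta`, `n` follow the values reported in the AlexNet paper, but the function itself is illustrative:

```python
import numpy as np

def local_response_norm(a, n=5, k=2.0, alpha=1e-4, beta=0.75):
    # Each activation is divided by a term that grows with the summed
    # squares of its n neighboring channels, so strongly responding
    # channels suppress their neighbors ("competition").
    C = a.shape[0]
    out = np.empty_like(a)
    for c in range(C):
        lo, hi = max(0, c - n // 2), min(C, c + n // 2 + 1)
        denom = (k + alpha * np.sum(a[lo:hi] ** 2, axis=0)) ** beta
        out[c] = a[c] / denom
    return out
```

Batch normalization has largely replaced LRN in later architectures, but the mechanism is instructive: normalization here is competitive across channels rather than statistical across the batch.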
