
AI Glossary

A complete artificial intelligence glossary: 162 categories, 2,032 subcategories, 23,060 terms.

Terms

GraphSAGE

Inductive graph neural network algorithm that samples and aggregates neighbor features to generate node embeddings on large graphs.
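
The sample-and-aggregate update can be sketched in plain NumPy. The toy graph, the weight shapes, and the sum-of-two-projections form (mathematically equivalent to concatenating self and neighbor features and applying a single linear layer) are illustrative assumptions, not the reference implementation:

```python
import numpy as np

def graphsage_layer(h, adj, W_self, W_neigh):
    """One GraphSAGE update with a mean aggregator (sketch).

    h:       (num_nodes, d_in) node feature matrix
    adj:     dict {node: [neighbor ids]} -- a toy adjacency structure
    W_self:  (d_in, d_out) weights for the node's own features
    W_neigh: (d_in, d_out) weights for the aggregated neighbor features
    """
    out = np.empty((h.shape[0], W_self.shape[1]))
    for v in range(h.shape[0]):
        neigh_mean = h[adj[v]].mean(axis=0)        # aggregate sampled neighbors
        z = h[v] @ W_self + neigh_mean @ W_neigh   # combine (concat-equivalent)
        z = np.maximum(z, 0.0)                     # ReLU non-linearity
        out[v] = z / (np.linalg.norm(z) + 1e-8)    # L2-normalize the embedding
    return out

# Toy 3-node path graph: 0 - 1 - 2
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj = {0: [1], 1: [0, 2], 2: [1]}
rng = np.random.default_rng(0)
emb = graphsage_layer(h, adj, rng.normal(size=(2, 4)), rng.normal(size=(2, 4)))
```

Because the layer depends only on local features and a shared weight matrix, it applies unchanged to nodes never seen during training, which is what makes the method inductive.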

Inductive Learning

Learning approach capable of generalizing to unseen nodes or graphs during training, unlike transductive learning which requires the complete graph structure.

Neighbor Sampling

Technique of selecting a fixed-size random subset of neighbors for each node during training, enabling efficient processing of large graphs.
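
A minimal sketch of fixed-size sampling, assuming a hypothetical adjacency dict `{node: [neighbors]}`; as in the original GraphSAGE setup, nodes with fewer than `k` neighbors are sampled with replacement so every node yields exactly `k` samples:

```python
import random

def sample_neighbors(adj, node, k, seed=None):
    """Return exactly k sampled neighbors of `node`."""
    rng = random.Random(seed)
    neighbors = adj[node]
    if len(neighbors) >= k:
        return rng.sample(neighbors, k)                   # without replacement
    return [rng.choice(neighbors) for _ in range(k)]      # with replacement

# Toy star graph: node 0 connected to 1..4
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
sampled = sample_neighbors(adj, 0, k=2, seed=42)
```

Fixing `k` bounds the cost of each aggregation step, so the per-node work no longer depends on the (possibly huge) true degree.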

Aggregation Function

Mathematical operation combining features of neighboring nodes to produce an aggregated representation, essential in information propagation on graphs.

Node Embedding

Low-dimensional dense vector representation capturing the structural properties and features of a node in a graph, used for prediction tasks.

Mean Aggregator

Aggregation function computing the element-wise mean of neighboring node features, providing a symmetric and stable representation of local information.
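
In code this is a one-line reduction over the neighbor axis; the feature matrix below is a made-up example:

```python
import numpy as np

def mean_aggregate(neighbor_feats):
    # Element-wise mean over the neighbor axis; the result is
    # invariant to neighbor ordering (a symmetric aggregator).
    return np.mean(neighbor_feats, axis=0)

feats = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3 neighbors, 2 features
agg = mean_aggregate(feats)  # -> [3.0, 4.0]
```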

Max Pooling Aggregator

Aggregation function applying a non-linear transformation followed by max pooling on neighbor features, capturing the most salient aspects of the neighborhood.
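
A sketch of the pooling aggregator, with an illustrative identity weight matrix and zero bias standing in for learned parameters:

```python
import numpy as np

def max_pool_aggregate(neighbor_feats, W, b):
    # Shared fully connected layer + ReLU applied to every neighbor,
    # then an element-wise max across the neighborhood.
    hidden = np.maximum(neighbor_feats @ W + b, 0.0)
    return hidden.max(axis=0)

# Illustrative parameters: identity transform, zero bias.
feats = np.array([[1.0, -2.0], [3.0, 0.0]])
agg = max_pool_aggregate(feats, np.eye(2), np.zeros(2))  # -> [3.0, 0.0]
```

Like the mean aggregator, the max is symmetric in its inputs, but it keeps only the strongest signal per feature dimension rather than averaging it away.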

LSTM Aggregator

Aggregation function using an LSTM network to sequentially process neighbor features, capturing asymmetric dependencies in the node's neighborhood.

Feature Propagation

Iterative process of transmitting and aggregating information between neighboring nodes to progressively enrich vector representations across multiple layers.

Mini-batch Training

Training strategy dividing the graph into small subsets of nodes to optimize computational efficiency and enable scaling on massive graphs.

Graph Convolution

Operation generalizing 2D convolution to graph structures, combining a node's features with those of its neighbors according to a defined aggregation scheme.

Transductive Learning

Learning paradigm where the model is trained and tested on the same set of fixed nodes, requiring the complete graph structure during inference.

Message Passing

Algorithmic framework where nodes exchange information with their neighbors to update their states, the foundation of graph neural networks.
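
One round of the framework can be sketched with a sum-based message and a simple additive update; the directed edge list and the update rule are illustrative choices, not the only ones:

```python
import numpy as np

def message_passing_round(h, edges):
    """One round of message passing: send, aggregate, update."""
    msgs = np.zeros_like(h)
    for src, dst in edges:
        msgs[dst] += h[src]   # each node accumulates incoming messages
    return h + msgs           # simple additive state update

h = np.array([[1.0], [2.0], [3.0]])   # one scalar state per node
edges = [(0, 1), (1, 2), (2, 0)]      # directed cycle 0->1->2->0
h1 = message_passing_round(h, edges)  # -> states [4, 3, 5]
```

Stacking several such rounds lets information travel multiple hops, which is exactly what a multi-layer GNN does.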

Graph Representation

Vector encoding capturing the structural and semantic properties of an entire graph, used for graph-level classification or regression tasks.

Sampling Strategy

Method defining how to select neighbors during training, impacting the balance between computational efficiency and preservation of structural information.

Feature Concatenation

Operation combining a node's original features with aggregated information from its neighbors to create an enriched representation before transformation.

Residual Connection

Mechanism allowing direct information bypass between layers, facilitating deep training and preserving the original characteristics of nodes.

ReLU Activation

Non-linear activation function applied after aggregation to introduce non-linearity, defined as max(0, x) and widely used in GraphSAGE.

Batch Normalization

Technique that normalizes activations within each mini-batch to stabilize training and accelerate convergence in graph neural networks; it also has a mild regularizing effect.
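
The core transformation is a per-feature standardization over the batch axis (inference-time running statistics and the learnable scale/shift parameters are omitted from this sketch):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature to zero mean and (approximately) unit
    # variance across the mini-batch dimension (axis 0).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

batch = np.array([[1.0, 10.0], [3.0, 20.0], [5.0, 30.0]])
normed = batch_norm(batch)
```

The `eps` term guards against division by zero when a feature is constant across the batch.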
