AI Glossary
A complete dictionary of artificial intelligence
GraphSAGE
Inductive graph neural network algorithm that samples and aggregates neighbor features to generate node embeddings on large graphs.
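The sampling-and-aggregating step this definition describes can be sketched in NumPy. This is an illustrative sketch, not the reference implementation: the function name, the dense weight matrices, and the dict-based adjacency are assumptions for the example.

```python
import numpy as np

def sage_layer(x, adj, W_self, W_neigh):
    """One GraphSAGE-style layer with a mean aggregator (illustrative sketch).

    x       : (N, F) node feature matrix
    adj     : dict mapping node id -> list of neighbor ids
    W_self  : (F, D) weight for the node's own features
    W_neigh : (F, D) weight for the aggregated neighbor features
    """
    agg = np.zeros_like(x)
    for v, neigh in adj.items():
        if neigh:
            agg[v] = x[neigh].mean(axis=0)      # element-wise mean of neighbors
    h = np.maximum(0.0, x @ W_self + agg @ W_neigh)  # combine, then ReLU
    norms = np.linalg.norm(h, axis=1, keepdims=True)
    return h / np.clip(norms, 1e-12, None)      # L2-normalize the embeddings
```

Stacking K such layers lets each node's embedding incorporate information from its K-hop neighborhood.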
Inductive Learning
Learning approach capable of generalizing to nodes or graphs not seen during training, unlike transductive learning, which requires the complete graph structure to be available in advance.
Neighbor Sampling
Technique of selecting a fixed-size random subset of neighbors for each node during training, enabling efficient processing of large graphs.
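A minimal sketch of fixed-size neighbor sampling (the function name and the sample-with-replacement fallback for low-degree nodes are assumptions of this example; nodes are assumed to have at least one neighbor):

```python
import random

def sample_neighbors(adj, node, k, rng=random):
    """Return a fixed-size random sample of `node`'s neighbors.

    If the node has fewer than k neighbors, sample with replacement so
    every node still contributes exactly k neighbor features.
    """
    neigh = adj[node]
    if len(neigh) >= k:
        return rng.sample(neigh, k)                   # without replacement
    return [rng.choice(neigh) for _ in range(k)]      # with replacement
```

Fixing k makes the memory and compute cost of each layer independent of node degree.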
Aggregation Function
Mathematical operation combining features of neighboring nodes to produce an aggregated representation, essential in information propagation on graphs.
Node Embedding
Low-dimensional dense vector representation capturing the structural properties and features of a node in a graph, used for prediction tasks.
Mean Aggregator
Aggregation function computing the element-wise mean of neighboring node features, providing a symmetric and stable representation of local information.
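As a small sketch (the function name is illustrative), the mean aggregator is a single NumPy reduction, and its symmetry means the result does not depend on neighbor order:

```python
import numpy as np

def mean_aggregate(neigh_feats):
    """Element-wise mean of neighbor feature vectors: (K, F) -> (F,)."""
    return np.asarray(neigh_feats).mean(axis=0)
```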
Max Pooling Aggregator
Aggregation function applying a non-linear transformation followed by max pooling on neighbor features, capturing the most salient aspects of the neighborhood.
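A hedged sketch of this aggregator with a single shared linear-plus-ReLU transform standing in for the MLP (the function name and one-layer choice are assumptions of the example):

```python
import numpy as np

def max_pool_aggregate(neigh_feats, W, b):
    """Transform each neighbor with a shared layer, then take the
    element-wise max across neighbors: max_i ReLU(W x_i + b)."""
    h = np.maximum(0.0, np.asarray(neigh_feats) @ W + b)  # per-neighbor ReLU(Wx+b)
    return h.max(axis=0)                                   # element-wise max pooling
```

Because the max is taken per feature dimension, the result highlights the strongest signal present anywhere in the neighborhood.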
LSTM Aggregator
Aggregation function using an LSTM network to sequentially process neighbor features, capturing asymmetric dependencies in the node's neighborhood.
Feature Propagation
Iterative process of transmitting and aggregating information between neighboring nodes to progressively enrich vector representations across multiple layers.
Mini-batch Training
Training strategy dividing the graph into small subsets of nodes to optimize computational efficiency and enable scaling on massive graphs.
Graph Convolution
Operation generalizing convolution from regular grids (such as images) to graph structures, combining a node's features with those of its neighbors according to a defined aggregation scheme.
Transductive Learning
Learning paradigm where the model is trained and evaluated on the same fixed set of nodes, requiring the complete graph structure during inference.
Message Passing
Algorithmic framework where nodes exchange information with their neighbors to update their states, the foundation of graph neural networks.
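The framework can be sketched generically, with user-supplied message and update functions (the function names and dict-based state are assumptions of this example):

```python
def message_passing_step(x, edges, message, update):
    """One generic message-passing step: each node collects messages
    from its in-neighbors, then updates its own state.

    x       : dict node -> state
    edges   : iterable of (u, v) directed edges
    message : (state_u, state_v) -> message sent from u to v
    update  : (state_v, list_of_messages) -> new state for v
    """
    inbox = {v: [] for v in x}
    for u, v in edges:
        inbox[v].append(message(x[u], x[v]))
    return {v: update(x[v], msgs) for v, msgs in inbox.items()}
```

Specific architectures differ only in their choice of message and update functions; GraphSAGE's sample-and-aggregate scheme is one instance.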
Graph Representation
Vector encoding capturing the structural and semantic properties of an entire graph, used for graph-level classification or regression tasks.
Sampling Strategy
Method defining how to select neighbors during training, impacting the balance between computational efficiency and preservation of structural information.
Feature Concatenation
Operation combining a node's original features with aggregated information from its neighbors to create an enriched representation before transformation.
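A minimal sketch of the concatenate-then-transform step (names and shapes are illustrative):

```python
import numpy as np

def concat_combine(self_feat, agg_feat, W):
    """Concatenate a node's own features with the aggregated neighbor
    features, then apply a shared linear transform.

    self_feat : (F,) node's own features
    agg_feat  : (F,) aggregated neighbor features
    W         : (2F, D) weight matrix
    """
    z = np.concatenate([self_feat, agg_feat])  # (2F,) enriched representation
    return z @ W                               # (D,) transformed output
```

Concatenation (rather than summation) keeps the node's own signal separate from the neighborhood signal, letting the weights treat the two differently.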
Residual Connection
Mechanism that lets information skip one or more layers via a direct path, easing the training of deep models and preserving nodes' original features.
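In its simplest form (a sketch; the function name is illustrative and assumes f preserves the input shape), a residual connection adds the layer's input back to its output:

```python
def residual_block(h, f):
    """Apply layer f, then add the input back: h' = f(h) + h."""
    return f(h) + h
```

Even if f learns nothing useful, the identity path keeps the original representation intact, which is what makes deeper stacks trainable.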
ReLU Activation
Non-linear activation function applied after aggregation to introduce non-linearity, defined as max(0, x) and widely used in GraphSAGE.
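The definition max(0, x) translates directly into one NumPy line:

```python
import numpy as np

def relu(x):
    """ReLU: element-wise max(0, x)."""
    return np.maximum(0.0, x)
```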
Batch Normalization
Normalization technique that standardizes activations over each mini-batch to stabilize training and accelerate convergence in graph neural networks; it also has a mild regularizing effect.
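A sketch of the training-time computation (the function name is an assumption; inference normally uses running statistics instead of batch statistics):

```python
import numpy as np

def batch_norm(h, gamma, beta, eps=1e-5):
    """Normalize activations over the mini-batch dimension, then apply
    a learned scale (gamma) and shift (beta).

    h : (B, F) activations for a mini-batch of B nodes
    """
    mean = h.mean(axis=0)                     # per-feature batch mean
    var = h.var(axis=0)                       # per-feature batch variance
    h_hat = (h - mean) / np.sqrt(var + eps)   # standardize each feature
    return gamma * h_hat + beta
```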