
Glossario IA

The complete Artificial Intelligence dictionary

162
categories
2,032
subcategories
23,060
terms

📖
terms

GraphSAGE

Inductive graph neural network algorithm that samples and aggregates neighbor features to generate node embeddings on large graphs.
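The sample-and-aggregate step can be sketched in NumPy. This is a toy illustration under assumed shapes (a 3-node triangle graph, 4-dimensional features, a made-up weight matrix), not the reference implementation: each node samples a fixed number of neighbors, mean-aggregates their features, concatenates with its own, and applies a learned transformation.

```python
import numpy as np

rng = np.random.default_rng(0)

def graphsage_layer(features, neighbors, W, sample_size=2):
    """One GraphSAGE-style layer: sample neighbors, mean-aggregate,
    concatenate with the node's own features, transform, L2-normalize."""
    new_feats = []
    for v, h_v in enumerate(features):
        # Sample a fixed-size neighbor subset (with replacement).
        nbrs = rng.choice(neighbors[v], size=sample_size, replace=True)
        h_nbr = features[nbrs].mean(axis=0)        # aggregate neighbor features
        h_cat = np.concatenate([h_v, h_nbr])       # self + neighborhood
        h_new = np.maximum(0.0, W @ h_cat)         # linear transform + ReLU
        new_feats.append(h_new / (np.linalg.norm(h_new) + 1e-9))
    return np.stack(new_feats)

# Toy graph: 3 nodes in a triangle, 4-dim features, 8 -> 4 projection.
X = rng.normal(size=(3, 4))
adj = {0: np.array([1, 2]), 1: np.array([0, 2]), 2: np.array([0, 1])}
W = rng.normal(size=(4, 8))
Z = graphsage_layer(X, adj, W)
print(Z.shape)  # (3, 4)
```

Because the layer only needs a node's sampled neighborhood, it can embed nodes that were not present at training time, which is what makes the method inductive.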

Inductive Learning

Learning approach capable of generalizing to nodes or graphs not seen during training, unlike transductive learning, which requires the complete graph structure.

Neighbor Sampling

Technique that selects a fixed-size random subset of neighbors for each node during training, enabling efficient processing of large graphs.
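A minimal sketch of the idea, using Python's standard library on an assumed toy adjacency list: when a node has at least k neighbors, sample without replacement; otherwise sample with replacement so every node still yields exactly k samples.

```python
import random

def sample_neighbors(adj, node, k, seed=None):
    """Pick a fixed-size subset of a node's neighbors.
    Falls back to sampling with replacement when the node has
    fewer than k neighbors, so the output always has length k."""
    rnd = random.Random(seed)
    nbrs = adj[node]
    if len(nbrs) >= k:
        return rnd.sample(nbrs, k)    # without replacement
    return rnd.choices(nbrs, k=k)     # with replacement

adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(sample_neighbors(adj, 0, 2, seed=1))  # a size-2 subset of node 0's neighbors
print(sample_neighbors(adj, 1, 2, seed=1))  # [0, 0]: oversampled with replacement
```

The fixed sample size is what bounds the per-node cost, independent of how dense the graph is.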

Aggregation Function

Mathematical operation combining features of neighboring nodes to produce an aggregated representation, essential in information propagation on graphs.

Node Embedding

Low-dimensional dense vector representation capturing the structural properties and features of a node in a graph, used for prediction tasks.

Mean Aggregator

Aggregation function computing the element-wise mean of neighboring node features, providing a symmetric and stable representation of local information.
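A two-line NumPy sketch makes the symmetry concrete: because the mean is taken element-wise over the neighbor axis, permuting the neighbors does not change the result.

```python
import numpy as np

def mean_aggregate(neighbor_feats):
    """Element-wise mean of neighbor feature vectors (order-invariant)."""
    return np.asarray(neighbor_feats).mean(axis=0)

a = mean_aggregate([[1.0, 4.0], [3.0, 0.0]])
b = mean_aggregate([[3.0, 0.0], [1.0, 4.0]])  # same neighbors, permuted
print(a)                  # [2. 2.]
print(np.allclose(a, b))  # True: symmetric in the neighbor order
```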

Max Pooling Aggregator

Aggregation function applying a non-linear transformation followed by max pooling on neighbor features, capturing the most salient aspects of the neighborhood.
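As a sketch (NumPy, with assumed toy dimensions and a one-layer ReLU network standing in for the per-neighbor transformation): each neighbor is passed through the same small network, then the element-wise maximum is taken across neighbors.

```python
import numpy as np

def max_pool_aggregate(neighbor_feats, W, b):
    """Transform each neighbor with a shared one-layer ReLU network,
    then take the element-wise max across neighbors."""
    H = np.maximum(0.0, np.asarray(neighbor_feats) @ W + b)  # per-neighbor MLP
    return H.max(axis=0)                                     # element-wise max

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))       # assumed 3 -> 5 projection
b = np.zeros(5)
feats = rng.normal(size=(4, 3))   # 4 neighbors, 3-dim features
out = max_pool_aggregate(feats, W, b)
print(out.shape)  # (5,)
```

Taking the max rather than the mean keeps only the strongest response per feature, which is what "most salient aspects" means here.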

LSTM Aggregator

Aggregation function using an LSTM network to sequentially process neighbor features, capturing asymmetric dependencies in the node's neighborhood.

Feature Propagation

Iterative process of transmitting and aggregating information between neighboring nodes to progressively enrich vector representations across multiple layers.

Mini-batch Training

Training strategy dividing the graph into small subsets of nodes to optimize computational efficiency and enable scaling on massive graphs.

Graph Convolution

Operation generalizing 2D convolution to graph structures, combining a node's features with those of its neighbors according to a defined aggregation scheme.

Transductive Learning

Learning paradigm where the model is trained and tested on the same set of fixed nodes, requiring the complete graph structure during inference.

Message Passing

Algorithmic framework where nodes exchange information with their neighbors to update their states, the foundation of graph neural networks.
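One synchronous round can be sketched on a tiny assumed path graph with scalar node states, using a deliberately simple update (sum of self plus incoming messages) to show how information spreads one hop per round:

```python
import numpy as np

def message_passing_step(features, adj):
    """One synchronous round: each node receives its neighbors' current
    states (messages) and updates to the sum of self + incoming messages."""
    n = len(features)
    new = np.array(features, dtype=float)
    for v in range(n):
        for u in adj[v]:
            new[v] += features[u]   # accumulate messages from neighbors
    return new

# Path graph 0 - 1 - 2, scalar state per node.
adj = {0: [1], 1: [0, 2], 2: [1]}
x = np.array([1.0, 0.0, 0.0])
x1 = message_passing_step(x, adj)
print(x1)  # [1. 1. 0.]: node 1 has received node 0's signal
x2 = message_passing_step(x1, adj)
print(x2)  # [2. 2. 1.]: after two rounds the signal reaches node 2
```

Each round widens a node's receptive field by one hop, which is why stacking k layers lets a node see its k-hop neighborhood.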

Graph Representation

Vector encoding capturing the structural and semantic properties of an entire graph, used for graph-level classification or regression tasks.

Sampling Strategy

Method defining how to select neighbors during training, impacting the balance between computational efficiency and preservation of structural information.

Feature Concatenation

Operation combining a node's original features with aggregated information from its neighbors to create an enriched representation before transformation.

Residual Connection

Mechanism allowing direct information bypass between layers, facilitating deep training and preserving the original characteristics of nodes.
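As a sketch (NumPy, with an assumed square weight matrix so input and output shapes match), a layer with a skip connection adds the layer's input back to its output:

```python
import numpy as np

def layer_with_residual(h, W):
    """Add the layer input back to its output so the original node
    features pass through unchanged alongside the transformation."""
    return h + np.maximum(0.0, h @ W)   # h + ReLU(hW): skip connection

rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))   # square so the sum is well-defined
out = layer_with_residual(h, W)
print(out.shape)  # (3, 4)
```

Because the identity path carries gradients directly between layers, deep stacks train more stably and a node's original features are never fully overwritten.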

ReLU Activation

Activation function applied after aggregation to introduce non-linearity, defined as max(0, x) and widely used in GraphSAGE.

Batch Normalization

Normalization technique that standardizes activations over each mini-batch to stabilize training and accelerate convergence in graph neural networks.
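A minimal NumPy sketch of the training-time computation (on an assumed mini-batch of embeddings; running statistics and the inference path are omitted): each feature is standardized over the batch dimension, then rescaled and shifted by learnable parameters.

```python
import numpy as np

def batch_norm(H, gamma, beta, eps=1e-5):
    """Standardize each feature over the mini-batch dimension, then
    apply a learnable scale (gamma) and shift (beta)."""
    mu = H.mean(axis=0)
    var = H.var(axis=0)
    H_hat = (H - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * H_hat + beta

rng = np.random.default_rng(0)
H = rng.normal(loc=5.0, scale=3.0, size=(64, 8))   # a mini-batch of embeddings
out = batch_norm(H, gamma=np.ones(8), beta=np.zeros(8))
print(out.mean(axis=0).round(6))  # ~0 per feature after normalization
```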
