
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms

GraphSAGE

Inductive graph neural network algorithm that samples and aggregates neighbor features to generate node embeddings on large graphs.
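A minimal sketch of one GraphSAGE layer in NumPy, assuming a mean aggregator and a toy triangle graph (the function name, weight shapes, and graph are illustrative, not from any particular library):

```python
import numpy as np

rng = np.random.default_rng(0)

def graphsage_layer(features, adj_list, W, sample_size=2):
    """One GraphSAGE layer: sample neighbors, mean-aggregate their
    features, concatenate with the node's own features, project, ReLU."""
    new_feats = []
    for node, neighbors in enumerate(adj_list):
        # Fixed-size neighbor sample (with replacement if degree < sample_size)
        sampled = rng.choice(neighbors, size=sample_size,
                             replace=len(neighbors) < sample_size)
        agg = features[sampled].mean(axis=0)               # mean aggregation
        combined = np.concatenate([features[node], agg])   # self ++ neighborhood
        new_feats.append(np.maximum(0.0, combined @ W))    # projection + ReLU
    return np.stack(new_feats)

# Toy graph: 3 nodes in a triangle, 2-dim input features, 2-dim output
features = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj_list = [[1, 2], [0, 2], [0, 1]]
W = rng.normal(size=(4, 2))  # shape (2 * in_dim, out_dim) due to concatenation
out = graphsage_layer(features, adj_list, W)
print(out.shape)  # (3, 2)
```

Because the layer only needs a node's features and a sampled neighborhood, it applies unchanged to nodes unseen during training — the inductive property described above.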

Inductive Learning

Learning approach capable of generalizing to unseen nodes or graphs at inference time, unlike transductive learning, which requires the complete graph structure.

Neighbor Sampling

Technique that selects a fixed-size random subset of neighbors for each node during training, enabling efficient processing of large graphs.
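A small sketch of fixed-size neighbor sampling (the helper name and adjacency dict are illustrative):

```python
import random

def sample_neighbors(adj, node, k, seed=None):
    """Return exactly k sampled neighbors of `node`: without replacement
    when the node has at least k neighbors, with replacement otherwise,
    so every node yields the same-size sample regardless of degree."""
    rng = random.Random(seed)
    nbrs = adj[node]
    if len(nbrs) >= k:
        return rng.sample(nbrs, k)                    # without replacement
    return [rng.choice(nbrs) for _ in range(k)]       # with replacement

adj = {0: [1, 2, 3, 4], 1: [0]}
print(sample_neighbors(adj, 0, 2, seed=7))  # two distinct neighbors of node 0
print(sample_neighbors(adj, 1, 3, seed=7))  # [0, 0, 0]: only one neighbor exists
```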

Aggregation Function

Mathematical operation combining features of neighboring nodes to produce an aggregated representation, essential in information propagation on graphs.

Node Embedding

Low-dimensional dense vector representation capturing the structural properties and features of a node in a graph, used for prediction tasks.

Mean Aggregator

Aggregation function computing the element-wise mean of neighboring node features, providing a symmetric and stable representation of local information.
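The symmetry of the mean aggregator can be checked directly — permuting the neighbors leaves the result unchanged (a minimal NumPy sketch; the function name is illustrative):

```python
import numpy as np

def mean_aggregate(neighbor_feats):
    """Element-wise mean of neighbor feature vectors: symmetric, i.e.
    invariant to the order in which neighbors are presented."""
    return np.asarray(neighbor_feats).mean(axis=0)

feats = np.array([[2.0, 0.0], [0.0, 4.0]])
print(mean_aggregate(feats))        # [1. 2.]
print(mean_aggregate(feats[::-1]))  # [1. 2.] — same result, reversed order
```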

Max Pooling Aggregator

Aggregation function applying a non-linear transformation followed by max pooling on neighbor features, capturing the most salient aspects of the neighborhood.
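A sketch of the max-pooling aggregator, assuming a single-layer ReLU transform per neighbor (weights and dimensions are illustrative):

```python
import numpy as np

def max_pool_aggregate(neighbor_feats, W, b):
    """Apply a learned non-linear transform to each neighbor independently,
    then take the element-wise max across neighbors, keeping the most
    salient activation in each dimension."""
    h = np.maximum(0.0, neighbor_feats @ W + b)  # per-neighbor MLP with ReLU
    return h.max(axis=0)                         # element-wise max pooling

rng = np.random.default_rng(1)
neighbors = rng.normal(size=(5, 3))              # 5 neighbors, 3-dim features
W, b = rng.normal(size=(3, 4)), np.zeros(4)
print(max_pool_aggregate(neighbors, W, b).shape)  # (4,)
```

Like the mean aggregator, this is permutation-invariant, since max is taken over the neighbor axis.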

LSTM Aggregator

Aggregation function using an LSTM network to sequentially process neighbor features, capturing asymmetric dependencies in the node's neighborhood.

Feature Propagation

Iterative process of transmitting and aggregating information between neighboring nodes to progressively enrich vector representations across multiple layers.

Mini-batch Training

Training strategy dividing the graph into small subsets of nodes to optimize computational efficiency and enable scaling on massive graphs.
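The node-level part of this strategy can be sketched as shuffling node ids and yielding fixed-size batches; each batch (plus its sampled neighborhood) is then processed instead of the full graph (the generator name is illustrative):

```python
import numpy as np

def node_minibatches(num_nodes, batch_size, seed=0):
    """Shuffle node ids and yield fixed-size mini-batches of them;
    the last batch may be smaller when sizes do not divide evenly."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(num_nodes)
    for start in range(0, num_nodes, batch_size):
        yield order[start:start + batch_size]

batches = list(node_minibatches(10, 4))
print([len(b) for b in batches])  # [4, 4, 2]
```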

Graph Convolution

Operation generalizing 2D convolution to graph structures, combining a node's features with those of its neighbors according to a defined aggregation scheme.
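A minimal sketch of one such scheme — mean aggregation over each node and its neighbors via a degree-normalized adjacency with self-loops, i.e. X' = D⁻¹(A + I)XW (dense matrices and the two-node graph are illustrative):

```python
import numpy as np

def graph_convolution(features, adj, W):
    """Combine each node's features with its neighbors' using a
    degree-normalized adjacency with self-loops, then project."""
    a_hat = adj + np.eye(adj.shape[0])       # add self-loops: A + I
    deg = a_hat.sum(axis=1, keepdims=True)   # degrees of A + I
    return (a_hat / deg) @ features @ W      # mean over self + neighbors, project

adj = np.array([[0.0, 1.0], [1.0, 0.0]])     # two connected nodes
features = np.array([[2.0], [4.0]])
W = np.array([[1.0]])
print(graph_convolution(features, adj, W))   # [[3.] [3.]] — each node averages self + neighbor
```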

Transductive Learning

Learning paradigm where the model is trained and tested on the same set of fixed nodes, requiring the complete graph structure during inference.

Message Passing

Algorithmic framework where nodes exchange information with their neighbors to update their states, the foundation of graph neural networks.
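One synchronous round of message passing can be sketched as each node summing its neighbors' current states and adding them to its own (the additive update rule and path graph are illustrative; real GNNs use learned update functions):

```python
import numpy as np

def message_passing_round(features, adj_list):
    """One synchronous round: each node gathers its neighbors' current
    states (the messages) and updates its own state with their sum."""
    messages = np.stack([features[nbrs].sum(axis=0) for nbrs in adj_list])
    return features + messages  # simple additive update rule

# Path graph 0 - 1 - 2, one scalar feature per node
features = np.array([[1.0], [2.0], [3.0]])
adj_list = [[1], [0, 2], [1]]
print(message_passing_round(features, adj_list))  # [[3.] [6.] [5.]]
```

Stacking k such rounds lets information reach every node within k hops, which is why depth controls a GNN's receptive field.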

Graph Representation

Vector encoding capturing the structural and semantic properties of an entire graph, used for graph-level classification or regression tasks.

Sampling Strategy

Method defining how to select neighbors during training, impacting the balance between computational efficiency and preservation of structural information.

Feature Concatenation

Operation combining a node's original features with aggregated information from its neighbors to create an enriched representation before transformation.

Residual Connection

Mechanism allowing direct information bypass between layers, facilitating deep training and preserving the original characteristics of nodes.
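The bypass is just an addition of the layer's input to its output, so the original node features survive any transform (a minimal sketch; the transform here is an arbitrary illustrative example):

```python
import numpy as np

def layer_with_residual(h, transform):
    """Residual update: the layer's output is added to its input, so the
    original node features pass through unchanged alongside the update."""
    return h + transform(h)

h = np.array([[1.0, -1.0], [0.5, 2.0]])
# Illustrative transform: a small ReLU-gated update
out = layer_with_residual(h, lambda x: np.maximum(0.0, x) * 0.1)
print(out)  # [[1.1 -1.] [0.55 2.2]] — input preserved, small update added
```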

ReLU Activation

Non-linear activation function applied after aggregation to introduce non-linearity, defined as max(0, x) and widely used in GraphSAGE.

Batch Normalization

Normalization technique that standardizes activations over each mini-batch to stabilize training and accelerate convergence in graph neural networks; it also has a mild regularizing effect.
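The per-feature statistics used at training time can be sketched directly (inference-time running statistics and learnable per-feature gamma/beta vectors are omitted for brevity):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the mini-batch to zero mean and unit
    variance, then apply the learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
y = batch_norm(x)
print(np.round(y.mean(axis=0), 6))  # ~[0. 0.]
print(np.round(y.std(axis=0), 3))   # ~[1. 1.]
```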
