AI Glossary
The complete dictionary of artificial intelligence
Fog Computing
Intermediate architecture between cloud and edge that extends computing capabilities to the network periphery while maintaining a centralized connection.
Edge Node
Autonomous peripheral computing device capable of executing AI workloads locally without continuous dependency on the central cloud.
Edge Gateway
Intelligent entry point that connects edge devices to the cloud while ensuring local data processing and security management.
Edge Server
Localized computing infrastructure that provides processing and storage services for AI applications close to end users.
Bandwidth Management
Control and optimization techniques for data flow between edge nodes and the cloud to reduce transmission costs.
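As a minimal sketch of this idea: instead of forwarding every raw sensor reading to the cloud, an edge node can buffer readings locally and upload only periodic aggregates. The class, field names, and batch size below are illustrative, not a real protocol.

```python
from statistics import mean

class BatchingUplink:
    """Buffer sensor readings at the edge and upload only
    periodic aggregates, reducing cloud-bound traffic."""

    def __init__(self, batch_size=10):
        self.batch_size = batch_size
        self.buffer = []
        self.uploads = []  # stands in for actual network sends

    def record(self, value):
        self.buffer.append(value)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            # One small summary message replaces many raw readings.
            self.uploads.append({
                "count": len(self.buffer),
                "mean": mean(self.buffer),
                "max": max(self.buffer),
            })
            self.buffer = []

uplink = BatchingUplink(batch_size=5)
for reading in [20, 21, 19, 22, 23, 24, 25, 23, 22, 21]:
    uplink.record(reading)
# 10 raw readings collapse into 2 aggregate uploads.
```

The same pattern generalizes to compression, delta encoding, or adaptive sampling, all of which trade edge-side compute for reduced transmission cost.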
Edge Analytics
Real-time data analysis directly on peripheral devices to generate immediate insights without network latency.
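A toy example of on-device analytics, assuming a stream of numeric sensor readings: flag a value as a spike when it exceeds a multiple of the rolling mean of recent values. The window size and threshold factor are arbitrary choices for illustration.

```python
from collections import deque

def detect_spikes(stream, window=5, factor=1.5):
    """Flag readings that exceed `factor` times the rolling mean
    of the previous `window` values -- computed entirely on the
    device, so no round trip to the cloud is needed."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(stream):
        if len(recent) == window and value > factor * (sum(recent) / window):
            alerts.append((i, value))
        recent.append(value)
    return alerts

readings = [10, 11, 9, 10, 10, 30, 10, 11, 10, 9]
print(detect_spikes(readings))  # the spike at index 5 is flagged
```

Only the alert (a few bytes) would need to leave the device, rather than the full reading stream.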
On-Device AI
Artificial intelligence executed entirely on the user's device, ensuring data confidentiality and offline operation.
Edge Container
Containerized execution environment optimized for deployment and management of AI applications on distributed edge infrastructure.
Edge Orchestration
Automated management of the AI application lifecycle across multiple edge nodes, including deployment, monitoring, and updates.
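At its core, orchestration is a reconciliation loop: compare the desired deployment state against what each node reports, then act on the difference. The sketch below illustrates that loop; the node names, version strings, and action tuples are hypothetical, not any real orchestrator's API.

```python
def reconcile(desired, observed):
    """Compare the desired deployment state with what each edge
    node reports and return the actions an orchestrator would take.
    Both arguments map node name -> running app version; a node
    absent from `observed` is running nothing."""
    actions = []
    for node, version in desired.items():
        running = observed.get(node)
        if running is None:
            actions.append(("deploy", node, version))   # app missing
        elif running != version:
            actions.append(("update", node, version))   # stale version
    return actions

desired = {"node-a": "v2", "node-b": "v2", "node-c": "v2"}
observed = {"node-a": "v2", "node-b": "v1"}  # node-c reports nothing
print(reconcile(desired, observed))
```

Real orchestrators (e.g. Kubernetes-based edge stacks) run this kind of loop continuously, adding health checks, rollbacks, and staged rollouts on top.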
Edge Accelerator
Specialized hardware (TPU, NPU) designed to optimize the execution of AI workloads on peripheral devices.
Edge Security
Set of protection mechanisms addressing the vulnerabilities specific to distributed computing infrastructures at the network edge.