AI Glossary
The Complete Artificial Intelligence Dictionary
VIO (Visual-Inertial Odometry)
Data fusion technique combining visual information from cameras with inertial sensor data to accurately estimate the position and orientation of a drone. This method offers increased robustness in variable lighting conditions or during rapid movements.
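As a minimal illustration of visual-inertial fusion, the sketch below blends a drifting gyro integration with a noisy but absolute camera-derived angle using a 1-D complementary filter. The rates, weights, and single-axis simplification are assumptions for illustration only; production VIO uses extended Kalman filters or factor-graph optimization over full 6-DoF states.

```python
def complementary_filter(angle_prev, gyro_rate, vision_angle, dt, alpha=0.98):
    """Fuse a high-rate gyro with a low-rate, drift-free visual estimate.

    The gyro integrates smoothly but drifts over time; the camera estimate
    is noisier but absolute. Blending with weight alpha keeps both benefits.
    """
    gyro_angle = angle_prev + gyro_rate * dt        # dead-reckoned update
    return alpha * gyro_angle + (1 - alpha) * vision_angle

# Toy run (invented numbers): true attitude is 0.5 rad; the gyro reports a
# small constant bias of 0.02 rad/s, which alone would drift without bound.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.02, vision_angle=0.5, dt=0.01)
```

A larger alpha trusts the gyro more (smoother, faster response); a smaller alpha leans on vision (less drift, more noise).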
Edge Computing for Robotics
Distributed processing architecture in which AI computations are performed directly on robots or drones rather than in the cloud, reducing latency and improving decision-making autonomy. This approach ensures time-critical responses even without network connectivity.
Neural Network Compression
Set of techniques to reduce the size and complexity of neural networks for their deployment on resource-constrained embedded systems. These methods include quantization, pruning, and knowledge distillation.
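Of these techniques, quantization is the simplest to sketch. Below, a float32 weight tensor is mapped to int8 with a single symmetric scale factor, cutting storage fourfold; the tensor values are arbitrary, and real toolchains also calibrate activation ranges and may quantize per channel.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training quantization of a float32 tensor to int8."""
    scale = np.abs(w).max() / 127.0 if np.abs(w).max() > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to approximate float32 values."""
    return q.astype(np.float32) * scale

np.random.seed(0)
w = np.random.randn(4, 4).astype(np.float32)   # toy weight tensor
q, s = quantize_int8(w)
err = np.abs(w - dequantize(q, s)).max()        # bounded by scale / 2
```

The worst-case rounding error is half the scale step, which is why layers with large outlier weights quantize poorly under a single shared scale.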
Real-time Perception System
Sensory processing system capable of analyzing and interpreting the environment in real time to guide autonomous decisions of a robot or drone. It combines computer vision, object detection, and semantic segmentation for immediate contextual understanding.
Obstacle Avoidance Algorithm
Embedded algorithm using lidar, stereo vision, or ultrasonic data to dynamically detect and avoid obstacles during autonomous navigation. These systems must operate with latencies below 100 ms to ensure operational safety.
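A minimal potential-field-style sketch of this idea, assuming 1-D range readings paired with bearing angles (as a lidar scan would provide); the safe distance and the two-reading example are invented, and real systems add velocity constraints and temporal filtering.

```python
import math

def repulsive_steer(ranges, angles, safe_dist=2.0):
    """Sum repulsive vectors from range readings closer than safe_dist.

    Returns a (dx, dy) steering correction pointing away from nearby
    obstacles; readings beyond safe_dist contribute nothing.
    """
    dx = dy = 0.0
    for r, a in zip(ranges, angles):
        if 0.0 < r < safe_dist:
            push = (safe_dist - r) / safe_dist   # stronger when closer
            dx -= push * math.cos(a)             # push away from obstacle
            dy -= push * math.sin(a)
    return dx, dy

# Obstacle dead ahead (bearing 0) at 0.5 m, another far away at 5.0 m:
# the correction points backwards along x, away from the near obstacle.
dx, dy = repulsive_steer([0.5, 5.0], [0.0, math.pi / 2])
```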
Path Planning with Dynamic Replanning
Ability of an autonomous system to calculate an optimal trajectory towards a goal while continuously adjusting this trajectory in response to environmental changes. This functionality is essential for navigation in dynamic and unpredictable environments.
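A toy sketch of plan-then-replan on an occupancy grid, using breadth-first search as the planner; the grid size, obstacle position, and replanning trigger are all assumptions, and real systems use A*, D* Lite, or sampling-based planners with continuous dynamics.

```python
from collections import deque

def plan(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid via BFS, or None."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:                    # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < len(grid) and 0 <= ny < len(grid[0])
                    and grid[nx][ny] == 0 and (nx, ny) not in came_from):
                came_from[(nx, ny)] = cur
                queue.append((nx, ny))
    return None

grid = [[0] * 5 for _ in range(5)]
path = plan(grid, (0, 0), (4, 4))          # initial plan on a free grid
grid[2][2] = 1                             # new obstacle detected mid-flight
if path and (2, 2) in path:
    path = plan(grid, (0, 0), (4, 4))      # replan around the obstacle
```

The key point is the loop structure: sense, check whether the current plan is still valid, and replan only when it is not.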
Multi-sensor Fusion Architecture
System integrating and correlating data from multiple sensors (cameras, lidar, IMU, GPS) to build a coherent, robust representation of the environment. This fusion improves perception reliability and accuracy and provides resilience against individual sensor failures.
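For independent Gaussian estimates, a standard fusion rule is inverse-variance weighting, sketched below in one dimension. The sensor values and variances are invented for illustration; full systems apply Kalman or factor-graph fusion over complete state vectors.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent scalar estimates.

    Each sensor reports (value, variance); less noisy sensors get more
    weight, and the fused variance is lower than any single sensor's.
    """
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    variance = 1.0 / sum(weights)
    return value, variance

# Hypothetical 1-D position: GPS says 10.0 m (variance 4.0 m^2),
# lidar odometry says 10.6 m (variance 1.0 m^2).
pos, var = fuse([(10.0, 4.0), (10.6, 1.0)])
```

Note how the result sits closer to the low-variance lidar estimate, and the fused variance (0.8) beats both inputs.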
Embedded Computer Vision
Implementation of computer vision algorithms optimized to run on embedded processors with power and memory constraints. These systems enable object recognition, tracking, and scene analysis directly on the robot or drone.
Autonomous Navigation System
An integrated set of software and hardware modules that allows a robot or drone to move autonomously from one point to another without human intervention. This system combines localization, mapping, trajectory planning, and motor control.
Onboard Learning Capability
The ability of an embedded system to adapt and improve its AI models directly on the device based on new environmental experiences. This feature enables continuous adaptation without requiring a cloud connection for retraining.
Low-latency Decision Making
Real-time decision-making process with latencies below 50 ms, critical for quick reactions of robots and drones in dynamic environments. This performance is achieved through algorithmic optimization and specialized embedded hardware.
Distributed Edge Intelligence
Architecture where multiple robots or drones share and coordinate their AI capabilities to collaboratively accomplish complex tasks. This approach enables decentralized collective intelligence and increased system resilience.
Edge-cloud Continuum
Paradigm in which AI workloads are orchestrated transparently between edge and cloud according to latency, compute, and bandwidth requirements. This continuum optimizes resource utilization while preserving real-time performance for critical tasks.
Federated Learning at Edge
Collaborative learning approach where multiple robots or drones collectively train AI models without sharing their raw data, thus preserving privacy while benefiting from collective intelligence. This method is particularly suitable for distributed drone fleets.
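A toy sketch of the federated averaging (FedAvg) scheme this describes: two simulated clients fit y = 3x on private data, and only model weights ever reach the server. The learning rate, data, and round count are assumptions chosen to make the toy converge.

```python
def local_step(w, data, lr=0.1):
    """One gradient step of least-squares fit y = w*x on private client data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(w_global, clients, sizes):
    """FedAvg round: each client trains locally, the server averages the
    resulting weights in proportion to client dataset size. Raw data never
    leaves the client."""
    local = [local_step(w_global, d) for d in clients]
    return sum(w * n for w, n in zip(local, sizes)) / sum(sizes)

# Two drones each hold private samples of the same relation y = 3x.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = 0.0
for _ in range(50):
    w = fed_avg(w, clients, sizes=[2, 1])
```

After the rounds, the shared weight approaches the true slope 3 even though neither drone ever transmitted its observations.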
Real-time SLAM Optimization
Algorithmic and hardware optimization techniques enabling SLAM execution at frequencies above 30 Hz on embedded processors. These optimizations are crucial for stable and precise navigation of high-speed drones.
Edge-based Semantic Segmentation
Ability to classify each pixel of an image into semantic categories directly on the embedded device, enabling detailed environment understanding for autonomous navigation. This feature is essential for intelligent interaction with complex environments.
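At inference time, segmentation reduces to a per-pixel argmax over class scores. The sketch below assumes a 2x3 image with invented logits for three classes; a real system would obtain these scores from a lightweight convolutional network running on the device.

```python
import numpy as np

# Toy per-pixel class scores for a 2x3 image and 3 classes
# (0 = sky, 1 = ground, 2 = obstacle); values are invented.
logits = np.array([
    [[2.0, 0.1, 0.0], [1.5, 0.2, 0.1], [0.1, 0.1, 3.0]],
    [[0.2, 2.5, 0.1], [0.1, 2.0, 0.3], [0.0, 0.5, 2.8]],
])  # shape (H, W, C)

# The label map: for each pixel, pick the class with the highest score.
seg = logits.argmax(axis=-1)   # shape (H, W)
```

The resulting label map can feed directly into navigation logic, e.g. treating all pixels labeled "obstacle" as no-go regions.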
Autonomous Swarm Intelligence
A system in which swarms of drones or robots operate in a coordinated, autonomous manner using local rules and limited inter-agent communication. This emergent collective intelligence enables complex tasks that would be impossible for individual units.
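A minimal 1-D consensus sketch of such local rules: each agent senses only neighbors within a fixed radius and nudges toward their centroid, with no central controller. The radius, gain, and initial positions are invented; classic flocking models (boids) add separation and alignment terms on top of this cohesion rule.

```python
def swarm_step(positions, neighbor_radius=2.0, gain=0.1):
    """One swarm update: each agent moves slightly toward the centroid of
    the neighbors it can sense locally. Isolated agents stay put."""
    new = []
    for i, p in enumerate(positions):
        nbrs = [q for j, q in enumerate(positions)
                if j != i and abs(q - p) <= neighbor_radius]
        centroid = sum(nbrs) / len(nbrs) if nbrs else p
        new.append(p + gain * (centroid - p))
    return new

# Four agents spread along a line; no agent sees the whole swarm at once,
# yet repeated local averaging contracts the group to a consensus point.
pos = [0.0, 1.0, 2.0, 3.0]
for _ in range(200):
    pos = swarm_step(pos)
```

Because each update is a convex combination of sensed positions, the swarm never leaves the hull of its starting positions, which is a useful safety property of purely local rules.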