Transformers for detection
Query-to-Attention
A mechanism in which learned object queries direct the model's attention to the image regions relevant to each predicted object, rather than attending globally and uniformly. This improves prediction efficiency and lets each query specialize in a particular kind of object or location.
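A minimal sketch of the idea: each object query computes scaled dot-product attention over all image feature positions, producing a per-query weighting that concentrates on relevant regions. The function and variable names here are illustrative, not from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def query_cross_attention(object_queries, image_features):
    """Each object query attends over all image feature positions.

    object_queries: (num_queries, d) learned query vectors (hypothetical shapes)
    image_features: (num_positions, d) flattened feature-map vectors
    Returns the attended feature per query and the attention weights.
    """
    d = object_queries.shape[-1]
    # Similarity of every query to every image position, scaled by sqrt(d).
    scores = object_queries @ image_features.T / np.sqrt(d)
    # Each row is a distribution over image positions (sums to 1):
    # this is the "query-guided" attention map for one object query.
    weights = softmax(scores, axis=-1)
    return weights @ image_features, weights

# Toy example: 4 object queries over a flattened 7x7 feature map.
rng = np.random.default_rng(0)
num_queries, num_positions, d = 4, 49, 8
queries = rng.normal(size=(num_queries, d))
features = rng.normal(size=(num_positions, d))
out, w = query_cross_attention(queries, features)
```

Because each query produces its own attention distribution, different queries can focus on different regions, which is what enables the specialization described above.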