Model Robustness
Physical Adversarial Attacks
Attacks in which adversarial perturbations are applied to physical objects in the real world to deceive vision systems, for example adversarial stickers placed on a stop sign. Unlike purely digital attacks, these perturbations must remain effective across varying lighting conditions, viewing angles, distances, and other environmental variables.
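A standard way to make a perturbation survive environmental variation is Expectation over Transformation (EOT): instead of attacking one fixed input, optimize the perturbation against the *average* model response under randomly sampled physical conditions. The sketch below illustrates the idea on a toy linear classifier with a hypothetical brightness/offset transformation; the model, weights, and hyperparameters are illustrative assumptions, not a real vision pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear classifier: positive score means "object detected".
# (hypothetical stand-in for a real vision model)
w = rng.normal(size=16)
x = rng.normal(size=16) + w  # clean input the classifier scores positively

def transform(img, rng):
    """Sample one random physical condition: lighting scale and exposure offset."""
    a = rng.uniform(0.6, 1.4)   # lighting intensity
    b = rng.normal(0.0, 0.1)    # sensor/exposure offset
    return a * img + b, a

def eot_attack(x, w, steps=200, lr=0.05, eps=0.5, n_samples=32):
    """Optimize a bounded perturbation that lowers the classifier score
    on average over the sampled transformations (the EOT objective)."""
    delta = np.zeros_like(x)
    for _ in range(steps):
        grad = np.zeros_like(x)
        for _ in range(n_samples):
            _, a = transform(x + delta, rng)
            # d/d(delta) of the score w @ (a*(x+delta)+b) is a*w
            grad += a * w
        grad /= n_samples
        delta -= lr * grad                 # descend the expected score
        delta = np.clip(delta, -eps, eps)  # keep the perturbation small
    return delta

delta = eot_attack(x, w)
clean_score = w @ x
adv_scores = [w @ transform(x + delta, rng)[0] for _ in range(100)]
print(clean_score > 0, float(np.mean(adv_scores)) < clean_score)
```

Averaging the gradient over many sampled conditions is what distinguishes this from a single-image attack: the resulting perturbation degrades the score robustly across lighting variations rather than only in one fixed setting.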