Model Robustness
Gradient masking
A defense technique that obscures or flattens the model's gradients so that attackers cannot compute effective adversarial perturbations from them. Although it may appear effective, gradient masking is often bypassed by more sophisticated attacks, such as gradient-free, transfer-based, or gradient-approximation methods (e.g., BPDA).
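A minimal sketch of the failure mode, using a toy model (the linear weights, the hard-step "defense", and the finite-difference attacker are all illustrative assumptions, not a real defense): masking the gradient with a non-differentiable step gives gradient-based attackers zero signal, yet a large enough perturbation still flips the decision.

```python
import numpy as np

def model_logit(x, w):
    # Toy linear "model": logit = w . x (illustrative, not a real network).
    return float(np.dot(w, x))

def defended_score(x, w):
    # Gradient masking via a hard step: the output is piecewise constant,
    # so its gradient is zero almost everywhere.
    return 1.0 if model_logit(x, w) > 0 else 0.0

def finite_diff_grad(f, x, eps=1e-4):
    # Numerical gradient an attacker might estimate from the masked output.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

w = np.array([1.0, -2.0])
x = np.array([0.5, 0.1])          # logit = 0.3 -> classified as 1.0

# The masked gradient is all zeros: gradient-based attacks get no signal.
g = finite_diff_grad(lambda v: defended_score(v, w), x)
print(g)                           # [0. 0.]

# But the model is still vulnerable: a larger perturbation (found e.g.
# by a transfer or black-box attack) flips the decision anyway.
x_adv = x + np.array([0.0, 0.2])  # logit = -0.1 -> classified as 0.0
print(defended_score(x, w), defended_score(x_adv, w))
```

The zero gradient hides the attack direction without removing the underlying vulnerability, which is why attacks that do not rely on the defended model's gradients still succeed.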