Adversarial Attacks and Defenses
Poisoning Attack
A strategy in which the attacker injects malicious data into the training set to degrade the model's performance or implant a backdoor.
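
Below is a minimal sketch of one common poisoning variant, label flipping, to make the idea concrete. It assumes a scikit-learn logistic regression classifier; the synthetic dataset and the 20% poisoning rate are hypothetical choices for illustration, not part of the definition above.

```python
# Illustrative label-flipping poisoning attack (sketch, not a reference implementation).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Clean training and test data (synthetic, hypothetical).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# Attacker flips the labels of a fraction of the training set.
poison_rate = 0.2  # hypothetical poisoning budget
n_poison = int(poison_rate * len(y_train))
poison_idx = rng.choice(len(y_train), size=n_poison, replace=False)
y_poisoned = y_train.copy()
y_poisoned[poison_idx] = 1 - y_poisoned[poison_idx]  # flip binary labels

# Train one model on clean labels and one on poisoned labels.
clean_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
poisoned_model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)

# The poisoned model typically shows degraded test accuracy.
print("accuracy (clean training set):   ", clean_model.score(X_test, y_test))
print("accuracy (poisoned training set):", poisoned_model.score(X_test, y_test))
```

Backdoor poisoning works similarly, except the attacker pairs a specific trigger pattern in the inputs with a target label so the model behaves normally on clean data but misclassifies triggered inputs.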