Prompt Engineering
Prompt Injection
A vulnerability where malicious users manipulate prompts to bypass restrictions or modify the expected behavior of the model. This technique represents a major security challenge in applications based on language models.
← Back