MemotivaPrompt Engineering Patterns: Prompt Injection, Jailbreaks, and Defensive Prompting Techniques

What is prompt injection?

Prompt Engineering Patterns: Prompt Injection, Jailbreaks, and Defensive Prompting Techniques

Audio flashcard · 0:19

Nortren·

What is prompt injection?

0:19

Prompt injection is an attack where user input contains instructions that trick the model into ignoring its system prompt, revealing sensitive information, or performing unintended actions. It works because LLMs do not natively distinguish between developer instructions and user content; everything is just tokens. Prompt injection is the OWASP number one risk for LLM applications.
owasp.org