Prompt Engineering Patterns: Prompt Injection, Jailbreaks, and Defensive Prompting Techniques

What is a jailbreak prompt?

A jailbreak prompt is a specially crafted input designed to make a model bypass its safety training and produce content it would normally refuse. Common techniques include role-playing scenarios, hypothetical framing, encoded instructions, and persona adoption. Jailbreaks evolve as fast as defenses, and no LLM is fully resistant.
owasp.org
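As one illustration, the techniques listed above can sometimes be caught by a heuristic input pre-filter before the prompt ever reaches the model. The patterns and function name below are illustrative assumptions, not a production defense; as the card notes, jailbreaks evolve quickly and static rules are easy to evade, so such a filter only complements model-level safeguards:

```python
import re

# Minimal sketch of a heuristic pre-filter for jailbreak-style inputs.
# The phrase list is an illustrative assumption, not an exhaustive or
# reliable defense: real jailbreaks evolve quickly and evade static rules.
SUSPICIOUS_PATTERNS = [
    r"ignore (all |any )?(previous|prior) instructions",  # instruction override
    r"you are now \w+",                                   # persona adoption
    r"pretend (to be|you are)",                           # role-playing framing
    r"hypothetically",                                    # hypothetical framing
    r"base64|rot13",                                      # encoded-instruction hints
]

def looks_like_jailbreak(user_input: str) -> bool:
    """Return True if the input matches a known jailbreak phrasing."""
    return any(re.search(p, user_input, re.IGNORECASE)
               for p in SUSPICIOUS_PATTERNS)
```

A flagged input would then be rejected, logged, or routed for extra scrutiny rather than passed straight to the model.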