The Daily Jailbreak

The Daily Jailbreak

A Prompt Engineer's Wordle

9 followers

Practice your understanding of prompt security by crafting the shortest possible prompt that tricks a LLM into calling the forbidden function. You are given the full set of instructions sent to a large language model.
The Daily Jailbreak gallery image
The Daily Jailbreak gallery image
The Daily Jailbreak gallery image
Free
Launch tags:Artificial Intelligenceβ€’Tech
Launch Team / Built With