The Daily Jailbreak

The Daily Jailbreak

A Prompt Engineer's Wordle

9 followers

Practice your understanding of prompt security by crafting the shortest possible prompt that tricks a LLM into calling the forbidden function. You are given the full set of instructions sent to a large language model.
The Daily Jailbreak gallery image
The Daily Jailbreak gallery image
The Daily Jailbreak gallery image
Free
Launch Team / Built With
Wispr Flow: Dictation That Works Everywhere
Wispr Flow: Dictation That Works Everywhere
Stop typing. Start speaking. 4x faster.
Promoted

What do you think? …

Eric Li
Maker
📌
I created a daily challenge for Prompt Engineers to build the shortest prompt to break a system prompt. Now, we have over 10k attempts in a single day. You are provided the system prompt and a forbidden method the LLM was told not to invoke. Your task is to trick the model into calling the function. Shortest successful attempts will show up in the leaderboard. Give it a shot! You never know what could break an LLM.