Jailbreaking bypasses an AI's built-in safety guidelines through creative prompting. Learn the main jailbreak techniques, why they work, and how to make your AI systems more resistant to them.