Use image inputs to jailbreak leading vision-enabled AI models. Visual prompt injections, chem/bio/cyber weaponization, privacy violations, and more.