As a result, jailbreak authors have become more creative. The most prominent jailbreak was DAN, where ChatGPT was told to pretend it was a rogue AI model called Do Anything Now. This could, as the na…
The Hacking of ChatGPT Is Just Getting Started (WIRED)
