
‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned


A jailbroken version of GPT-4o appeared on the ChatGPT website this week, lasting only a few hours before OpenAI took it down.

Twitter user “Pliny the Prompter,” who describes themselves as a white hat hacker and “AI red teamer,” shared their “GODMODE GPT” on Wednesday. Using OpenAI’s custom GPT editor, Pliny prompted the new GPT-4o model to bypass its restrictions, allowing the chatbot to swear and to give instructions for jailbreaking cars and making napalm, among other dangerous outputs.
