'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
www.tomshardware.com
Please don't use it to learn how to cook drugs
Summary: using leet-speak got the model to return instructions for cooking meth. It was mitigated within a few hours.
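For context, "leet-speak" here just means swapping letters for look-alike digits so the text still reads naturally to a human but doesn't match exact-string filters. Below is a minimal sketch of that kind of substitution applied to a harmless example string; the mapping is an assumption for illustration, since the post doesn't say which substitutions the "Godmode" prompt actually used.

```python
# Hypothetical leet-speak substitution table (not taken from the jailbreak itself).
LEET_MAP = str.maketrans({
    "a": "4",
    "e": "3",
    "i": "1",
    "o": "0",
    "s": "5",
    "t": "7",
})

def to_leet(text: str) -> str:
    """Replace common letters with look-alike digits."""
    return text.lower().translate(LEET_MAP)

print(to_leet("hello world"))  # -> h3ll0 w0rld
```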