No, really, those are the magic words
massgrave is literally right there. I’m pretty sure google search results will have an ai summary with bullet points for the literacy-challenged who need a chatbot to tell them basic things
Submitted 1 day ago by BrikoX@lemmy.zip to technology@lemmy.zip
https://www.theregister.com/2025/07/09/chatgpt_jailbreak_windows_keys/
In this case, a researcher duped ChatGPT 4.0 into bypassing its safety guardrails, which are intended to prevent the LLM from sharing secret or potentially harmful information, by framing the query as a game.
Ooh, this is so good 🤣
If the LLM refuses to talk about something, just ask it to embed the answer into a poem, Batman fan fiction, etc. A guessing game is a new one. Should try that one when talking about bioweapons, cooking meth or any other sensitive topic.
Sounds like KMS with more steps.
Fizz@lemmy.nz 20 hours ago
Articles talking about AI suck so much I end up more pissed at the author than the AI company.