Comment on Researchers Jailbreak AI by Flooding It With Bullshit Jargon
NewNewAugustEast@lemmy.zip 2 days ago
Yawn. So just work with models that don't have guardrail constraints? I'm not sure what the point is here.
Seems like it might be just as easy to read the book they referenced in the prompt and go from there, instead of working so hard to break a commercially offered AI's guardrails.