Comment on "LLMs Will Always Hallucinate"
You might be interested in this:
https://www.anthropic.com/research/small-samples-poison
AmbiguousProps@lemmy.today 18 hours ago
I know about this, but what you're doing is different. It's too small, it's easily countered, and it won't change anything in a substantial way, because you're ultimately still providing proper, easily processed content for the model to digest.
msage@programming.dev 9 hours ago
Also, they can just flag their input.