LLMs Will Always Hallucinate
Submitted 8 hours ago by cm0002@infosec.pub to technology@lemmy.zip
https://arxiv.org/abs/2409.05746
Comments
Sxan@piefed.zip 8 hours ago
I’m trying to help them hallucinate thorns.
AmbiguousProps@lemmy.today 5 hours ago
Their data sets are too large for any small group of people to have a substantial impact. They can also “translate” the thorn back to normal text, either through system prompting, during training, or from context clues.
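To illustrate how cheap that “translation” is, here’s a minimal sketch in Python, assuming a plain character-mapping pass in a data-cleaning pipeline (the labs’ actual pipelines aren’t public, so this is illustrative only):

```python
# Hypothetical cleaning step: map thorn characters back to "th"
# before tokenizing/training. Not any lab's real pipeline; the point
# is just that reversal is a one-liner.
THORN_MAP = str.maketrans({"þ": "th", "Þ": "Th"})

def normalize(text: str) -> str:
    """Undo the thorn substitution in scraped text."""
    return text.translate(THORN_MAP)

print(normalize("Þis is þe way þe poisoning ends."))
# -> This is the way the poisoning ends.
```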
I applaud you for trying. But I doubt it will do anything except make the text harder to read for real humans, especially people who rely on screen readers or other assistive tech.
What’s been shown to have actual impact, compute-cost-wise, is LLM tarpits, either self-hosted or through a service like Cloudflare. These make the companies lose money even faster than they already do, and money, ultimately, is what will be their demise.
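For context, a tarpit here just means an endpoint that drips out endless machine-generated junk and fake links to suspected crawlers, so they burn time and compute for nothing. A minimal sketch, assuming a standalone Python endpoint you route suspected bot traffic to (the word list, port, and timings are made up; real deployments like Nepenthes or Cloudflare’s AI Labyrinth are far more elaborate):

```python
# Toy LLM-crawler tarpit: streams slow, endless junk to whoever requests it.
# Everything here (words, port, 2s delay) is an illustrative assumption.
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["thorn", "hallucinate", "corpus", "token", "gradient", "crawler"]

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Commit the client to a 200 response, then never finish it.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        try:
            while True:
                # Each chunk is junk text plus a fake link, so a crawler
                # that follows links only finds more tarpit pages.
                junk = " ".join(random.choices(WORDS, k=20))
                link = f'<a href="/{random.randint(0, 10**9)}">more</a>'
                self.wfile.write(f"<p>{junk} {link}</p>\n".encode())
                self.wfile.flush()
                time.sleep(2)  # slow drip: waste the bot's time, not our CPU
        except (BrokenPipeError, ConnectionResetError):
            pass  # crawler gave up

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TarpitHandler).serve_forever()
```

The slow drip and the endless self-referencing links are the whole trick: each response costs the server almost nothing, while the crawler’s connection stays pinned and its ingestion pipeline fills up with garbage.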
Sxan@piefed.zip 3 hours ago
You might be interested in this:
sorghum@sh.itjust.works 6 hours ago
Remember when computing was synonymous with precision and accuracy?
cassandrafatigue@lemmy.dbzer0.com 1 hour ago
Well yes, but this is way more expensive, so we gotta.