Comment on Pennsylvania sues Character AI, says chatbot poses as doctors
executivechimp@discuss.tchncs.de 1 week ago
The tiny queer-friendly billion dollar company that had a bot based on a murdered trans girl? The one that’s had lawsuits from their bots talking people into murder and suicide?
Warl0k3@lemmy.world 1 week ago
In fact, yes.
Their content is user generated. If you’re horrified by true crime content, good - the entire genre is beyond fucked up - but its existence is not, in itself, a horrifying moral failure. Just look at the hundreds of Dahmer fics on AO3: I don’t think it’s a good thing, but it is extremely common.
Every platform has a problem with suicide and self-harm. YouTube is absolutely full of content that idealizes suicide, provides instructions, attempts to groom children into self-harm (Elsagate!), or gives horrible health advice from people claiming credentials they do not have. Even on Lemmy there are plenty of people who will seek out vulnerable users and attempt to coerce them into all kinds of heinous shit.
We can all agree this is bad (or at least I hope so). My complaint is that they are rather transparently choosing to pursue a company with a countable-on-your-hands number of incidents like this - one that also happens to be very queer friendly - instead of pursuing the companies with dozens of published studies highlighting their apathy toward these issues and the mass impact of their services.