
Before We Blame AI For Suicide, We Should Admit How Little We Know About Suicide

2 likes

Submitted 1 day ago by schnurrito@discuss.tchncs.de to technology@lemmy.zip

https://www.techdirt.com/2026/02/19/before-we-blame-ai-for-suicide-we-should-admit-how-little-we-know-about-suicide/


Comments

  • MagnificentSteiner@lemmy.zip 1 day ago

    lemmy.zip/post/56037025

    Image

  • atrielienz@lemmy.world 20 hours ago

    While I don’t agree that AI alone caused anything (there had to be some instability there for the words of the AI to manipulate), I can absolutely agree that use of the AI is a very apparent contributing factor in the cause.

    With suicide you need some very specific circumstances:

    1. Opportunity. A time and place where the person can’t or won’t be stopped from the attempt.
    2. A feeling of pain or helplessness that eclipses that person’s ability to deal with it or find an outlet for it.
    3. Means/mode. A bus, a rope and anchor point, a weapon.
    4. Intent.

    I think the last one is where things get a bit murky from a legal standpoint.

    Barring accidental suicide, what can legally be considered responsible for causing a suicide is limited. If you encourage a suicidal person to kill themselves, you as the other person had intent to harm, even if you didn’t mean for them to actually follow through, or believe that they would.

    My fear is that these legal battles won’t result in the AI being held accountable, because an AI isn’t able to have intent.

    My bigger fear is that the companies who are responsible are not going to be held responsible, for the same reason a gun manufacturer isn’t when someone sticks the barrel in their mouth and pulls the trigger. The argument that it’s a “tool” that’s been “misused” is gonna be thrown around a lot.

    I wish I could believe we’d get more stringent regulations out of such lawsuits. But I just don’t have that kind of hope.

  • borokov@lemmy.world 1 day ago

    “AI causes suicide” is the new “video games cause violence”. In the long chain of events that led up to the act, AI may have been a small link. Like the butterfly that created the hurricane. But chaos theory also shows us that the hurricane would have happened anyway.

    • BertramDitore@lemmy.zip 22 hours ago

      These things are very different, and imo not a valid comparison.

      Sure, someone might die by suicide no matter what cultural or social inputs they’re exposed to, but it’s equally possible that they wouldn’t, and only did so because of constant pressure and personalized conspiratorial fantasies fabricated by an unthinking and unfeeling algorithm.

      Video games, though we might personally identify with them and their characters, are not actively refining and tailoring themselves to infiltrate our psyche like some LLMs are. I’d go as far as to say video games are passive when compared to LLMs. You play a video game that exists as a single piece of work with predictable outcomes based on your inputs, in a completely fictional space. LLMs are essentially the opposite. LLMs play you and tailor their responses to achieve outcomes they predict you want in the real world, without actually understanding or caring about the potential consequences. Still not necessarily a 1:1 cause and effect, but the incentives are super different.

      • borokov@lemmy.world 7 hours ago

        Interesting analysis indeed, one that makes me reconsider my position a bit.

        Where the article is right is that we know very little about suicide, depression, and cognitive dysfunction in general. As it says, the only prevention method we currently have is basically asking “do you plan to kill yourself?”. There are some protocols that can estimate the risk level by asking things like “have you ever considered it before?”, “how many times in the last 2 weeks?”, “have you ever thought through a scenario?”, “have you ever attempted it?”, etc. But some people may “simulate” suicide attempts dozens of times as a cry for help, whereas others will kill themselves on the first try without any prior sign.

        A lonely person may find refuge in an LLM, which can turn into a deadly trap. They could just as easily have ended up in an MMO, on social networks, or in anything else that could lead to the same consequences. I just find it too easy to blame AI because it’s a new way to lose oneself. But where you are right is that we can expect a human community to be naturally kind (most of the time…), whereas an LLM doesn’t even understand the idea of kindness.

        Also, suicide and depression are not always due to social or environmental causes. Sometimes they’re caused by a chemical imbalance in the brain. In that case, there is no psychological treatment and the only solution is medication (in the current state of the science).
