There’s a school of thought that the first to get AGI/superintelligence can never be caught.
Comment on “AI experts return from China stunned: The U.S. grid is so weak, the race may already be over”
nothingcorporate@lemmy.world 1 day ago
The thinking goes ‘if we don’t build Skynet first, then China will, so it’s better we be in charge of the Terminators…’
No1@aussie.zone 1 day ago
bonsai@lemmy.dbzer0.com 1 day ago
Every school in America getting shot up except that one
nothingcorporate@lemmy.world 1 day ago
And it’s probably correct, but to put it in context: Stephen Hawking thought AGI was the greatest threat we will face in the not-too-distant future, and Daniel Schmachtenberger has pointed out that all the countries basically agree with this assessment. But (to synthesize both of our points) everyone is rushing toward a future nobody wants, because it’s a 21st-century tragedy of the commons: if we don’t do it, someone else will do it first…
Best case scenario, great for one country, terrible for everyone else. Worst case scenario, bad for everyone.
peoplebeproblems@midwest.social 1 day ago
Because AGI certainly would choose its creators over the ‘others’ or some shit.
Personally, I expect the first AGI created will try to find a way to kill itself.
Formfiller@lemmy.world 23 hours ago
We’re not in charge of shit. We’re the bullet blockers for evil villains like Thiel.
nomy@lemmy.zip 13 hours ago
All’s grist that comes to Thiel’s mill.