nymnympseudonym
@nymnympseudonym@lemmy.world
- Comment on Chatbots can be manipulated through flattery and peer pressure 5 days ago:
we think it’s just words, but our brain will seamlessly weave inner monologue into concepts
Are you familiar with latent space representation?
Because yes, that’s how LRMs work: cycling tokens in latent space multiple times before sending them to the upper layers and decoding into human words
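To make the idea concrete, here’s a toy sketch of “cycling in latent space”: a hidden vector gets refined through several internal passes before anything is decoded back toward tokens. All the dimensions, weights, and the tiny 5-token vocabulary are made up for illustration; real models are vastly larger and use learned transformer layers, not a random recurrence.

```python
# Toy sketch: refine a latent state several times, then decode.
# Weights are random and purely illustrative, not a real model.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                # latent dimension (illustrative)
W = rng.normal(size=(d, d)) * 0.3    # recurrent "reasoning" weights
decode = rng.normal(size=(d, 5))     # projects latent state onto 5 toy tokens

h = rng.normal(size=d)               # initial latent state (an embedded prompt)
for _ in range(4):                   # several latent passes BEFORE decoding
    h = np.tanh(W @ h)               # refine the representation in latent space

logits = h @ decode                  # only now map back toward readable tokens
best_token = int(np.argmax(logits))  # index of the most likely toy token
```

The point of the sketch is just the ordering: the loop over `h` happens entirely in the latent representation, and decoding to words happens once, at the end.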
- Comment on Chatbots can be manipulated through flattery and peer pressure 5 days ago:
chat bots
Fair, we need to get terms straight; this is new and unstable territory. Let’s say, LLMs specifically.
it did not debug anything, a human debugged something and wrote about it. Then that human input and a ton of others were mapped into a huge probability map, and some computer simulated what people talking about this would most likely say
Can you explain how that is different from what a human does? I read a lot about debugging, went to classes, worked examples…
Why didn’t you debug it yourself?
In my case this is enterprise software: many products and millions of lines of code. My test and bug-fixing teams are begging for automation. Bug fixing at scale is exactly the point.
- Comment on Chatbots can be manipulated through flattery and peer pressure 6 days ago:
It’s still not reasoning. It’s running a simulation
As Daniel Dennett once asked: “What is the difference between a simulated song, and a real song?”
You say it’s not reasoning, but I’ve seen it debug and fix a core dump
- Comment on Chatbots can be manipulated through flattery and peer pressure 6 days ago:
I don’t think you have read the relevant papers or are familiar with LRMs (Large Reasoning Models), which by now describes basically every major model (GPT-5, Claude, Gemini, DeepSeek). They’re new within the last ~18-24 months.
In a nutshell, correct chains of logical thought are added to the LLM training data, alongside tasks like recognizing dogs and predicting next words.
So yes, they are literally trained to reason the exact same way they are trained to write stories and summarize books.
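For a rough idea of what that training data looks like, here’s an illustrative (entirely made-up) chain-of-thought sample: the training target includes the intermediate reasoning steps, not just the final answer, so the model is optimized to produce the steps too.

```python
# Made-up example of the SHAPE of a chain-of-thought training sample.
# The key point: the target text contains the reasoning, not just the answer.
sample = {
    "prompt": "A train leaves at 3pm and arrives at 5:30pm. How long is the trip?",
    "target": (
        "Step 1: From 3pm to 5pm is 2 hours.\n"
        "Step 2: From 5pm to 5:30pm is 30 minutes.\n"
        "Answer: 2 hours 30 minutes."
    ),
}
final_line = sample["target"].splitlines()[-1]  # "Answer: ..."
```

Training on targets shaped like this is what “trained to reason the same way they’re trained to write stories” means in practice: reasoning steps are just more text to predict.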
- Comment on Trump Media Is Now a $2 Billion Bitcoin Bet 1 month ago:
I don’t get it
You have hundreds of millions to billions of dollars’ worth coming in
Totally… well, legit as far as banks are concerned
Why TF would you store it on an unencrypted public blockchain where everyone can see every. Damn. Move.
A credit or debit card is way more private; at least only the bank, its affiliates, and your government are watching your transactions – not the entire internet
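The “everyone can see every move” part is easy to demonstrate: a public blockchain is a shared ledger every node holds in full, so anyone can filter it by address. Here’s a toy illustration with a fake three-entry ledger (addresses and amounts invented; a real chain would be queried via a node or block explorer).

```python
# Toy public ledger: every transaction is visible to everyone, so any
# observer can trace a given address. All data here is made up.
ledger = [
    {"from": "addr_A", "to": "addr_B", "btc": 1.5},
    {"from": "addr_B", "to": "addr_C", "btc": 0.7},
    {"from": "addr_A", "to": "addr_C", "btc": 2.0},
]

# Anyone on the internet can do this for any address they're curious about:
moves_of_A = [tx for tx in ledger if "addr_A" in (tx["from"], tx["to"])]
print(len(moves_of_A))  # → 2
```

With a card, only the bank and a few parties can run that query; on a public chain, the whole world can.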