Comment on AI Eroded Doctors' Ability to Spot Cancer Within Months in Study
ChairmanMeow@programming.dev 1 day ago
It’s true that if a tool is objectively better, then it makes little sense not to use it.
But LLMs aren’t that good yet. There’s a reason senior developers are complaining about vibecoding juniors; their code quality is often just bad. And when pressed, they often can’t justify why their code is a certain way.
As long as experienced developers are able to do proper code review, quality control is maintained. But a vibecoding developer isn’t good at reviewing, and code review is an absolutely essential skill to have.
I see this at my company too. There’s a handful of junior devs who have managed to be fairly productive with LLMs. And to the LLM’s credit, the code is better than it was without it. But when I do code review on their stuff and ask them to explain something, I often get a nonsensical, AI-generated response. And that is a problem. These devs also don’t do much code review themselves, and when they do, they often leave only very minor comments or none at all. Some skip reviews entirely, stating they’re not confident approving code (which is honest, but also problematic, of course).
I don’t mind a junior dev, or any dev for that matter, using an LLM as an assistant. I do mind an LLM masquerading as a developer, using a junior dev as a meat puppet, if you get what I mean.
mindbleach@sh.itjust.works 1 day ago
We’re not talking about LLMs.
These doctors didn’t ask ChatGPT “does this look like cancer.” We’re talking about domain-specific medical tools.
ChairmanMeow@programming.dev 1 day ago
I was responding to a thread by RgoueBananas, who was clearly talking about LLMs, since he drew a parallel with IT.
mindbleach@sh.itjust.works 1 day ago
Are you sure? Check.
Where you jumped in is me, pointing out, repeatedly, that LLMs and IT have nothing to do with the actual article. Y’know, the doctors I keep mentioning? They’re not decorative.
ChairmanMeow@programming.dev 23 hours ago
Hmm, seems I replied to the wrong root comment.
Regardless, the overall point still stands. These tools are great for assistance, but relying on them completely can cause problems. Even these tumor-spotting ML tools aren’t perfect; they too miss things. Combined with a doctor’s skill that’s fine, but if one begins replacing the other, the net benefit will be lower.