Comment on Dell admits consumers don’t care about AI PCs
ch00f@lemmy.world 2 days ago
WTF even is an “AI PC”? I saw an ad for some AI laptop. To my knowledge, nobody is running LLMs on their personal hardware, so do these computers have like…a web browser?
Gsus4@mander.xyz 2 days ago
You can run one on your laptop; I’ve tried it before. What’s truly hard is training one.
virku@lemmy.world 2 days ago
It seems my laptop at work has a neural chip. I guess it's a special AI-only GPU. I don’t think I could care less about a laptop feature.
artyom@piefed.social 2 days ago
Lots of people are. Typically it means they have an NPU.
ch00f@lemmy.world 2 days ago
I’m talking about an ad I saw on broadcast television during a football game. I don’t think the broad market of people are downloading models from huggingface or whatever.
artyom@piefed.social 1 day ago
The ad you saw said no one was running local AI?
ch00f@lemmy.world 1 day ago
The ad was people doing generic AI stuff. I think it was even showing Copilot.
Either way, the marketing for AI is far too nebulous for it to matter. Just looking for the ad, I found plenty (like this one) that explicitly mention “on-device AI,” but show people just searching for shit or doing nebulous office work. This ad even shows generating images in MS Paint, which offloads the AI shit to the cloud.
gravitas_deficiency@sh.itjust.works 2 days ago
It’s two things:
- a machine that has NN-optimized segments on the CPU, or a discrete NPU
- microslop’s idiotic marketing and branding around trying to get everyone to use Copilot
capuccino@lemmy.world 2 days ago
Computers now come with an NPU (Neural Processing Unit) to do that job… So yeah.
ch00f@lemmy.world 2 days ago
What kind of consumer-facing software runs on that NPU?
fuckwit_mcbumcrumble@lemmy.dbzer0.com 2 days ago
I know video editing software uses it for things like motion tracking.
It’s all stuff your GPU can do, but the NPU can do it for like 1/10th to 1/100th the power.
atomicbocks@sh.itjust.works 1 day ago
For what it’s worth an NPU is why your phone could tell you that photo is of a cat years before LLMs were the hot new thing. They were originally marketed as accelerators for machine learning applications before everybody started calling that AI.
capuccino@lemmy.world 2 days ago
New versions of Sony Vegas use the NPU to accelerate AI features; nothing humans couldn’t do before.
maccentric@sh.itjust.works 1 day ago
Sony still makes laptops? TIL
halcyoncmdr@lemmy.world 2 days ago
Running an LLM locally is entirely possible with fairly decent modern hardware. You just won’t be running the largest versions of the models. You’re going to run ones intended for local use, almost certainly quantized versions. Those usually are intended to cover 90% of use cases. Most people aren’t really doing super complicated shit with these advanced models. They’re asking it the same questions they typed into Google before, just using phrasing they used 20+ years ago with Ask Jeeves.
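For what it's worth, the back-of-envelope math on why quantized models fit on consumer hardware is simple (illustrative figures only, ignoring activation/KV-cache overhead):

```python
# Rough weight-storage arithmetic for local LLMs.
# A model's weights take roughly (parameter count x bits per weight) of memory;
# quantizing from 16-bit floats down to ~4 bits cuts that by ~4x.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes (decimal GB)."""
    return n_params * bits_per_weight / 8 / 1e9

params_7b = 7e9  # a typical "local-size" 7B-parameter model

print(model_size_gb(params_7b, 16))  # FP16: 14.0 GB -- needs a hefty GPU
print(model_size_gb(params_7b, 4))   # 4-bit quant: 3.5 GB -- fits in laptop RAM
```

So a 7B model that would need a workstation GPU at full precision squeezes into an ordinary laptop once quantized, which is exactly why the local-use builds are the quantized ones.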
master_of_unlocking@piefed.zip 2 days ago
redknight942@sh.itjust.works 2 days ago
It is quite easy to run a distilled local model using a decent rig. I have one that I use right from the terminal.
Ledivin@lemmy.world 2 days ago
They absolutely are.
msage@programming.dev 2 days ago
Statistically relevant portion?
You know they were hyperbolic.
Ledivin@lemmy.world 2 days ago
“To my knowledge” really doesn’t feel like hyperbole at all, IMO
msage@programming.dev 1 day ago
Do 5% of people you know use local LLMs?
If so, you don’t know a lot of people, or you are heavy into the local LLM scene.