Comment on My new laptop chip has an 'AI' processor in it, and it's a complete waste of space
girsaysdoom@sh.itjust.works 3 days ago
This might partially answer your question: https://github.com/ollama/ollama/issues/5186.
It looks like the answer is that it depends on what you want to run: some configurations are partially supported, but there's no clear-cut support yet?
sheogorath@lemmy.world 3 days ago
I tried running some models on an Intel 155H NPU, and inference performance was actually worse than running directly on the CPU. However, it wins on the power consumption front, IIRC.
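Not the commenter's exact setup, but if you want to reproduce a comparison like this, here's a minimal sketch that measures decode throughput through Ollama's local REST API (assuming an Ollama server on the default localhost:11434 and an already-pulled model; the model name "llama3.2" is just a placeholder). Which backend you're measuring (CPU, GPU, or NPU) depends on what the server build actually uses.

```python
# Minimal sketch: measure decode throughput (tokens/s) from a local Ollama
# server via its /api/generate endpoint. Run it once per backend/build you
# want to compare.
# Assumptions: Ollama is running on the default port and the model named
# below has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.2"  # placeholder; use whatever model you actually pulled

def tokens_per_second(prompt: str) -> float:
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return a single JSON object with timing stats
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        stats = json.load(resp)
    # eval_count = tokens generated, eval_duration = time in nanoseconds
    return stats["eval_count"] / (stats["eval_duration"] / 1e9)

if __name__ == "__main__":
    tps = tokens_per_second("Explain what an NPU is in one paragraph.")
    print(f"{MODEL}: {tps:.1f} tokens/s")
```

Power draw would have to be measured separately (wall meter, RAPL counters, etc.), which is where the NPU reportedly comes out ahead.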