Comment on Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B parameter models on your local CPU without GPUs, with up to 6.17x faster inference and 82.2% less energy use on CPUs.

scholar@lemmy.world 6 days ago

No that would require AI (Actual Intelligence)
