felsiq
@felsiq@piefed.zip
- Comment on Megrez2: 21B latent, 7.5B on VRAM, 3B active—MoE on single 8GB card 4 days ago:
Trying to literally ELI5, so this might be oversimplified a bit:
It's a new AI model using a Mixture of Experts (MoE) approach: instead of one big network doing everything, the model is split into several smaller "expert" networks that are each good at certain things, and a router picks which experts to use for each piece of input. MoE models usually need a lot of space on graphics cards and really high-end hardware, but this one fits into 8 GB of space on a card, which is a very common amount to have on a modern graphics card, so many more people will be able to run it.
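If it helps to see the idea in code, here's a rough sketch of an MoE layer in PyTorch. Everything here (the class name, the layer sizes, the expert count) is made up for illustration; it shows the general routing technique, not Megrez2's actual architecture.

```python
# Hypothetical minimal Mixture-of-Experts layer, for illustration only.
# All sizes (dim, n_experts, top_k) are invented, not Megrez2's real config.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, dim=64, n_experts=8, top_k=2):
        super().__init__()
        # Each "expert" is just a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        # The router scores every expert for each token.
        self.router = nn.Linear(dim, n_experts)
        self.top_k = top_k

    def forward(self, x):
        # x: (n_tokens, dim). Pick the top_k best-scoring experts per token.
        scores = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # chosen experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Only the chosen experts actually run, so compute scales with top_k,
        # not with the total expert count (the "21B total, 3B active" idea).
        for e, expert in enumerate(self.experts):
            token_mask, slot = (idx == e).nonzero(as_tuple=True)
            if token_mask.numel():
                out[token_mask] += weights[token_mask, slot, None] * expert(x[token_mask])
        return out

moe = TinyMoE()
print(moe(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

The key point is that each token only passes through a couple of experts, which is why a model can have way more parameters in total than it ever uses at once, and why the active slice can fit on a normal graphics card.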
- Comment on Mozilla Integrates Google Lens for Visual Search in Firefox Desktop 4 days ago:
Hard fucking pass, thanks
- Comment on E-Paper Display Reaches the Realm of LCD Screens 3 weeks ago:
Wow, this is super compelling. Pairing one of these with a Raspberry Pi could actually be so sick
- Comment on Lenovo demos laptop with a screen you can swivel into portrait mode 3 weeks ago:
Both monitors I’ve used over the last decade can do this, and I think I’ve actually turned the screen like twice in that time frame. Laptop screens are already shitty and fragile enough for me, thanks