My new laptop chip has an 'AI' processor in it, and it's a complete waste of space

97 likes

Submitted 8 hours ago by cm0002@piefed.world to technology@lemmy.zip

https://www.pcgamer.com/hardware/my-new-laptop-chip-has-an-ai-processor-in-it-and-its-a-complete-waste-of-space/

source

Comments

  • wuffah@lemmy.world 1 hour ago

    Microsoft, Google, and Apple are all quietly putting these chips and the software infrastructure in place in their operating systems to do on-device classification of content: Windows Recall, Google Secure Core, and Apple Intelligence. These services are obsequiously marketed as being for your benefit, while they are all privacy and surveillance nightmares. When the security-breaking features of these systems are mentioned, each company touts some convoluted workaround to justify the tech.

    The real reason is to watch everything you do on your device, use the tensor hardware you may or may not know you purchased to analyze the data locally, then export and sell that “anonymized” information to advertisers and the government. All while cryptographically tying the data to your device and the device to you for “security”. This enables mass surveillance, digital rights management, and targeted advertising on a scale previously unseen.

    Each of these providers already scans everything you upload to their services:

    Microsoft has been caught using your stored passwords to decrypt uploaded archives.

    Apple developed client side scanning for CSAM before backlash shut it down, and they already locally scan your photos with a machine learning classification algorithm whether you like it or not. You can’t turn it off.

    Google recently implemented local content scanning with SecureCore to protect you from unwanted content.

    I would rather saw off my own nuts with a rusty spork than willfully purchase a device with one of these NPUs in it. I fear that in the next 5-10 years, you won’t be able to avoid them.

    **Do you trust the rising fascist regimes and their tech lackeys in America and the UK to use this information morally and responsibly?**

    Do you really believe that these features that no one asked for and that you cannot disable are for your benefit?

    source
  • corroded@lemmy.world 7 hours ago

    I have to wonder if NPUs are just going to eventually become a normal part of the instruction set.

    When SIMD was first becoming a thing, it was advertised as accelerating “multimedia,” as that was the hot buzzword of the 1990s. Now, SIMD instructions are used everywhere, any place there is a benefit from processing an array of values in parallel.
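
    A minimal sketch of the kind of loop this covers (a hypothetical saxpy-style routine, purely for illustration): with optimizations enabled, mainstream compilers will auto-vectorize plain scalar code like this into SIMD instructions on their own.

    ```c
    #include <stddef.h>

    /* Plain scalar loop: built with -O2/-O3, compilers typically turn this into
     * SSE/AVX (or NEON) instructions, processing several floats per instruction,
     * with no SIMD-specific code required from the developer. */
    void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }
    ```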

    I could see NPUs becoming the same. Developers start using NPU instructions, and the compiler can “NPU-ify” scalar code when it thinks it’s appropriate.

    NPUs are advertised for “AI,” but they’re really just a specialized math coprocessor. I don’t really see this as a bad thing to have. Surely there are plenty of other uses.

    source
    • Dudewitbow@lemmy.zip 7 hours ago

      The problem that (local) AI has at the current moment is that it’s not just a single type of compute, and because of that, any one piece of hardware only covers part of what you might want to do with it.

      On the surface level, “AI” is a mixture of what is essentially FP16, FP8, and INT8 accelerators, and different implementations have been using different ones. NPUs are basically INT8-only, while the GPU-heavy implementations are FP-based, making them not inherently cross-compatible.

      It forces devs to target the NPU only for small things (e.g. background blur on the camera), as there isn’t any consumer-level chip with a massive INT8 coprocessor except for the PS5 Pro (300 TOPS of INT8, compared to around 50 TOPS on laptop CPUs, so a completely different league; the PS5 Pro uses it to upscale).
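
      As a rough sketch of what “INT8 only” means in practice (illustrative code with a made-up per-tensor scale, not tied to any particular NPU SDK): an FP16/FP32 model has to be quantized before an INT8-only accelerator can run it, which is why the two paths aren’t interchangeable.

      ```c
      #include <math.h>
      #include <stdint.h>

      /* Symmetric linear quantization: map a float value to int8 using a
       * per-tensor scale. INT8-only NPUs need weights/activations in this form,
       * while FP16/FP8 GPU kernels consume the floats directly. */
      static int8_t quantize_int8(float x, float scale)
      {
          long q = lroundf(x / scale);
          if (q > 127)  q = 127;   /* clamp to the int8 range */
          if (q < -128) q = -128;
          return (int8_t)q;
      }

      static float dequantize_int8(int8_t q, float scale)
      {
          return (float)q * scale;
      }
      ```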

      source
    • addie@feddit.uk 5 hours ago

      SIMD is pretty simple really, but it’s been a standard-ish feature in CPUs for 30 years, and modern compilers are still only “just about able to sometimes” use SIMD if you’ve got a very simple loop with fixed endpoints. It’s one of those things where you might still fall back to writing assembly - the FFmpeg developers had an article not too long ago about getting a 10% speed improvement by writing all the SIMD by hand.
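
      For a flavour of what writing the SIMD by hand looks like (an illustrative sketch using x86 SSE intrinsics, not FFmpeg’s actual code):

      ```c
      #include <immintrin.h>
      #include <stddef.h>

      /* Hand-vectorized elementwise add: 4 floats per iteration via SSE,
       * plus a scalar tail loop for the leftover elements. */
      void add_arrays(float *dst, const float *a, const float *b, size_t n)
      {
          size_t i = 0;
          for (; i + 4 <= n; i += 4) {
              __m128 va = _mm_loadu_ps(a + i);
              __m128 vb = _mm_loadu_ps(b + i);
              _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
          }
          for (; i < n; i++)   /* scalar tail */
              dst[i] = a[i] + b[i];
      }
      ```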

      Using an NPU means recognising algorithms that can be broken down into parallelizable, networkable steps with information passing between cells. Basically, you’re playing a game of TIS-100 with your code. It’s fragile and difficult, and there’s no chance that your compiler will do that automatically.

      Best thing to hope for is that some standard libraries can implement it, and then we can all benefit. It’s an okay tool for ‘jobs that can be broken down into separate cells that interact’, so some kinds of image processing, maybe things like liquid flow simulations. There’s a very small overlap between ‘things that are just algorithms that the main CPU would do better’ and ‘things that can be broken down into many many simple steps that a GPU would do better’ where an NPU really makes sense, tho.

      source
  • CixoUwU@lemmy.cixoelectronic.pl 4 hours ago

    Yes, I agree, and if it must run a neural network it could do it on the GPU; an NPU is not necessary.

    source
    • Endmaker@ani.social 3 hours ago

      Someone with the expertise should correct me if I am wrong; it’s been 4-5 years since I learnt about NPUs during my internship so I am very rusty:

      You don’t even need a GPU if all you want to do is to run (i.e. perform inference using) a neural network (abbreviating it to NN). Just a CPU would do if the NN is sufficiently lightweight. The GPU is only needed to speed up the training of NNs.
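
      To illustrate that point (a minimal sketch with made-up layer sizes, not any particular framework’s code): inference for a small fully-connected layer is just multiply-accumulate loops, which plain CPU code handles fine.

      ```c
      #include <stddef.h>

      /* One dense layer with ReLU: out = max(0, W*x + b).
       * A small enough network runs comfortably on a CPU like this;
       * accelerators only start to matter as sizes and frame rates grow. */
      void dense_relu(float *out, const float *W, const float *x,
                      const float *b, size_t n_out, size_t n_in)
      {
          for (size_t o = 0; o < n_out; o++) {
              float acc = b[o];
              for (size_t i = 0; i < n_in; i++)
                  acc += W[o * n_in + i] * x[i];
              out[o] = acc > 0.0f ? acc : 0.0f;
          }
      }
      ```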

      The thing is, the CPU is a general-purpose processor, so it won’t be able to run the NN optimally / as efficiently as possible. Imagine you want to do something that requires the NN and, as a result, you can’t do anything else on your phone / laptop (it won’t be a problem for desktops with GPUs though).

      Where the NPU really shines is when there are performance constraints on the model: when it has to be fast, lightweight and memory efficient. Use cases include mobile computing and IoT.

      In fact, there’s news about live translation on Apple AirPods. I think this may be the perfect scenario for using NPUs - ideally housed within the earphones directly, but if not, within the phone.

      Disclaimer: I am only familiar with NPUs in the context of “old-school” convolutional neural networks (boy, tech moves so quickly). I am not familiar with NPUs for transformers - and LLMs by extension - but I won’t be surprised if NPUs have been adapted to work with them.

      source
      • rumba@lemmy.zip 50 minutes ago

        I’m not exactly an expert either, but I believe the NPUs we’re seeing in the wild here are more like efficiency cores for AI.

        Using the GPU would be faster, but have much larger energy consumption. They’re basically math coprocessors that are good at matrix calculations.

        source
  • TropicalDingdong@lemmy.world 7 hours ago

    Would he rip the CUDA cores out of his Nvidia GPU?

    source
  • cmnybo@discuss.tchncs.de 7 hours ago

    It would be nice if Frigate supported the NPU in AMD CPUs. Then it could be used for object detection with CCTV cameras.

    source
    • ch00f@lemmy.world 7 hours ago

      I love how Frigate is like…the only thing you can use AI processors for. I got a Coral M.2 card and used it for Frigate for like a year and then got bored. Sure, I can have it ID birds, but there’s not much more practical use that I found.

      source
      • cmnybo@discuss.tchncs.de 6 hours ago

        You can use object detection to send an alert if there is a person in your yard or if a car pulls into your driveway. It avoids the frequent false alerts you get with normal motion detection.

        source
  • twinnie@feddit.uk 5 hours ago

    I expect we’re eventually going to start seeing AI in more sensible places and these NPUs will prove useful. Hopefully soon the bubble will burst and we’ll stop seeing it crammed in everywhere, then it’ll start being used where it actually improves a product rather than wherever an LLM will fit.

    source
    • ohwhatfollyisman@lemmy.world 4 hours ago

      I’m upvoting this comment from my internet-enabled toaster.

      source
  • Dudewitbow@lemmy.zip 7 hours ago

    So, yes, it would be unrealistic to suggest AMD could rip out the NPU and easily put that space to good use in this current chip.

    A few AMD chips did have the NPU removed, but they were mostly allocated to handheld gaming devices - the Ryzen Z1E, for example.

    When the lineup was refreshed, there were a handful of Hawk Point CPUs that were NPU-less (the Ryzen 7 8745HS).

    Strix Point does not have that NPU-less die refresh yet.

    source
  • Gullible@sh.itjust.works 7 hours ago

    I was excited when I learned that a new business laptop had a removable battery, a decent graphics card, and 1 TB of storage standard. I planned to buy it used for a fraction of its current price, 2k USD new, once some doofus got bored of underusing their machine and decided to trade up. Then I saw the AI chip and my desire wavered. You think there will ever be workarounds to make use of this garbage? I really want that removable battery.

    source
    • Ashiette@lemmy.world 6 hours ago

      That NPU is a math coprocessor. It can be very useful. It’s like a CUDA core.

      source
  • BigTrout75@lemmy.world 7 hours ago

    Bring on the games that require AI

    source
    • Endmaker@ani.social 6 hours ago

      I’m pretty sure most video games require AI already. I have difficulty naming even one that doesn’t use AI.

      Neural nets, on the other hand - I find it hard to imagine one running locally without impacting performance.

      source
      • Mika@sopuli.xyz 3 hours ago

        Hard, but should definitely be possible. I’m waiting for someone to write a decent roguelike, like imagine the possibilities.

        source