
Microsoft Copilot has been banned for use by US House staff members, at least for now

270 likes

Submitted 1 year ago by throws_lemy@lemmy.nz to technology@lemmy.zip

https://www.neowin.net/news/microsoft-copilot-has-been-banned-for-use-by-us-house-staff-members-at-least-for-now/


Comments

Comments sorted by: hot
  • NounsAndWords@lemmy.world 1 year ago

    Until we either solve the problem of LLMs providing false information or the problem of people being too lazy to fact check their work, this is probably the correct course of action.

    • Limeey@lemmy.world 1 year ago

      I can’t imagine using any LLM for anything factual. It’s useful for generating boilerplate, and that’s basically it. Any time I try to get it to find errors in what I’ve written (either communication or code), it’s nearly worthless.

      • Eyck_of_denesle@lemmy.zip 1 year ago

        My little brother was using GPT for homework. He asked it the probability of an extra Sunday in a leap year (52 weeks + 2 days), and it said 3/8. One of the possible outcomes it listed was fkng Sunday, Sunday. I asked how two Sundays can come consecutively and it made up a whole bunch of bs. The answer is simply 2/7. The sources it listed even had the correct answer.
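The counting argument behind that 2/7 can be checked in a few lines of Python (the day labels are just for illustration):

```python
# A leap year has 366 days: 52 full weeks plus 2 consecutive extra days.
# The pair of extra days is determined by the year's starting weekday,
# so there are 7 equally likely outcomes, not 8 -- and two consecutive
# days can never both be Sunday.
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
pairs = [(days[i], days[(i + 1) % 7]) for i in range(7)]

favorable = [p for p in pairs if "Sun" in p]
print(favorable)                        # [('Sat', 'Sun'), ('Sun', 'Mon')]
print(len(favorable), "/", len(pairs))  # 2 / 7
```

Since each of the 7 starting weekdays is equally likely, the probability is 2/7; "Sunday, Sunday" simply isn't a reachable outcome.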

      • QuaternionsRock@lemmy.world 1 year ago

        Really? It spotted a missing push_back like 600 lines deep for me a few days ago. I’ve also had good success at getting it to spot missing semicolons that C++ compilers can’t because C++ is a stupid language.

      • Wizard_Pope@lemmy.world 1 year ago

        I find it useful for quickly reformatting small tables of sample data for my reports. It’s often far simpler and quicker to drop the data in there and say what to do than to write a short Python script.
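For comparison, the kind of short Python script being weighed against here might look like this (the sample data and column names are invented for illustration):

```python
import csv
import io

# Hypothetical raw measurements, whitespace-separated, as they might be
# copied out of an instrument log or lab notebook.
raw = """\
sample1 12.3 0.4
sample2 11.8 0.6
sample3 13.1 0.3
"""

# Reformat into a CSV table with a header row, ready to paste into a report.
rows = [line.split() for line in raw.strip().splitlines()]
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["sample", "mean", "stddev"])
writer.writerows(rows)
print(out.getvalue())
```

For a one-off table this is already more ceremony than pasting the raw text into a chat window, which is the commenter's point; the script only pays off once the same reformatting recurs.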

    • TrickDacy@lemmy.world 1 year ago

      Imo human laziness is the issue. In every thread where a lot of people chime in about AI, so many talk about how it’s useless because it’s sometimes wrong. It’s basically like people who use Wikipedia but can’t be bothered to cross-reference… except lazier. They literally expect a machine to be flawless because it seems confident or something?

      • Sylvartas@lemmy.world 1 year ago

        I think you’re missing the point. I don’t like Copilot/ChatGPT for important stuff because if I have to double-check their solutions, I’ve barely gained any time. And precisely because it’s correct more often than not, it will make me complacent over time (the professors who were patient enough to actually explain why we shouldn’t use Wikipedia as a primary source made the same point, which I thought made a lot of sense).

  • Kit@lemmy.blahaj.zone 1 year ago

    I’m lead 365 admin for a major corporation and have been working with MS to identify if Copilot would be beneficial and secure for my org. Some major takeaways from my recent meetings with them:

    There are two parts to Copilot: 1. Copilot, and 2. Copilot for 365.

    The first is basically ChatGPT. It reaches out to the web for information and essentially works as a search engine.

    The 2nd part is internal only. It can do things like summarize meetings, compare documents, and search your emails. It abides by the same security, compliance, encryption, and DLP policies as the rest of your tenant.

    You can open up access to one or both.

    Government tenants are a unique case. There’s a specific 365 license for government entities, and their offerings are different from other organizations. This news article isn’t surprising - all new 365 offerings take a while before they’re available to government licenses. It will eventually be available.

    • thisisnotgoingwell@programming.dev 1 year ago

      A few questions about that: unless they’re literally taking their model and putting it on your own box, using your own compute power, I don’t see how that’s possible. They can call it “your” Copilot all they want, but they’re still reading your data and prompts and doing the computation on their own hardware.

      • Kit@lemmy.blahaj.zone 1 year ago

        Major organizations use encryption where they hold the keys, so Microsoft is unable to read their data. They can have thousands of servers running on Microsoft’s Azure stack, and yet Microsoft cannot read the data being processed.

      • BurningRiver@beehaw.org 1 year ago

        I’m not an admin, but I do provision MS cloud licensing and have run across this question more than a few times. At the enterprise level, I’m told the Copilot data is “walled off”, secure, and not harvested by MS. I have nothing to back that up, but that’s what I’m told. I’m certain that if it weren’t true, I would have heard about it by now.
