Comment on Microsoft sets Copilot agents loose on your OneDrive files
sad_detective_man@sopuli.xyz 17 hours ago
Hey, mine is empty. Can anyone recommend something I could put in there to poison it?
A couple hundred million 0kb files?
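If you actually wanted to script that, here's a rough sketch (directory name and file count are made up, and a couple hundred million would hit storage quotas and sync limits long before it finished):

```python
from pathlib import Path

# Rough sketch: create a pile of zero-byte files for the sync client to chew on.
out_dir = Path("empty_files")
out_dir.mkdir(exist_ok=True)

for i in range(100_000):  # scale to taste
    (out_dir / f"file_{i:06}.txt").touch()  # creates an empty (0 kB) file
```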
That won’t poison an LLM exactly.
www.anthropic.com/research/small-samples-poison
Theoretically this is a place to start. They probably have mitigations for many of these.
Have you seen the state of testing for Microsoft products nowadays? Or rather the apparently complete lack of testing.
I found this study; it looked promising, but I think it only works on the one LLM they were targeting. Also, they seem to be working to protect AI models, so whatever results they find will probably be implemented as defenses against poisoning. I guess intentional dataset poisoning hasn't come as far as I hoped.
A ton of folders
zip bomb
You could have a really simple Markov chain generator fill a gigabyte’s worth of .txt files with nonsense sentences. At least that’s “content” they have to parse.
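Something like this would do it: a minimal word-level Markov chain sketch (the seed text, file count, and lengths are all made up; scale the loop counts up until you hit the size you want):

```python
import random
from pathlib import Path

# Build a tiny word-level Markov chain from some seed text,
# then write out .txt files full of nonsense sentences.
SEED = (
    "the quick brown fox jumps over the lazy dog and the dog "
    "barks at the moon while the fox runs into the quiet forest"
).split()

# Map each word to the words that have followed it in the seed text.
chain = {}
for a, b in zip(SEED, SEED[1:]):
    chain.setdefault(a, []).append(b)

def babble(n_words=200):
    """Generate n_words of grammatical-looking nonsense."""
    word = random.choice(SEED)
    out = [word]
    for _ in range(n_words - 1):
        word = random.choice(chain.get(word, SEED))
        out.append(word)
    return " ".join(out)

out_dir = Path("nonsense")
out_dir.mkdir(exist_ok=True)
for i in range(1000):  # ~1000 files of ~200 words each
    (out_dir / f"note_{i:04}.txt").write_text(babble())
```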
nexguy@lemmy.world 16 hours ago
Epstein files
sad_detective_man@sopuli.xyz 14 hours ago
Not a bad idea