Ask Microsoft: Are you using our personal data to train AI?
We had four lawyers, three privacy experts, and two campaigners look at Microsoft's new Service Agreement, and none of our experts could tell if Microsoft plans on using your personal data – including audio, video, chat, and attachments from 130 products, including Office, Skype, Teams, and Xbox – to train its AI models.
If nine experts in privacy can't understand what Microsoft does with your data, what chance does the average person have? That's why we're asking Microsoft to say whether it's going to use our personal data to train its AI.
Hot tip: If you're switching to Linux and you're not sure how to do something - ask your favourite LLM AI chatbot for help.
There's typically some terminal command or config file or something that you can do to get what you want, and I'm sure it all makes sense to an experienced Linux person, but it's not easy to guess what to do as a novice. But since all the commands and such are well documented, you can get pretty good advice from the AI. As usual, it won't be completely reliable - but you can think of it as a bit like asking a tech expert for help over the phone. They know a lot and can help you - but they can't see exactly what's on your screen, and they may 'misremember' some details from time to time. So it isn't perfect, but it's certainly good enough to find what you're looking for.
(Or you can just ask a real person. Those are pretty helpful too.)
Homie, appreciate you typing all that, but I'm a full-time Linux engineer. If anything, stuff I have worked on and written has become part of the AI landscape.
Fair enough. I didn't really mean to direct what I was saying at you specifically; I was just kind of continuing the conversation.
And yeah, the only reason the AI stuff works at all is because people have taken the time to write down good advice in the past - which has then, unforeseeably, been used as AI training data (without consultation or compensation...). So yeah, I wouldn't be surprised if your work was in there somewhere.