So, Microsoft is silently installing Copilot onto Windows Server 2022 systems and this is a disaster.
How can you push a tool that siphons data to a third party onto a security-critical system?
What privileges does it have upon install? Who thought this was a good idea? And most importantly, who needs this?
#infosec #security #openai #microsoft #windowsserver #copilot
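For anyone who wants to check whether this has already landed on their own servers, here's a minimal sketch in Python (run elevated on the box). The package-name wildcard and the `TurnOffWindowsCopilot` policy value are based on public reporting and the documented "Turn off Windows Copilot" group policy, so treat them as assumptions and widen the pattern if Microsoft renames things:

```python
import subprocess
import winreg

# Wildcard for the Copilot AppX package reportedly pushed via an Edge update;
# the exact name is an assumption, adjust if it doesn't match on your build.
PACKAGE_PATTERN = "*Microsoft.Windows.Ai.Copilot*"
POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsCopilot"


def copilot_packages():
    """Ask PowerShell whether any matching AppX package is installed for any user."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         f"Get-AppxPackage -AllUsers -Name '{PACKAGE_PATTERN}' | "
         "Select-Object -ExpandProperty PackageFullName"],
        capture_output=True, text=True, check=False)
    return [line for line in result.stdout.splitlines() if line.strip()]


def copilot_policy_set():
    """Check whether the 'Turn off Windows Copilot' policy value is present and enabled."""
    for hive in (winreg.HKEY_LOCAL_MACHINE, winreg.HKEY_CURRENT_USER):
        try:
            with winreg.OpenKey(hive, POLICY_KEY) as key:
                value, _ = winreg.QueryValueEx(key, "TurnOffWindowsCopilot")
                if value == 1:
                    return True
        except OSError:
            pass
    return False


if __name__ == "__main__":
    print("Copilot packages:", copilot_packages() or "none found")
    print("TurnOffWindowsCopilot policy enabled:", copilot_policy_set())
```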
No enterprise is going to want to deal with that and realistically they’re the only ones with the pockets to fight that battle.
If introducing Copilot to a server degrades service enough to trigger an SLA violation downstream, you can absolutely bet lawyers will get involved.
Or if CoPilot starts exfiltrating data to Microsoft so their server farms can ‘analyze’ it.
I’m not heavily involved in the space, but I’m given to understand that MS isn’t very clear about what happens to your data or how it gets used or shared.
Perhaps Microsoft will be smart enough not to allow the general public to query trade secrets or government data that’s been pulled via unwanted copilot integration.
But maybe the ongoing Russian hack of Microsoft will make it irrelevant, because the servers can be accessed directly.
Or perhaps at some distant time, Microsoft will roll out features or technologies developed using an internal version of CoPilot that has access to all data - including proprietary information from competitors.
And that’s not even counting what ISPs will do if they find a way to analyze Copilot traffic, or what state actors will do if they can set up MitM attacks for Copilot.
Honestly, I sort of fear the repercussions, but I look forward to the lawsuits.
I thought the Microsoft technologies designed to allow anyone to access your servers were called Exchange and Active Directory.
Exchange allows users to access data and Microsoft services, and it comes with good documentation and a whole slew of controls for org admins.
Active Directory provides authentication services. It’s mostly for your internal users (so they can access org services, including Exchange), but it’s very common to allow guests and to federate under certain circumstances, so your AD talks to their AD and external guests can authenticate and use resources that have been shared with them.
It is also well-documented, with tight control in the hands of administrators.
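To make the "tight control" point concrete: if your directory is synced up to Entra ID (Azure AD), an admin can enumerate exactly which external guests exist with a single Graph call. A minimal sketch in Python, assuming you already have an access token with `User.Read.All` (the `ConsistencyLevel` header is there because some directory properties, `userType` among them, only filter as "advanced queries"):

```python
import requests

# Assumption: token obtained out of band (MSAL, az cli, etc.) with User.Read.All.
TOKEN = "<access token with User.Read.All>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/users",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "ConsistencyLevel": "eventual",  # needed for advanced-query filters
    },
    params={
        "$filter": "userType eq 'Guest'",
        "$count": "true",
        "$select": "displayName,mail,createdDateTime",
    },
    timeout=30,
)
resp.raise_for_status()

# Print every guest account the tenant currently trusts.
for guest in resp.json().get("value", []):
    print(guest.get("displayName"), guest.get("mail"), guest.get("createdDateTime"))
```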
Copilot is a black box. Their terms of service are vague. Microsoft’s responsible AI website is marketing speak with no details, and the standards guide on the site is mostly questions that amount to “TBD”. Administrative ability to control data sharing is non-existent, not yet developed, or minimal.
We don’t know the scope of data gathered, the retention and handling policies, or where that data/any models built from that data are going to wind up.
My read is that they’re waiting to be sued or legislated before they impose any limits on themselves.
Please explain more.
I’m not confident they are authoritative on the matter.
Don’t consider me to be, either, but I have more details in my response to them.
https://lemmy.sdf.org/comment/10635782
I read their message as a joke that AD was an easy vector for an adversary
Yeah… I realized that like an hour later, and couldn’t figure out how to respond appropriately. Then I forgot all about it because ADHD.
But yeah. I definitely got whooshed here.
In my defense, I guess I wasn’t expecting to see a joke in the thread, so … well, I didn’t see one.
Usually those are the ones in use at all those companies and organizations that end up having their files encrypted by malware.
Yes, that’s because pretty much all companies use AD, and Exchange is also popular (though less so now with Exchange Online).
Both are also extremely valuable for companies and thus attackers.
Ransomware attacks pretty much always rely on misconfiguration and/or social engineering.
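To give one concrete example of the kind of misconfiguration attackers sweep AD for: accounts with Kerberos pre-authentication disabled (AS-REP roastable). A minimal audit sketch using the third-party ldap3 package; the DC hostname, domain, and credentials below are placeholders:

```python
# pip install ldap3
from ldap3 import Server, Connection, NTLM, SUBTREE

server = Server("ldap://dc01.example.local")  # placeholder domain controller
conn = Connection(server, user="EXAMPLE\\auditor", password="change-me",
                  authentication=NTLM, auto_bind=True)

# userAccountControl bit 0x400000 = "Do not require Kerberos preauthentication",
# a classic misconfiguration that makes an account AS-REP roastable.
conn.search(
    search_base="DC=example,DC=local",
    search_filter="(&(objectCategory=person)(objectClass=user)"
                  "(userAccountControl:1.2.840.113556.1.4.803:=4194304))",
    search_scope=SUBTREE,
    attributes=["sAMAccountName"],
)

for entry in conn.entries:
    print("Pre-auth disabled:", entry.sAMAccountName)
```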
Correlation != Causation.
Now, on the other hand, the number of breaches has gone way up recently. Microsoft has pushed AD and Exchange into the cloud, and they just had several incidents where keys were stolen and passwords were left in the clear for months after they were notified, as well…
Well we have no solid evidence but it’s certainly within the realm of possibility.
There’s no need to degrade performance to get a lawsuit; the simple fact of extracting data can get you into court, especially with customers that have high privacy requirements or European sovereign cloud certifications.