August 23rd, 2024

Companies ground Microsoft Copilot over data governance concerns

Many enterprises are pausing Microsoft Copilot implementations due to data governance concerns, with half of surveyed chief data officers restricting use over security issues and complex data access permissions.


Concerns over data governance are leading many large enterprises to pause or restrict the use of Microsoft Copilot tools. Jack Berkowitz, chief data officer of Securiti, reported that about half of the 20-plus chief data officers he surveyed have grounded their Copilot implementations due to security and oversight issues. While Microsoft markets Copilot as a productivity enhancer, the rapid deployment of generative AI has outpaced the establishment of necessary safety protocols. Companies face challenges with complex data access permissions, particularly in environments like SharePoint and Office 365, where sensitive information could be inadvertently summarized and exposed by the AI. Berkowitz emphasized that achieving effective use of Copilot requires clean data and robust security measures, which many organizations lack due to historical data management practices. He noted that while some generative AI applications in customer service have shown positive returns, the overall sentiment among corporate clients is one of caution. To successfully integrate AI tools, companies need to enhance their data governance and observability, ensuring that they understand their data assets and access rights.

- Many enterprises are pausing Microsoft Copilot implementations due to data governance concerns.

- About half of surveyed chief data officers have restricted Copilot use.

- Security issues arise from complex data access permissions in existing systems.

- Effective use of AI tools requires clean data and robust security measures.

- Companies need improved data governance and observability for successful AI integration.

10 comments
By @snowwrestler - 8 months
The data governance concerns are not that Microsoft has access to the data (that is no doubt covered by contract clauses).

The concerns are that most corporate implementations of network roles and permissions are not up to date or accurate, so CoPilot will show data to an employee that they should not be allowed to see. Salary info is an example.

Basically, CoPilot is following “the rules” (technical settings) but corporate IT teams have not kept the technical rules up to date with the business rules. So they need to pause CoPilot until they get their own rules straight.
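The gap the commenter describes can be sketched as an audit: compare what the technical ACLs grant against what the business rules allow, and surface the drift before turning an assistant loose on the index. This is a minimal illustration with made-up names, not a real Microsoft API.

```python
# Hypothetical sketch: audit "technical rules" (ACL entries) against
# "business rules" (policy) to find permissions drift. All document and
# group names are illustrative.

ACL = {  # who the system currently lets read each document
    "salary_review.xlsx": {"hr_team", "all_staff"},  # stale over-grant
    "roadmap.docx": {"product_team"},
}

POLICY = {  # who the business says should read each document
    "salary_review.xlsx": {"hr_team"},
    "roadmap.docx": {"product_team"},
}

def find_overgrants(acl, policy):
    """Return (document, extra_groups) pairs where the ACL grants
    broader access than the business policy allows."""
    drift = []
    for doc, allowed in acl.items():
        extra = allowed - policy.get(doc, set())
        if extra:
            drift.append((doc, extra))
    return drift

print(find_overgrants(ACL, POLICY))
```

Any document flagged here is exactly the kind of thing an assistant would happily summarize for the wrong employee.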

Edit to add: if your employer has CoPilot turned on, maybe try asking for sensitive stuff and see what you get. ;-)

By @dopylitty - 8 months
It would help if people stopped calling it "AI" and called it what it is: enterprise document search, a thing that has existed forever and of which these new tools are just a poorly controllable version.

Would you enable a search indexer on all your corporate data that doesn't have any way to control which documents are returned to which users? Probably not.

It's a known issue with SharePoint going back years and has various solutions[0] such as document level access controls or disabling indexing of content.
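The control being described is what search products usually call security trimming: the index only returns results the querying user is allowed to read. A toy sketch of the idea, with invented names (real SharePoint implements this via per-item permissions and crawl rules):

```python
# Minimal security-trimming sketch: filter search results by the
# caller's group membership. Names and data are illustrative.

from dataclasses import dataclass

@dataclass
class Doc:
    name: str
    text: str
    readers: frozenset  # groups allowed to read this document

INDEX = [
    Doc("offer_letter.pdf", "salary offer details", frozenset({"hr"})),
    Doc("handbook.pdf", "holiday and salary policy", frozenset({"hr", "staff"})),
]

def search(query, user_groups):
    """Return names of matching docs the caller may read."""
    return [d.name for d in INDEX
            if query in d.text and d.readers & user_groups]

print(search("salary", {"staff"}))  # the offer letter is trimmed out
print(search("salary", {"hr"}))
```

A retrieval layer that skips the `readers` check is exactly the "uncontrolled indexer" scenario above, whatever branding sits on top of it.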

If we called it what it is though the C-levels probably wouldn't even care about it. They never cared about enterprise document search before and certainly didn't "pivot" to enterprise document search or report on the progress of enterprise document search implementation to the board.

0: https://sharepointmaven.com/3-ways-prevent-documents-appeari...

By @nerdjon - 8 months
Good, this was a poorly conceived idea to build into an OS and turn on by default.

It is the same problem with a lot of the AI tools right now: using them for your code, looking at your documents, etc. Unless you self-host it or use a 'private' service from Azure or AWS (which they say is safe...), who knows where this information is ending up.

This is a major leak waiting to happen. It scares me to think what kind of data has been fed into ChatGPT or some code tool that is just sitting somewhere in a log or something plaintext that could be found later.

By @bongodongobob - 8 months
Hey guys, this isn't the built in copilot that was scrapped. This is about corporate copilots in M365. The problem is that some companies have bad data hygiene and don't control who can access what. Copilot is working as intended but some teams have done a poor job of controlling permissions for data.
By @throwaway22032 - 8 months
If you upload your data to a third party without first encrypting it with a key known only to you it is no longer yours.

Everything else is just wishful thinking. Like trying to keep a secret whilst only telling one or two friends.

By @moron4hire - 8 months
Latest Visual Studio update has made CoPilot much more prominent. I can't tell if it's actually running or not. I can't figure out how to turn it off completely.

LLM-based AI is technically banned at my work. For somewhat good reason: most of our work involves confidential, controlled, or classified data. Though I've seen a lot of people, especially the juniors, still using ChatGPT every day.

Also noticed the UI has gotten a lot slower. I'm guessing the two things are related.

If my company wasn't locked into "Microsoft everything" this would push me the last inch to ditch VS completely. I already did at home.

By @TiredOfLife - 8 months
It's amazing how you can tell which publication this is, just by looking at title. No need even to look for the domain in brackets.
By @ActionHank - 8 months
At this point we should keep the idea of companies as legal persons, but introduce a social credit system that grants tax deductions when your products actually improve people's lives rather than stealing from them.

Want to build AI tooling that leverages user data? Great!

- Does it gather their data for targeted ads? Neutral.
- Does it gather their data to resell to others? -100 points; pay more tax, you're rent seeking.
- Does it help the user not get phished? +100 points; you're actually offering something of value.

I don't believe having humanoid robots in factories helps or is nearly as profitable as humanoid robots that will do my laundry for me.