The Risk of Too Much Access with Microsoft Copilot

Microsoft Copilot is AI designed to provide intelligence within apps like Microsoft Word, Excel, and PowerPoint.
May 25, 2024

When ChatGPT became widely available, it wasn’t long before security-conscious companies started clamping down on its use by employees. The helpfulness of the AI tool was outweighed, in their calculations, by the risk of sensitive data being shared with the chatbot, perhaps violating data security compliance requirements. Caution when using a publicly accessible large language model (LLM) to enhance productivity is, frankly, just common sense.

But what happens when the generative AI tools are built into the business software your company uses every day? Microsoft 365 Copilot has been integrated into apps like Word, Excel, and Teams, offering an unprecedented opportunity to boost productivity and creativity by automating necessary but tedious tasks. However, the access Copilot has to a company’s data is a potential source of significant problems if the organization hasn’t thought through the implications of how it interacts with and shares that information.

Understanding Microsoft 365 Copilot

What differentiates Microsoft Copilot AI from other AI tools like ChatGPT? It is connected not just to the apps an employee uses, but also to the data an organization stores in Microsoft 365. Imagine Copilot pulling together notes, email threads, and past PowerPoint presentations to draft a new document, or producing a summary of a Teams meeting complete with a list of action items, and you will start to get a sense of its potential uses. The process is simple: an employee types a prompt into Copilot, which gathers context from whatever Microsoft 365 content that employee’s permissions allow it to reach. The prompt and context are sent to an LLM, and Microsoft runs a responsible AI check before the output is returned to the user.

The catch here is that Microsoft Copilot doesn’t just access sensitive information—it can also generate sensitive information as part of its function. This makes overly permissive data access, which is a persistent problem in many organizations, more dangerous than it might be otherwise.

Understanding Possible Security Risks with Microsoft Copilot

Assigning appropriate permissions and security settings is (or should be) top of mind for any application a business uses. Overly restrictive settings hamper productivity, while a lack of limits leaves sensitive or proprietary information exposed. Finding the just-right balance that gives employees access to only the information they need to do their jobs effectively is the goal. In that sense, using Microsoft Copilot is no different from other types of business applications.

However, overly permissive data access is a bigger issue for Microsoft Copilot for a few key reasons. First, when given a prompt, Copilot will draw on all the data the employee can access, including sensitive data they should never have had access to in the first place. A human might never stumble on that information because they don’t think to look where they don’t need to, but Copilot will find it. Second, Copilot’s output does not inherit the sensitivity labels applied to its source material. That puts the onus on the employee to check the work and ensure that the data in it is appropriately shared or kept private. To put it mildly, relying on individual employees throughout an organization to perfectly maintain data security in this situation is a recipe for disaster.
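For teams that want to see where that exposure might come from before rolling out Copilot, the kind of audit involved can be illustrated with a short script. The sketch below is purely illustrative: it assumes an Azure app registration with read access to Microsoft Graph, the tenant, client, and drive identifiers are placeholders, and a real review would cover far more than one drive’s top-level files. It simply flags files shared through tenant-wide or anonymous links, the sort of broad access Copilot will happily surface on request.

```python
# Illustrative sketch: flag files in a OneDrive/SharePoint drive that are
# shared via tenant-wide ("organization") or anonymous links, using Microsoft Graph.
# TENANT_ID, CLIENT_ID, CLIENT_SECRET, and DRIVE_ID are placeholders you would supply.
import requests
import msal

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-registration-id>"
CLIENT_SECRET = "<your-client-secret>"
DRIVE_ID = "<target-drive-id>"

GRAPH = "https://graph.microsoft.com/v1.0"

# Acquire an app-only token (the app registration needs consented
# read permissions such as Files.Read.All / Sites.Read.All).
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# List top-level items in the drive (kept shallow for brevity), then
# inspect each item's sharing permissions.
items = requests.get(f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=headers).json()
for item in items.get("value", []):
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions", headers=headers
    ).json()
    for perm in perms.get("value", []):
        link = perm.get("link", {})
        # "organization" links are visible to everyone in the tenant, and
        # "anonymous" links to anyone with the URL: exactly the kind of
        # broad access a Copilot prompt can pull into generated content.
        if link.get("scope") in ("organization", "anonymous"):
            print(f"Broadly shared: {item['name']} ({link.get('scope')} link)")
```

In practice, tools such as Microsoft Purview and a structured permissions review do this work at scale; the point is simply that broad sharing links are discoverable, and they should be found and tightened before Copilot finds them.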

It isn’t difficult to imagine nightmare scenarios where confidential employee information makes it into a widely shared HR report, or unreleased financial figures become unintentionally public in a quarterly report. How can you ensure that confidential data remains safe while still gaining the competitive advantage that Microsoft Copilot can lend to your teams? Right Click has the answer.

Your 24/7 IT and Cybersecurity Experts

From its founding, Orange County–based Right Click, Inc. has been focused on ensuring that businesses have safe access to the technology that will allow them to be competitive. Our expertise in cybersecurity and compliance, as well as our deep experience with Microsoft’s products through our 25+ years as a Microsoft partner, can help your organization mitigate data risk and properly configure your permissions to use Microsoft Copilot safely and effectively. To find out more, contact us here today.

For all business-related IT and cybersecurity emergencies, get help HERE

YOUR BUSINESS IS OUR PRIORITY!
