Morten Yndesdal

How to prepare for Copilot

Microsoft's AI assistant Copilot will soon make artificial intelligence available to Microsoft 365's large user base. The potential productivity gains are great. But beware: without control over usage and over your organization's data, there are many pitfalls.

After working with productivity tools and the modern workplace for many years, I've learned that one thing is crucial: control over your organization's data. I dare say I could look at any business in the Nordics and point out issues that could result in a privacy-related fine. And as with any new process, you need a goal, a strategy, and a plan for getting to where you want to be.

Is your data ready for Copilot? Learn more about Innofactor's Microsoft Copilot Readiness Kit!


Is this a crisis? Well, as mentioned, your organization could receive a fine, which is grim enough, but it gets much worse if data you believed to be safe and secure is shared with unauthorized individuals. The loss of reputation in the market and the associated financial hit can often be even greater. With Copilot, it's crucial that organizations have a plan for how they use the AI assistant and a firm grip on the data it will draw on.

Microsoft 365 Copilot is a digital assistant powered by artificial intelligence that aims to provide personalized assistance for a variety of tasks and activities. Copilot not only brings the functionality familiar from ChatGPT to Microsoft 365; it combines the power of large language models with your data in Microsoft Graph (calendar, email, chats, documents, meetings, etc.) and the Microsoft 365 apps (Word, Excel, PowerPoint, Outlook, and Teams) to offer significant potential productivity improvements. Thinking that the risk of adopting it is too great is like burying your head in the sand and leaving the productivity gains to your competitors.

"Thinking that the risk to adopt Microsoft 365 Copilot is too great is like burying your head in the sand and leaving productivity gains to your competitors."

Control before you jump on the bandwagon

So, what do you need to consider before unleashing AI’s power on your organization's data?

Firstly, you need to have your data in a place where it’s both accessible and controlled. It goes without saying that not all data is meant for everyone. If you're creating a report on the company's largest customers, it's crucial that the public version doesn't include sensitive company data. On the other hand, internal reports on the same topic would be less useful if such data is excluded.
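The split between a public and an internal version of the same report can be sketched in a few lines. Everything here is illustrative: the field names and the `SENSITIVE_FIELDS` set are assumptions for the example, not part of any Microsoft 365 API.

```python
# Illustrative sketch: producing a public version of a customer report by
# stripping fields classified as sensitive, while the internal version keeps
# them. The classification set below is an assumption for this example.

SENSITIVE_FIELDS = {"revenue", "contract_value", "contact_email"}

def redact_for_public(record: dict) -> dict:
    """Return a copy of the record without fields marked as sensitive."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

customer = {
    "name": "Contoso Ltd",
    "industry": "Manufacturing",
    "revenue": 12_500_000,
    "contact_email": "cfo@contoso.example",
}

public_view = redact_for_public(customer)   # safe to publish
internal_view = customer                    # full data for internal reports
```

The point is not the code but the discipline: the decision about which fields are sensitive has to be made somewhere explicit before any tool, AI-powered or not, is allowed to assemble reports from the data.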

And as soon as personal data is involved, the picture becomes even more complicated. A manager may, for example, want to include sensitive data in a particular assessment, but such data must by no means be accessible to everyone. Data classification is therefore a clear necessity, and data security is equally important. Both are smart regardless of artificial intelligence, but they become extremely relevant with Copilot. Leaders in every organization should be concerned about data control and ask questions about it before adopting Copilot.
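At its core, data classification means tagging content by what it contains before an assistant is allowed to surface it. The sketch below is a toy illustration, not Microsoft Purview: the label names and the regular expressions are assumptions, and a real deployment would rely on the platform's sensitivity labels rather than ad-hoc pattern matching.

```python
import re

# Toy classification sketch (assumed labels and patterns, for illustration
# only): tag a piece of text as "confidential" if it appears to contain
# personal data, otherwise as "internal".

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "national_id": re.compile(r"\b\d{11}\b"),  # 11-digit ID, an assumed format
}

def classify(text: str) -> str:
    """Return 'confidential' if the text matches a personal-data pattern."""
    for _name, pattern in PATTERNS.items():
        if pattern.search(text):
            return "confidential"
    return "internal"

classify("Quarterly summary for the board")         # -> 'internal'
classify("Contact Ola at ola.nordmann@example.no")  # -> 'confidential'
```

Even this crude version makes the governance question concrete: who defines the labels, who maintains the patterns, and what happens to content once it is tagged.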

Copilot is suitable for all organizations

If you have, or are well on your way to having, full control over your data, you can tackle the next challenge: Copilot is incredibly smart, but it must be used correctly.

You need to know how to use it and how to ask questions to get something useful out of it. Yes, some things come "right out of the box" if the data is in place, but without proper training and understanding, you won't get far. There are no magic buttons to press, although Copilot will partly point out "where the buttons are" in the form of common queries.

Information architects who understand data security, data volumes, documents, and data collection must be involved. In addition, HR and the people responsible for training and for how AI should be used should be brought in early and understand their roles and responsibilities. Guidelines must be established, and the use of AI must be aligned with ethics, law, and privacy.

Finally, you should apply a healthy dose of critical thinking. "You should not blindly trust and believe everything you hear," we often tell children, and this can be introduced into the company's standard training too. Copilot can help you quickly obtain and present information, but what the content actually is, where it comes from, how sensitive it is, etc., must be decided by the user. There's a reason it's not called Autopilot!

I'm confident that Copilot is something all businesses can benefit from to gain productivity on the many tasks that currently consume our time. But starting an AI project with a 12-month perspective is like throwing money out the window: developments happen so quickly that you need to work faster and smarter. Choose tasks and areas for a proof of concept, figure out how you'll implement them, and build expertise.

But the starting point is to ask questions about your organization's data. Do we have control?


Morten Yndesdal

Principal Consultant at Innofactor