The rapid adoption of AI tools has created a new category of risk: shadow AI. Employees frustrated by organisational restrictions on AI tools often resort to copying sensitive information into personal ChatGPT accounts or other unauthorised platforms.
“We saw when AI first came out, some companies just immediately put the doors down and said no, you cannot use it,” noted Wylie. “There wasn’t really an acknowledgement that these tools are really useful, but you need to understand the risks they bring.”
The solution isn’t blanket prohibition. Instead, organisations should offer approved AI tools with proper data controls, such as Microsoft Copilot configured to use only organisational data, alongside technical safeguards like data loss prevention (DLP) software that monitors for sensitive information leaving the organisation.
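To illustrate the kind of check a DLP layer performs, the sketch below shows a minimal, hypothetical pattern scan on text before it is sent to an external AI service. The rule names, patterns, and `scan_outbound` function are illustrative assumptions, not the workings of any particular product; real DLP tools use far richer detection (document fingerprinting, exact data matching, machine-learning classifiers) across email, web traffic, and endpoints.

```python
import re

# Hypothetical example rules a DLP policy might include; real products ship
# much broader and more accurate detection than these simple regexes.
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "uk_national_insurance": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b", re.I),
    "api_key": re.compile(r"\b(?:sk|pk)_[A-Za-z0-9_]{20,}\b"),
}


def scan_outbound(text: str) -> list[tuple[str, str]]:
    """Return (rule_name, matched_text) pairs for anything that looks sensitive."""
    findings = []
    for rule, pattern in SENSITIVE_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((rule, match.group()))
    return findings


if __name__ == "__main__":
    prompt = "Summarise: customer card 4111 1111 1111 1111, key sk_live_abcdefghijklmnopqrstuv"
    for rule, hit in scan_outbound(prompt):
        print(f"BLOCKED: {rule} detected ({hit!r}) before the text left the organisation")
```

Pattern matching of this kind is only one layer; the broader point is that monitoring and approved, data-controlled tools together address the risk that outright bans simply push underground.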