
Microsoft launches GPT-4 powered Security Copilot

  • by Yadullah Abidi
  • 3 min read

Microsoft has followed up its Copilot for Office with another GPT-4 powered tool, Security Copilot, meant to help security professionals identify breaches and analyse files and data. Besides GPT-4, Security Copilot is also powered by Microsoft’s custom security-specific model, which draws on the 65 trillion daily signals Microsoft collects through its threat intelligence analysis and other security-specific sources.

The tool is rolling out to a “few customers” starting March 28. Microsoft hasn’t announced a specific date for public access yet, and given the sensitive nature of the tool, the company seems to be taking a cautious approach to a wider release.

Security Copilot can analyse incidents with just a prompt. | Source: Microsoft

Security Copilot looks a lot like ChatGPT, with a simple prompt box where you can type in your questions. Importantly, Security Copilot is meant to assist a security professional’s work rather than replace it altogether. For that reason, the tool also has a pinboard section where teams can share information and work collaboratively. The aim is to simplify the complicated process of security analysis, catch what a human investigator might miss, and address the talent gap that often limits a team’s capacity.

In addition to accepting natural language prompts, the tool can also take in files, URLs or code snippets for analysis, and can even pull incident and alert information from other security tools. Additionally, all prompts and analyses are automatically saved, giving investigators a full audit trail.

Another feature is the prompt book: a set of steps or automated processes that investigators can put together ahead of time and run with a single click. Finally, once the investigation and analysis are done, Security Copilot can even put together a PowerPoint slideshow outlining incidents, attack vectors and any other information you need.

That said, while the interface might look like ChatGPT or Microsoft’s Bing Chat, Security Copilot is limited to security-related questions only. Microsoft is also taking common chatbot problems such as inaccuracy and hallucinations more seriously with Security Copilot.

Security reporting is made easier with Security Copilot. | Source: Microsoft

For starters, the feedback system is far more advanced than the simple thumbs up or down you see on Bing Chat. Instead, Microsoft asks users exactly what went wrong to get a better picture of any hallucinations the tool might produce.

Alongside Bing Chat, Copilot for Office and GitHub Copilot from Microsoft-owned GitHub, Security Copilot is part of Microsoft’s massive push to integrate AI into Windows and its wider product lineup. There are no signs of slowing down either, meaning we could see AI-assisted features across even more of the company’s services and products.


Yadullah Abidi

Yadullah is a Computer Science graduate who writes/edits/shoots/codes all things cybersecurity, gaming, and tech hardware. When he's not, he streams himself racing virtual cars. He's been writing and reporting on tech and cybersecurity with websites like Candid.Technology and MakeUseOf since 2018. You can contact him here: yadullahabidi@pm.me.
