Below are some resources that are helpful for ramping up on and using Security Copilot.
Watch this Microsoft Mechanics video that goes over the product end-to-end and provides insights into how it works.
- Link: https://youtu.be/0lg_derTkaM?si=QR2kCOPO2_LVa9v0
- Playlist: https://www.youtube.com/playlist?list=PL3ZTgFEc7LyuQRLD61q9YqPKEDlZj4j5u
Learn how to get started with and use Security Copilot. Understanding how to prompt effectively is paramount when using Security Copilot.
Below are some projects that Security Copilot experts have created.
- Link: https://github.com/Yaniv-Shasha/SecurityCopilot (Author: Yaniv Shasha)
- Link: https://github.com/paulgoodsonMSFT/SecurityCopilotPlugins (Author: Paul Goodson)
- Link: https://github.com/rod-trent/Security-Copilot (Author: Rod Trent)
Below are some technical docs that are helpful when starting with Security Copilot.
The Microsoft Cybersecurity Reference Architectures (MCRA) are the component of Microsoft's Security Adoption Framework (SAF) that describes Microsoft's cybersecurity capabilities and technologies.
- Link: https://aka.ms/mcra
As you consider and evaluate AI-enabled integration, it's critical to understand the shared responsibility model: which tasks the AI platform or application provider handles, and which tasks you handle.
Below are some blogs from experts that go deeper into AI and Security Copilot.
- Link: https://applied-gai-in-security.ghost.io/ (Author: Brandon Dixon)
- Link: https://blog.openthreatresearch.com/demystifying-generative-ai-a-security-researchers-notes/ (Author: Roberto Rodriguez)
AI tokens can be thought of as pieces of words. When using AI models like GPT-3, the input text is broken down into these tokens before processing. These tokens are not necessarily split exactly where words start or end; they can include trailing spaces and even sub-words. Here are some helpful rules of thumb for understanding tokens:
- 1 token is approximately equal to 4 characters in English.
- 1 token is roughly equivalent to ¾ of a word.
- 100 tokens correspond to about 75 words or 1-2 sentences.
- 1 paragraph is approximately 100 tokens.
- 1,500 words translate to around 2,048 tokens.
Use tools like OpenAI's Tokenizer to get exact token counts: https://platform.openai.com/tokenizer
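The rules of thumb above can be sketched as a quick back-of-the-envelope estimator. This is a minimal illustration of the approximate ratios listed here, not a real tokenizer; function names are invented for this example, and exact counts require a tool like OpenAI's Tokenizer.

```python
# Rough token-count estimates based on the rules of thumb above.
# These ratios are approximations for English text only; a real
# tokenizer will split on sub-words and spaces and give exact counts.

def estimate_tokens_from_chars(text: str) -> int:
    """~1 token per 4 characters of English text."""
    return max(1, round(len(text) / 4))

def estimate_tokens_from_words(word_count: int) -> int:
    """~100 tokens per 75 words (1 token is roughly 3/4 of a word)."""
    return round(word_count * 100 / 75)

# A 75-word chunk is about 100 tokens by this rule of thumb.
print(estimate_tokens_from_words(75))
# A 1,500-word document comes out to roughly 2,000 tokens here,
# in the same ballpark as the ~2,048 figure above.
print(estimate_tokens_from_words(1500))
```

Estimates like these are useful for budgeting prompt size before pasting text into a model; switch to a real tokenizer when you need to stay under a hard token limit.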