Installation
From the Marketplace
Open VS Code, Cursor, Windsurf, or Antigravity
Go to Extensions (Cmd/Ctrl+Shift+X)
Search for "Cisco AI Security Scanner"
Click Install
From VSIX
Download the latest .vsix file from GitHub Releases
Open the Command Palette (Cmd/Ctrl+Shift+P)
Run "Extensions: Install from VSIX..."
Select the downloaded file and reload
Requirements
Required:
Python 3.10+ — If not available, a portable Python 3.11 runtime is downloaded automatically on first use. You can also point to a specific Python binary via the mcp-scanner.python.customPath setting.
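If you would rather use an existing interpreter than the portable runtime, set mcp-scanner.python.customPath in your user or workspace settings.json. A minimal sketch (the interpreter path is an example):

```json
{
  // Point the scanner at a specific Python 3.10+ interpreter (example path)
  "mcp-scanner.python.customPath": "/usr/local/bin/python3.11"
}
```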
Optional (for enhanced analysis):
LLM API Key — For AI-driven analysis (see provider setup below)
Cisco AI Defense API Key — For cloud-based threat classification
VirusTotal API Key — For binary file hash lookup and scanning
Post-Install Setup
Look for the shield icon in the Activity Bar (left sidebar) to open the Security Scanner panel.
The Setup Wizard opens automatically on first launch. Follow its steps, or configure manually:
Run "Configure LLM Provider" from the Command Palette to set up AI analysis
Run "Configure Cisco AI Defense" for cloud-based threat classification
Run "Configure VirusTotal API" for binary scanning
Run "Scan All (MCP + Skills)" to start your first scan.
LLM Provider Setup
The scanner supports nine LLM providers for AI-driven analysis. You only need one. YARA and behavioral analysis run locally without any provider.
OpenAI
Run "Configure LLM Provider" from the Command Palette
Select OpenAI as the provider
Paste your API key
Ensure your account has API credits available
Anthropic
Run "Configure LLM Provider" from the Command Palette
Select Anthropic as the provider
Paste your API key
Azure OpenAI
Run "Configure LLM Provider" and select Azure OpenAI
Paste your API key
Set mcp-scanner.llm.azureEndpoint to your resource endpoint
Set mcp-scanner.llm.azureDeployment to your deployment name
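Taken together, the Azure OpenAI settings look like this in settings.json (the endpoint and deployment name are placeholders for your own resource):

```json
{
  "mcp-scanner.llm.azureEndpoint": "https://your-resource.openai.azure.com",
  "mcp-scanner.llm.azureDeployment": "gpt-4o"
}
```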
Azure AI Services (Claude/Llama)
Setup is the same as Azure OpenAI — configure the API key, endpoint, and deployment name. Uses the Anthropic API format with Azure authentication.
AWS Bedrock
Create IAM credentials in AWS Console → IAM → Users
Grant bedrock:InvokeModel permission to your IAM user
Request model access in the Bedrock console before use
Run "Configure LLM Provider", select AWS Bedrock, and paste your secret key
Set the AWS region in settings; the region must have Bedrock enabled (e.g. us-east-1)
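The bedrock:InvokeModel permission from step 2 can be granted with an IAM policy along these lines (scoping Resource to specific model ARNs is tighter; "*" is shown for brevity):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "*"
    }
  ]
}
```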
GCP Vertex AI
Create a service account in GCP Console → IAM → Service Accounts
Grant the Vertex AI User role
Download the JSON key file
Run "Configure LLM Provider", select GCP Vertex, and provide the path to the JSON file
Set your project ID and region in settings
Alternatively, leave the key empty to use Application Default Credentials (gcloud auth application-default login).
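The service-account steps above can also be done from the command line. A sketch with gcloud (the project ID and account name are placeholders):

```shell
# Create a service account for the scanner (name is an example)
gcloud iam service-accounts create mcp-scanner --project=my-project

# Grant the Vertex AI User role
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:mcp-scanner@my-project.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"

# Download a JSON key file to hand to the extension
gcloud iam service-accounts keys create key.json \
  --iam-account=mcp-scanner@my-project.iam.gserviceaccount.com

# Or skip the key file and use Application Default Credentials instead
gcloud auth application-default login
```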
Ollama (Local / Offline)
| | |
|---|---|
| Key | No API key required |
| Default model | llama3.2 |
| Default endpoint | http://localhost:11434 |
| Docs | ollama.ai |
Install Ollama from ollama.ai
Pull a model: ollama pull llama3.2
Ensure Ollama is running: ollama serve
Run "Configure LLM Provider" and select Ollama
Adjust the endpoint in settings if not using the default
OpenRouter
Run "Configure LLM Provider" and select OpenRouter
Paste your API key
Model names include a provider prefix (e.g. openai/gpt-4o, anthropic/claude-3.5-sonnet)
Google AI Studio
Run "Configure LLM Provider" and select Google AI Studio
Paste your API key
Free tier is available with usage limits
Supported Platforms
IDEs
VS Code
Cursor
Windsurf
Antigravity
MCP Configuration Sources
The scanner auto-discovers MCP server configurations from these locations:
| Source | Config file |
|---|---|
| Cursor | ~/.cursor/mcp.json (macOS), ~/.config/cursor/mcp.json (Linux), or %APPDATA%/Cursor/mcp.json (Windows) |
| Claude Desktop | ~/Library/Application Support/Claude/claude_desktop_config.json (macOS), ~/.config/claude/claude_desktop_config.json (Linux), or %APPDATA%/Claude/claude_desktop_config.json (Windows) |
| VS Code | ~/Library/Application Support/Code/User/mcp.json (macOS), ~/.config/Code/User/mcp.json (Linux), or %APPDATA%/Code/User/mcp.json (Windows) |
| Windsurf | ~/.codeium/windsurf/mcp_config.json (macOS), ~/.config/codeium/windsurf/mcp_config.json (Linux), or %APPDATA%/Codeium/windsurf/mcp_config.json (Windows) |
| Antigravity | ~/.antigravity/mcp.json (macOS), ~/.config/antigravity/mcp.json (Linux), or %APPDATA%/Antigravity/mcp.json (Windows) |
| Workspace | .cursor/mcp.json, .vscode/mcp.json, mcp.json, .mcp/config.json in your project |
Toggle individual sources on or off in the mcp-scanner.globalConfigs.* settings.
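All of these files share the standard MCP configuration shape: a top-level mcpServers map whose entries describe how to launch each server. A minimal example (the server name and package are illustrative):

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "@example/mcp-server"]
    }
  }
}
```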
Skill Sources
| Source | Default path | Setting |
|---|---|---|
| Cursor Skills | ~/.cursor/skills/ | skill-scanner.globalSkills.claudeSkills |
| Claude Skills | ~/.claude/skills/ | skill-scanner.globalSkills.claudeSkills |
| Codex Skills | ~/.codex/skills/ | skill-scanner.globalSkills.claudeSkills |
| Antigravity Skills | ~/.gemini/antigravity/skills/ | skill-scanner.globalSkills.antigravitySkills |
| Workspace Skills | .cursor/skills/, .claude/skills/, .codex/skills/, .agent/skills/, and any SKILL.md in project | Automatic when scan scope includes workspace |
| Custom Paths | Any additional directories you configure | skill-scanner.globalSkills.customPaths |
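To scan skill directories outside the default locations, add them to skill-scanner.globalSkills.customPaths. A sketch for settings.json, assuming the setting accepts an array of directory paths (the paths shown are examples):

```json
{
  "skill-scanner.globalSkills.customPaths": [
    "~/my-skills",
    "/opt/shared/skills"
  ]
}
```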
LLM Providers
| Provider | Local? | Key prefix |
|---|---|---|
| OpenAI | No | sk- |
| Anthropic | No | sk-ant- |
| Azure OpenAI | No | (Azure key) |
| Azure AI Services | No | (Azure key) |
| AWS Bedrock | No | (AWS secret key) |
| GCP Vertex AI | No | (Service account JSON) |
| Ollama | Yes | None required |
| OpenRouter | No | sk-or- |
| Google AI Studio | No | AIzaSy |