Cisco AI Security

Installation

From the Marketplace

  1. Open VS Code, Cursor, Windsurf, or Antigravity
  2. Go to Extensions (Cmd/Ctrl+Shift+X)
  3. Search for "Cisco AI Security Scanner"
  4. Click Install

From VSIX

  1. Download the latest .vsix file from GitHub Releases
  2. Open the Command Palette (Cmd/Ctrl+Shift+P)
  3. Run "Extensions: Install from VSIX..."
  4. Select the downloaded file and reload

Requirements

Required:

  • Python 3.10+ — If not available, a portable Python 3.11 runtime is downloaded automatically on first use. You can also point to a specific Python binary via the mcp-scanner.python.customPath setting.
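If you prefer an existing interpreter over the auto-downloaded runtime, you can set the path in settings.json. The setting name comes from above; the interpreter path below is only a placeholder:

```json
{
  // Example only: point this at your own Python 3.10+ binary
  "mcp-scanner.python.customPath": "/usr/local/bin/python3.11"
}
```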

Optional (for enhanced analysis):

  • LLM API Key — For AI-driven analysis (see provider setup below)
  • Cisco AI Defense API Key — For cloud-based threat classification
  • VirusTotal API Key — For binary file hash lookup and scanning

Post-Install Setup

  1. Look for the shield icon in the Activity Bar (left sidebar) to open the Security Scanner panel.
  2. The Setup Wizard opens automatically on first launch. Follow its steps, or configure manually:
    • Run "Configure LLM Provider" from the Command Palette to set up AI analysis
    • Run "Configure Cisco AI Defense" for cloud-based threat classification
    • Run "Configure VirusTotal API" for binary scanning
  3. Run "Scan All (MCP + Skills)" to start your first scan.

LLM Provider Setup

The scanner supports nine LLM providers for AI-driven analysis. You only need one. YARA and behavioral analysis run locally without any provider.

OpenAI

Key format: sk-... (may include proj- or org- prefix)
Default model: gpt-4o
Get your key: platform.openai.com/api-keys
  1. Run "Configure LLM Provider" from the Command Palette
  2. Select OpenAI as the provider
  3. Paste your API key
  4. Ensure your account has API credits available

Anthropic

Key format: sk-ant-...
Default model: claude-3-5-sonnet-20241022
Get your key: console.anthropic.com/settings/keys
  1. Run "Configure LLM Provider" from the Command Palette
  2. Select Anthropic as the provider
  3. Paste your API key

Azure OpenAI

Key format: Azure API key from the Azure Portal
Endpoint format: https://YOUR-RESOURCE.openai.azure.com/
Docs: learn.microsoft.com/azure/ai-services/openai
  1. Run "Configure LLM Provider" and select Azure OpenAI
  2. Paste your API key
  3. Set mcp-scanner.llm.azureEndpoint to your resource endpoint
  4. Set mcp-scanner.llm.azureDeployment to your deployment name
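The two Azure settings above can also be set directly in settings.json. A sketch with placeholder values (your resource endpoint and deployment name will differ):

```json
{
  "mcp-scanner.llm.azureEndpoint": "https://YOUR-RESOURCE.openai.azure.com/",
  "mcp-scanner.llm.azureDeployment": "my-gpt4o-deployment"
}
```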

Azure AI Services (Claude/Llama)

Endpoint format: https://YOUR-RESOURCE.services.ai.azure.com
Default model: claude-opus-4-5
Docs: learn.microsoft.com/azure/ai-studio

Setup is the same as Azure OpenAI — configure the API key, endpoint, and deployment name. Uses the Anthropic API format with Azure authentication.

AWS Bedrock

Key format: AWS Secret Access Key
Default model: anthropic.claude-3-5-sonnet-20241022-v2:0
Docs: docs.aws.amazon.com/bedrock
  1. Create IAM credentials in AWS Console → IAM → Users
  2. Grant bedrock:InvokeModel permission to your IAM user
  3. Request model access in the Bedrock console before use
  4. Run "Configure LLM Provider", select AWS Bedrock, and paste your secret key
  5. Set the AWS region in settings (must have Bedrock access enabled, e.g. us-east-1)
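The bedrock:InvokeModel grant from step 2 can be expressed as a minimal IAM policy. This is a sketch; in practice you may want to scope Resource to the specific model ARNs you use rather than "*":

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockInvoke",
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "*"
    }
  ]
}
```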

GCP Vertex AI

Key format: Path to a service account JSON file, or leave empty for default credentials
Default model: gemini-1.5-pro
Docs: cloud.google.com/vertex-ai/docs
  1. Create a service account in GCP Console → IAM → Service Accounts
  2. Grant the Vertex AI User role
  3. Download the JSON key file
  4. Run "Configure LLM Provider", select GCP Vertex, and provide the path to the JSON file
  5. Set your project ID and region in settings

Alternatively, leave the key empty to use Application Default Credentials (gcloud auth application-default login).

Ollama (Local / Offline)

Key: No API key required
Default model: llama3.2
Default endpoint: http://localhost:11434
Docs: ollama.ai
  1. Install Ollama from ollama.ai
  2. Pull a model: ollama pull llama3.2
  3. Ensure Ollama is running: ollama serve
  4. Run "Configure LLM Provider", select Ollama
  5. Adjust the endpoint in settings if not using the default

OpenRouter

Key format: sk-or-...
Default model: openai/gpt-4o
Get your key: openrouter.ai/keys
  1. Run "Configure LLM Provider", select OpenRouter
  2. Paste your API key
  3. Model names include a provider prefix (e.g. openai/gpt-4o, anthropic/claude-3.5-sonnet)

Google AI Studio

Key format: AIzaSy...
Default model: gemini-1.5-pro
Get your key: aistudio.google.com/app/apikey
  1. Run "Configure LLM Provider", select Google AI Studio
  2. Paste your API key
  3. Free tier is available with usage limits

Supported Platforms

IDEs

  • VS Code
  • Cursor
  • Windsurf
  • Antigravity

MCP Configuration Sources

The scanner auto-discovers MCP server configurations from these locations:

  • Cursor: ~/.cursor/mcp.json (macOS), ~/.config/cursor/mcp.json (Linux), or %APPDATA%/Cursor/mcp.json (Windows)
  • Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json (macOS), ~/.config/claude/claude_desktop_config.json (Linux), or %APPDATA%/Claude/claude_desktop_config.json (Windows)
  • VS Code: ~/Library/Application Support/Code/User/mcp.json (macOS), ~/.config/Code/User/mcp.json (Linux), or %APPDATA%/Code/User/mcp.json (Windows)
  • Windsurf: ~/.codeium/windsurf/mcp_config.json (macOS), ~/.config/codeium/windsurf/mcp_config.json (Linux), or %APPDATA%/Codeium/windsurf/mcp_config.json (Windows)
  • Antigravity: ~/.antigravity/mcp.json (macOS), ~/.config/antigravity/mcp.json (Linux), or %APPDATA%/Antigravity/mcp.json (Windows)
  • Workspace: .cursor/mcp.json, .vscode/mcp.json, mcp.json, or .mcp/config.json in your project

Toggle individual sources on or off in the mcp-scanner.globalConfigs.* settings.

Skill Sources

  • Cursor Skills: ~/.cursor/skills/ (setting: skill-scanner.globalSkills.claudeSkills)
  • Claude Skills: ~/.claude/skills/ (setting: skill-scanner.globalSkills.claudeSkills)
  • Codex Skills: ~/.codex/skills/ (setting: skill-scanner.globalSkills.claudeSkills)
  • Antigravity Skills: ~/.gemini/antigravity/skills/ (setting: skill-scanner.globalSkills.antigravitySkills)
  • Workspace Skills: .cursor/skills/, .claude/skills/, .codex/skills/, .agent/skills/, and any SKILL.md in the project (scanned automatically when the scan scope includes the workspace)
  • Custom Paths: any additional directories you configure (setting: skill-scanner.globalSkills.customPaths)
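Workspace skill discovery amounts to finding SKILL.md manifests under the project root. A minimal sketch (not the extension's actual code), with a throwaway workspace as a usage example:

```python
import os
import tempfile

# Well-known workspace skill directories from the list above; the walker
# below finds SKILL.md anywhere in the tree, which covers these too.
WORKSPACE_SKILL_DIRS = [".cursor/skills", ".claude/skills", ".codex/skills", ".agent/skills"]

def find_skill_manifests(workspace: str) -> list[str]:
    """Return sorted paths of every SKILL.md under the workspace."""
    hits = []
    for root, _dirs, files in os.walk(workspace):
        if "SKILL.md" in files:
            hits.append(os.path.join(root, "SKILL.md"))
    return sorted(hits)

# Usage: a temporary workspace containing one skill manifest
ws = tempfile.mkdtemp()
os.makedirs(os.path.join(ws, ".claude", "skills", "demo"))
open(os.path.join(ws, ".claude", "skills", "demo", "SKILL.md"), "w").close()
```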

LLM Providers

Provider           Local?  Key prefix
OpenAI             No      sk-
Anthropic          No      sk-ant-
Azure OpenAI       No      (Azure key)
Azure AI Services  No      (Azure key)
AWS Bedrock        No      (AWS secret key)
GCP Vertex AI      No      (service account JSON)
Ollama             Yes     None required
OpenRouter         No      sk-or-
Google AI Studio   No      AIzaSy