# Installation
## Prerequisites
| Dependency | Version | Notes |
|---|---|---|
| Python | 3.10–3.13 | Required for the CLI and all analyzers |
| uv | Latest | Recommended package manager |
| pip | Latest | Alternative to uv |
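Before installing, you can check that the interpreter on your `PATH` falls inside the supported range. This one-liner only inspects the version (it assumes `python3` is the interpreter you will install into):

```shell
# Report whether the interpreter is within the supported 3.10-3.13 range
python3 -c 'import sys; v = sys.version_info[:2]; print("supported" if (3, 10) <= v <= (3, 13) else "unsupported: %d.%d" % v)'
```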
## Install from PyPI
```shell
# Using uv (recommended)
uv pip install cisco-ai-skill-scanner

# Using pip
pip install cisco-ai-skill-scanner
```
### Cloud Provider Extras
Install optional extras for managed LLM cloud services:
```shell
# AWS Bedrock support (IAM credentials)
pip install "cisco-ai-skill-scanner[bedrock]"

# Google Vertex AI support
pip install "cisco-ai-skill-scanner[vertex]"

# Azure OpenAI support (managed identity)
pip install "cisco-ai-skill-scanner[azure]"

# All cloud providers
pip install "cisco-ai-skill-scanner[all]"
```

The quotes around the package spec keep shells such as zsh from interpreting the square brackets as a glob pattern.
## Install from Source
Use this if you want to contribute or run the latest development version:
```shell
git clone https://github.com/cisco-ai-defense/skill-scanner
cd skill-scanner
uv sync --all-extras
```
Run commands with `uv run skill-scanner` when using a source install.
## Verify Installation
```shell
skill-scanner --help
skill-scanner list-analyzers
```
## Environment Configuration
All configuration is done through environment variables. No config files are required for basic scanning.
### LLM Analyzer and Meta-Analyzer
| Variable | Description |
|---|---|
| `SKILL_SCANNER_LLM_API_KEY` | API key for the LLM provider |
| `SKILL_SCANNER_LLM_MODEL` | Model name (e.g., `anthropic/claude-sonnet-4-20250514`, `gpt-4o`) |
### External Analyzers
| Variable | Description | Required for |
|---|---|---|
| `VIRUSTOTAL_API_KEY` | VirusTotal API key | `--use-virustotal` |
| `AI_DEFENSE_API_KEY` | Cisco AI Defense API key | `--use-aidefense` |
| `AI_DEFENSE_API_URL` | Cisco AI Defense endpoint URL | `--use-aidefense` |
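With the matching key exported, an external analyzer is enabled per run with its flag. A sketch, where the `scan` subcommand and skill path are hypothetical placeholders and only the flag comes from the table above:

```shell
export VIRUSTOTAL_API_KEY="your_virustotal_api_key"

# Hypothetical subcommand and path, shown for illustration only
skill-scanner scan ./my-skill --use-virustotal
```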
### Quick Setup
```shell
# Required only for LLM-powered analysis
export SKILL_SCANNER_LLM_API_KEY="your_api_key"
export SKILL_SCANNER_LLM_MODEL="anthropic/claude-sonnet-4-20250514"

# Optional: VirusTotal binary scanning
export VIRUSTOTAL_API_KEY="your_virustotal_api_key"

# Optional: Cisco AI Defense (both variables are required for --use-aidefense)
export AI_DEFENSE_API_KEY="your_aidefense_api_key"
export AI_DEFENSE_API_URL="your_aidefense_endpoint_url"
```
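Because the scanner reads all configuration from the environment, a quick `printenv` filter confirms what a child process will actually see (re-using the placeholder values from above):

```shell
# Placeholder values, as in Quick Setup
export SKILL_SCANNER_LLM_API_KEY="your_api_key"
export SKILL_SCANNER_LLM_MODEL="anthropic/claude-sonnet-4-20250514"

# Every exported SKILL_SCANNER_* variable should be listed here
printenv | grep '^SKILL_SCANNER_'
```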
## LLM Provider Setup
The LLM and Meta analyzers use LiteLLM under the hood, supporting many providers through a unified interface.
| Provider | Model String Example | Extra Required |
|---|---|---|
| Anthropic | `anthropic/claude-sonnet-4-20250514` | None |
| OpenAI | `gpt-4o` | None |
| AWS Bedrock | `bedrock/anthropic.claude-3-sonnet-...` | `[bedrock]` |
| Google Vertex | `vertex_ai/gemini-pro` | `[vertex]` |
| Azure OpenAI | `azure/gpt-4o` | `[azure]` |
Set `--llm-provider` on the CLI or configure the `SKILL_SCANNER_LLM_MODEL` environment variable using LiteLLM model naming conventions.
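Because LiteLLM routes on the prefix of the model string, switching providers is a one-line change once the matching extra is installed. The Bedrock model ID below is illustrative; credentials come from your standard AWS configuration:

```shell
# Requires: pip install "cisco-ai-skill-scanner[bedrock]"
export SKILL_SCANNER_LLM_MODEL="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"

# The part before the first slash selects the provider
echo "${SKILL_SCANNER_LLM_MODEL%%/*}"   # prints: bedrock
```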
See the Dependencies and LLM Providers reference for provider-specific setup instructions.