
Installation — Skill Scanner


Prerequisites

| Dependency | Version | Notes |
|------------|---------|-------|
| Python | 3.10–3.13 | Required for the CLI and all analyzers |
| uv | Latest | Recommended package manager |
| pip | Latest | Alternative to uv |
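The supported interpreter range can be checked programmatically before installing. The helper below is illustrative only and is not part of the scanner package:

```python
import sys

def python_supported(version=sys.version_info):
    """True when the interpreter falls in the 3.10-3.13 range the
    scanner requires (illustrative helper, not shipped with the tool)."""
    return (3, 10) <= (version[0], version[1]) <= (3, 13)

print(python_supported())  # True on a supported interpreter
```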

Install from PyPI

```bash
# Using uv (recommended)
uv pip install cisco-ai-skill-scanner

# Using pip
pip install cisco-ai-skill-scanner
```

Cloud Provider Extras

Install optional extras for managed LLM cloud services:

```bash
# AWS Bedrock support (IAM credentials)
pip install 'cisco-ai-skill-scanner[bedrock]'

# Google Vertex AI support
pip install 'cisco-ai-skill-scanner[vertex]'

# Azure OpenAI support (managed identity)
pip install 'cisco-ai-skill-scanner[azure]'

# All cloud providers
pip install 'cisco-ai-skill-scanner[all]'
```

The quotes around the extras prevent shells such as zsh from treating the square brackets as glob patterns.

Install from Source

Use this if you want to contribute or run the latest development version:

```bash
git clone https://github.com/cisco-ai-defense/skill-scanner
cd skill-scanner
uv sync --all-extras
```

Run commands with uv run skill-scanner when using a source install.


Verify Installation

```bash
skill-scanner --help
skill-scanner list-analyzers
```
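The same check can be scripted from Python by asking whether the console script resolves on PATH. `on_path` below is a hypothetical helper for illustration, not part of the package:

```python
import shutil

def on_path(cmd: str = "skill-scanner") -> bool:
    """True when the command resolves on PATH, i.e. the console
    script was installed (illustrative helper, not part of the tool)."""
    return shutil.which(cmd) is not None

print(on_path())
```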

Environment Configuration

All configuration is done through environment variables. No config files are required for basic scanning.

LLM Analyzer and Meta-Analyzer

| Variable | Description |
|----------|-------------|
| SKILL_SCANNER_LLM_API_KEY | API key for the LLM provider |
| SKILL_SCANNER_LLM_MODEL | Model name (e.g., anthropic/claude-sonnet-4-20250514 or gpt-4o) |

External Analyzers

| Variable | Description | Required for |
|----------|-------------|--------------|
| VIRUSTOTAL_API_KEY | VirusTotal API key | --use-virustotal |
| AI_DEFENSE_API_KEY | Cisco AI Defense API key | --use-aidefense |
| AI_DEFENSE_API_URL | Cisco AI Defense endpoint URL | --use-aidefense |

Quick Setup

```bash
# Required only for LLM-powered analysis
export SKILL_SCANNER_LLM_API_KEY="your_api_key"
export SKILL_SCANNER_LLM_MODEL="anthropic/claude-sonnet-4-20250514"

# Optional: VirusTotal binary scanning
export VIRUSTOTAL_API_KEY="your_virustotal_api_key"

# Optional: Cisco AI Defense
export AI_DEFENSE_API_KEY="your_aidefense_api_key"
```
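Because all configuration flows through environment variables, a small pre-flight check can catch missing keys before a scan starts. The helper name and grouping below are assumptions for illustration, not part of the scanner:

```python
import os

# Variables the LLM analyzer needs, per the table above
REQUIRED_FOR_LLM = ("SKILL_SCANNER_LLM_API_KEY", "SKILL_SCANNER_LLM_MODEL")

def missing_vars(names=REQUIRED_FOR_LLM, env=os.environ):
    """Return the subset of names that is unset or empty (illustrative helper)."""
    return [n for n in names if not env.get(n)]

# Example: warn before invoking LLM-powered analysis
missing = missing_vars()
if missing:
    print("LLM analysis unavailable; set:", ", ".join(missing))
```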

LLM Provider Setup

The LLM and Meta analyzers use LiteLLM under the hood, supporting many providers through a unified interface.

| Provider | Model String Example | Extra Required |
|----------|----------------------|----------------|
| Anthropic | anthropic/claude-sonnet-4-20250514 | None |
| OpenAI | gpt-4o | None |
| AWS Bedrock | bedrock/anthropic.claude-3-sonnet-... | [bedrock] |
| Google Vertex | vertex_ai/gemini-pro | [vertex] |
| Azure OpenAI | azure/gpt-4o | [azure] |

Set --llm-provider on the CLI, or configure the SKILL_SCANNER_LLM_MODEL environment variable using LiteLLM's model naming conventions.
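LiteLLM routes requests based on the prefix of the model string ("anthropic/...", "bedrock/..."), and bare names such as gpt-4o default to OpenAI routing. The hypothetical helper below only illustrates that naming convention; it is not part of the scanner or of LiteLLM:

```python
def provider_prefix(model: str) -> str:
    """Return the provider segment of a LiteLLM-style model string;
    bare model names fall through to OpenAI routing (illustration only)."""
    return model.split("/", 1)[0] if "/" in model else "openai"

print(provider_prefix("anthropic/claude-sonnet-4-20250514"))  # anthropic
print(provider_prefix("gpt-4o"))                              # openai
```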

See the Dependencies and LLM Providers reference for provider-specific setup instructions.