# Quick Start Guide

Get Claude Vision Auto running in 5 minutes.

## Prerequisites Check

```bash
# Check Claude Code
claude --version

# Check Docker
docker ps

# Check Python
python3 --version
```

## Installation

```bash
cd /home/svrnty/claude-vision-auto

# Install system dependencies
sudo apt-get update && sudo apt-get install -y scrot

# Install Python package
make install
```

## Start Ollama (if not running)

```bash
# Check if running
docker ps | grep ollama

# If not running, start it
docker run -d \
    -p 11434:11434 \
    --name ollama \
    --restart unless-stopped \
    ollama/ollama:latest

# Pull vision model
docker exec ollama ollama pull minicpm-v:latest
```
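
Optionally, confirm the model finished downloading and the API is reachable by querying Ollama's model-listing endpoint on the port mapped above:

```bash
# Should list minicpm-v:latest among the local models
curl -s http://localhost:11434/api/tags
```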

## Test Installation

```bash
# Verify command
which claude-vision

# Test connection
claude-vision --help 2>&1 | head -5
```

## First Run

```bash
# Start interactive session
claude-vision

# You should see:
# [Claude Vision Auto] Testing Ollama connection...
# [Claude Vision Auto] Connected to Ollama
# [Claude Vision Auto] Using model: minicpm-v:latest
```

## Test Auto-Approval

```bash
# Try a simple command
claude-vision "create a test.md file in /tmp"

# Watch for auto-approval when prompted:
# [Vision] Analyzing prompt...
# [Vision] Response: 1
# [Vision] Response sent
```
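
Under the hood, the tool screenshots the terminal when Claude Code pauses on a permission prompt, sends the image to the vision model, and types the model's answer back. The commands below are only a rough sketch of that flow against Ollama's `/api/generate` endpoint, using the model and port from this guide; the prompt text is illustrative and the real logic lives in the Python package.

```bash
# Rough sketch of the approval flow (illustration only, not the tool's exact code)

# 1. Capture the terminal
scrot /tmp/claude-prompt.png

# 2. Build the request body with python3 (screenshots are too large for a shell argument)
python3 - <<'EOF' > /tmp/claude-prompt.json
import base64, json
img = base64.b64encode(open("/tmp/claude-prompt.png", "rb").read()).decode()
print(json.dumps({
    "model": "minicpm-v:latest",
    "prompt": "A permission prompt is shown. Reply with the number of the option that approves it.",
    "images": [img],
    "stream": False,
}))
EOF

# 3. Ask the vision model and print its answer (the real tool types it into the prompt)
curl -s http://localhost:11434/api/generate -d @/tmp/claude-prompt.json \
    | python3 -c "import sys, json; print(json.load(sys.stdin)['response'])"
```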

## Troubleshooting

### Ollama Not Connected

```bash
docker start ollama
docker exec ollama ollama pull minicpm-v:latest
```

### Screenshot Fails

```bash
sudo apt-get install scrot
scrot /tmp/test.png  # Test it works
```

### Command Not Found

```bash
# Add the install location to PATH for the current session
export PATH="$HOME/.local/bin:$PATH"

# Or reload your shell config if the export is already in ~/.bashrc
source ~/.bashrc
```

## Next Steps

### Quick Configuration

Add to `~/.bashrc`:

```bash
# Claude Vision Auto
export PATH="$HOME/.local/bin:$PATH"
alias cv="claude-vision"
alias cvd="DEBUG=true claude-vision"
```

Reload:

```bash
source ~/.bashrc
cv  # Now you can use 'cv' instead of 'claude-vision'
```
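
Beyond aliases, the package also supports a YAML config file (model choice, vision prompt, timing, and approval keywords) that environment variables still override. Assuming the `claude-vision-config` helper installed with the package, you can generate and edit a user config like this:

```bash
# Generate a user config (model, vision prompt, timing, approval keywords)
claude-vision-config

# Edit it to taste; environment variables still override anything set here
${EDITOR:-nano} ~/.config/claude-vision-auto/config.yaml
```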

## Support