Get started with Vega AI in minutes. This guide will walk you through the fastest way to start tracking your job applications with AI-powered assistance.

Prerequisites

Before you begin, you’ll need:

  • Docker: Docker Desktop installed on your machine
  • Gemini API Key: a free API key from Google AI Studio
New to Docker? Docker Desktop is available for Mac, Windows, and Linux. It’s free and takes just a few minutes to install.

Quick Start

Step 1: Get Your Gemini API Key

Navigate to Google AI Studio and generate a free API key. This enables Vega AI’s intelligent features like document generation and job matching.

Step 2: Create Configuration Directory

Create a dedicated directory for Vega AI and set up your configuration:
# Create directory
mkdir vega-ai && cd vega-ai

# Create config file with your API key
echo "GEMINI_API_KEY=your-gemini-api-key" > config
Replace your-gemini-api-key with your actual API key from Google AI Studio.
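The two commands above can be sketched end to end, with a quick sanity check on the result; the grep pattern is only illustrative, but the `KEY=value` format (no quotes, no spaces around `=`) is what Docker's `--env-file` flag expects:

```shell
# Create the config file and sanity-check it: --env-file expects one
# KEY=value pair per line, with no quotes and no spaces around '='.
echo "GEMINI_API_KEY=your-gemini-api-key" > config
grep -c '^GEMINI_API_KEY=' config   # prints 1 when the key line is present
```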

Step 3: Start Vega AI

Launch Vega AI with a single Docker command:
docker run --pull always -d \
  --name vega-ai \
  -p 8765:8765 \
  -v vega-data:/app/data \
  --env-file config \
  ghcr.io/benidevo/vega-ai:latest
This command:
  • Downloads the latest Vega AI image
  • Runs it as a background service
  • Exposes the application on port 8765
  • Persists your data in a Docker volume
  • Loads your configuration from the config file
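Once the command returns, you can confirm the container actually came up; a minimal sketch, assuming the Docker CLI is on your PATH:

```shell
# List the container and its status; if the Docker daemon is not
# reachable, print a hint instead of failing.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker ps --filter name=vega-ai --format '{{.Names}}: {{.Status}}'
else
  echo "Docker daemon not reachable; is Docker Desktop running?"
fi
```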

Step 4: Access Vega AI

Open your browser and navigate to:
http://localhost:8765
Log in with the default credentials:
  • Username: admin
  • Password: VegaAdmin
Security First! Change your password immediately after first login via Settings → Account.
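If the page doesn't load right away, you can check whether the app is responding yet; a quick sketch using curl (assumed to be installed):

```shell
# Print the HTTP status code from the Vega AI UI (expect 200 once the
# container has finished starting up).
curl -s -o /dev/null -w '%{http_code}\n' --max-time 5 http://localhost:8765 \
  || echo "not reachable yet; check 'docker logs vega-ai'"
```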

What’s Next?

Now that Vega AI is running, you can:

  • Add Your First Job: start tracking job applications manually or with the browser extension
  • Build Your Profile: upload your CV or manually add your experience, education, and skills
  • Generate Documents: create tailored cover letters and CVs using AI
  • Get Match Scores: analyze job compatibility with AI-powered matching

Alternative Installation Methods

Docker Compose

For easier management and configuration, create a docker-compose.yml file:
docker-compose.yml
services:
  vega-ai:
    image: ghcr.io/benidevo/vega-ai:latest
    ports:
      - "8765:8765"
    volumes:
      - vega-data:/app/data
    env_file:
      - config
    restart: unless-stopped

volumes:
  vega-data:
Then start with:
docker compose up -d

Cloud Mode

Try the hosted version at vega.benidevo.com with zero setup. Learn more about Cloud vs Self-Hosted.

Browser Extension

Capture jobs directly from LinkedIn with one click:

Step 1: Download Extension

Get the latest release from GitHub Releases

Step 2: Install in Chrome

  1. Extract the ZIP file to a folder
  2. Open chrome://extensions/
  3. Enable “Developer mode”
  4. Click “Load unpacked” and select the folder

Step 3: Configure Connection

Connect the extension to your Vega AI instance:
  • URL: http://localhost:8765
  • Credentials: Your admin login

Troubleshooting

Port Already in Use

If port 8765 is already occupied:
# Use a different port
docker run --pull always -d \
  --name vega-ai \
  -p 8080:8765 \
  -v vega-data:/app/data \
  --env-file config \
  ghcr.io/benidevo/vega-ai:latest
Then access at http://localhost:8080
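To check whether a given port is occupied before picking an alternative, one quick sketch uses bash's /dev/tcp feature (tools like lsof or ss work too, where available):

```shell
# Try to open a TCP connection to the port; success means something
# is already listening there.
PORT=8765
if (exec 3<>"/dev/tcp/localhost/$PORT") 2>/dev/null; then
  exec 3>&-
  echo "port $PORT is in use"
else
  echo "port $PORT is free"
fi
```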

Container Won’t Start

Check the logs for errors:
docker logs vega-ai

Reset Everything

To start fresh:
# Stop and remove container
docker stop vega-ai
docker rm vega-ai

# Remove data volume (WARNING: deletes all data)
docker volume rm vega-data

# Start again from Step 3

Need Help?

Explore the complete documentation:

  • Configuration Guide: advanced configuration options and environment variables
  • Deployment Options: choose between cloud and self-hosted deployment
