
AI Memory

screenpipe transforms your screen and audio into persistent, queryable memory that AI agents can use to understand what you’re doing and act autonomously.

The Problem

Every AI interaction today requires:
  1. Stopping work to write a prompt
  2. Translating intent — explaining context the AI should already know
  3. Waiting for a response
Your screen already shows exactly what you’re doing. The context is right there. But AI can’t see it.

How screenpipe Solves This

1. Continuous Context Capture

screenpipe records everything happening on your screen and audio:
  • Every app you use
  • Every window you focus
  • Every word you type or say
  • Every conversation you have

2. Queryable Memory

All captured data is indexed and searchable:
  • Full-text search across OCR and transcriptions
  • Time-based filtering
  • App/window/speaker filtering
  • Natural language queries
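All of these filters compose in a single request to the local HTTP API. As a minimal sketch (the parameter names `q`, `content_type`, `app_name`, and `start_time` follow the examples later on this page; verify them against your screenpipe version):

```typescript
// Sketch: combine full-text, app, and time filters into one /search request.
// Parameter names are assumptions drawn from the examples on this page.
function buildSearchUrl(params: Record<string, string>): string {
  return `http://localhost:3030/search?${new URLSearchParams(params)}`;
}

const url = buildSearchUrl({
  q: 'quarterly report',
  content_type: 'ocr',
  app_name: 'Chrome',
  start_time: '2026-03-08T09:00:00Z',
});

console.log(url);
```

`URLSearchParams` handles the percent-encoding, so timestamps and multi-word queries stay valid without manual escaping.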

3. AI Context Layer

AI agents query screenpipe to understand:
  • What you’re currently working on
  • What you did earlier today
  • Conversations from meetings
  • Code you wrote, docs you read, emails you sent

4. Autonomous Action

With full context, AI agents can:
  • Act without prompts (triggered by events)
  • Make decisions based on your actual behavior
  • Remember everything you’ve done
Vision: AI agents that watch your screen, understand your work, and act autonomously without prompts. Recording + AI = ability to clone human digital work at high fidelity.

Use Cases

Memory Retrieval

// AI: "What was the URL from that design mockup earlier?"
const response = await fetch(
  'http://localhost:3030/search?q=figma+design+mockup&content_type=ocr'
);
const results = await response.json();

// Returns: screenshot with Figma URL + timestamp
Never lose context again. AI can retrieve anything you’ve seen or heard.

Ambient Automation

// AI watches you code, writes docs automatically
const lastHour = new Date(Date.now() - 60 * 60 * 1000);

const recentCode = await fetch(
  `http://localhost:3030/timeline?` +
  `app_name=Code&` +
  `start_time=${lastHour.toISOString()}`
);

const data = await recentCode.json();
const codeChanges = data.data
  .filter(item => item.content.window_name.includes('.rs'))
  .map(item => item.content.text);

// AI: Detects new functions, generates docstrings, updates README

Context-Aware Assistants

// Cursor AI gets full context of what you're building
import { screenpipe } from 'screenpipe-sdk';

async function getCodingContext() {
  const lastHour = new Date(Date.now() - 60 * 60 * 1000);
  
  // Get all code editor activity
  const editorActivity = await screenpipe.search({
    app_name: 'Code',
    start_time: lastHour.toISOString(),
    content_type: 'ocr'
  });
  
  // Get terminal commands
  const terminalActivity = await screenpipe.search({
    app_name: 'Terminal',
    start_time: lastHour.toISOString(),
    content_type: 'ocr'
  });
  
  // Get browser tabs (docs you read)
  const docsActivity = await screenpipe.search({
    app_name: 'Chrome',
    browser_url: 'docs',
    start_time: lastHour.toISOString(),
    content_type: 'ocr'
  });
  
  return {
    files_edited: editorActivity.data,
    commands_run: terminalActivity.data,
    docs_read: docsActivity.data
  };
}

// Cursor now knows:
// - What files you're working on
// - What commands you ran (build errors, test results)
// - What documentation you referenced
// - Your coding patterns and style

AI Agent Patterns

Event-Driven Triggers

AI agents can listen to screenpipe events and act autonomously:
// When you open Figma, AI prepares design assets
const watcher = screenpipe.watch({
  content_type: 'ocr',
  capture_trigger: 'app_switch'
});

watcher.on('data', async (item) => {
  if (item.content.app_name === 'Figma') {
    // AI: Load recent design files, prepare asset library
    console.log('Figma opened, preparing design context...');
    
    const lastWeek = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);
    const recentDesigns = await screenpipe.search({
      app_name: 'Figma',
      start_time: lastWeek.toISOString()
    });
    
    // Auto-open recent files, sync with design system
  }
});

Proactive Suggestions

// AI notices you doing the same task repeatedly
const lastWeek = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);

const recentActivity = await screenpipe.timeline({
  start_time: lastWeek.toISOString(),
  limit: 1000
});

// Analyze patterns
const patterns = detectRepetitivePatterns(recentActivity.data);

if (patterns.length > 0) {
  // AI: "I noticed you copy-paste this config 5 times a day. Want me to automate it?"
  suggestAutomation(patterns);
}
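`detectRepetitivePatterns` above is left abstract. One plausible sketch (assuming timeline items carry the `content.app_name` and `content.window_name` fields used elsewhere on this page) is to count repeated app/window pairs:

```typescript
// Hypothetical shape of a timeline item, matching the examples on this page.
interface TimelineItem {
  content: { app_name: string; window_name: string };
}

// Count (app, window) pairs and flag any seen more than `threshold` times.
// A real detector would also consider OCR text and time-of-day clustering.
function detectRepetitivePatterns(
  items: TimelineItem[],
  threshold = 3
): string[] {
  const counts = new Map<string, number>();
  for (const { content } of items) {
    const key = `${content.app_name} / ${content.window_name}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return Array.from(counts.entries())
    .filter(([, n]) => n > threshold)
    .map(([key]) => key);
}
```

A frequency count keeps the detector cheap enough to run over a week of timeline data on every suggestion pass.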

Memory Augmentation

// AI understands temporal context
const query = "that bug we discussed";

// Instead of searching all time:
// 1. Find recent conversations
const lastWeek = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);
const recentMeetings = await screenpipe.search({
  content_type: 'audio',
  start_time: lastWeek.toISOString(),
  q: 'bug'
});

// 2. Find code changes around that time
const codeChanges = await screenpipe.search({
  app_name: 'Code',
  start_time: recentMeetings.data[0].content.timestamp,
  content_type: 'ocr'
});

// AI: "You discussed the auth bug on March 5th at 2pm, then fixed it in auth.rs at 3pm."

Privacy & Security

Local-First Architecture

All data stays on your machine. screenpipe never sends data to external servers unless you explicitly enable cloud sync (zero-knowledge encrypted).
~/.screenpipe/
  data/
    2026-03-08/
      *.jpg           # Snapshots (JPEG)
  screenpipe.db       # SQLite database
  audio_chunks/       # Audio segments
All data is stored locally:
  • Screenshots: ~/.screenpipe/data/
  • Database: ~/.screenpipe/screenpipe.db
  • Audio: ~/.screenpipe/audio_chunks/

Building AI Agents

TypeScript SDK Example

Install the SDK:

npm install screenpipe-sdk
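With the SDK installed, a common pattern is to search recent activity and fold the hits into an LLM prompt. The helper below is a sketch: the hit shape (`content.app_name`, `content.text`) and the commented `screenpipe.search` call mirror the examples above, but should be checked against the installed SDK version:

```typescript
// Hypothetical shape of a search hit, matching the examples on this page.
interface SearchHit {
  content: { app_name: string; text: string };
}

// Condense search hits into a prompt-sized context string for an LLM call.
function toPromptContext(items: SearchHit[], maxChars = 2000): string {
  return items
    .map(item => `[${item.content.app_name}] ${item.content.text}`)
    .join('\n')
    .slice(0, maxChars);
}

// With the SDK (assumed API, mirroring the snippets above):
//   import { screenpipe } from 'screenpipe-sdk';
//   const lastHour = new Date(Date.now() - 60 * 60 * 1000);
//   const res = await screenpipe.search({
//     content_type: 'ocr',
//     start_time: lastHour.toISOString()
//   });
//   const context = toPromptContext(res.data);

const demo = toPromptContext([
  { content: { app_name: 'Code', text: 'fn main() {}' } },
  { content: { app_name: 'Terminal', text: 'cargo test' } },
]);
console.log(demo);
```

Capping the context with `maxChars` keeps the prompt within a model's context window even when an hour of OCR text is returned.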

Reference

Vision Document

Read the full vision: VISION.md in the screenpipe repository. Key principles:
  • Stability over features — fix what’s broken before building what’s new
  • No feature creep — every feature must serve Record, Rewind, or Ask
  • Local-first always — data never leaves the device without explicit opt-in
  • Cross-platform — macOS, Windows, Linux

Source Files

  • Search API: crates/screenpipe-engine/src/routes/search.rs
  • Timeline API: crates/screenpipe-engine/src/routes/timeline.rs
  • Database: crates/screenpipe-db/src/lib.rs
  • Vision spec: VISION.md
  • README: README.md
