## Installation

Install Droid from Factory AI, then configure it for Ollama with `ollama launch droid`. Quick setup is the default; you can also apply configuration only, or configure a specific model.
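A minimal sketch of those two steps. The installer URL is an assumption based on Factory's documented install script; check Factory's docs for the current command:

```shell
# Install the Droid CLI (URL is an assumption; see Factory AI's docs)
curl -fsSL https://app.factory.ai/cli | sh

# Let Ollama configure Droid automatically
ollama launch droid
```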
Droid requires a large context window (at least 64k tokens). See Context Length for configuration.
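For local models, one way to satisfy the 64k requirement is Ollama's `OLLAMA_CONTEXT_LENGTH` environment variable (the value shown is an example):

```shell
# Serve local models with a 64k-token context window
OLLAMA_CONTEXT_LENGTH=65536 ollama serve
```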
## Features

- **IDE Integration**: works natively in VS Code, IntelliJ, and more
- **Terminal Mode**: full-featured CLI interface
- **Multi-Model**: configure multiple models simultaneously
- **Cloud Support**: automatic configuration for cloud models
## Recommended Models

### Cloud Models

- `qwen3-coder:480b-cloud`: recommended model for Droid (260k context)
- `glm-4.7:cloud`: reasoning and code generation (200k context)
- `deepseek-v3.1:671b-cloud`: massive reasoning model (160k context)
- `minimax-m2.5:cloud`: fast, efficient coding (200k context)

### Local Models

- `qwen3-coder`: efficient code generation (~11GB VRAM)
- `glm-4.7`: reasoning and coding (~25GB VRAM)
- `deepseek-coder`: specialized code model (~20GB VRAM)
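Local models must be pulled before Droid can use them, for example:

```shell
ollama pull qwen3-coder
```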
## Manual Setup

To configure local models by hand, add a configuration block to `~/.factory/config.json`.

### Cloud model configuration

For cloud models with larger context windows, point the configuration at ollama.com and supply your API key. Ollama automatically sets appropriate token limits when you use `ollama launch droid`.

## Configuration File
Droid stores its configuration in `~/.factory/config.json`. An example configuration is written for you when you run `ollama launch droid`.
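A sketch of what the generated file might contain, assuming Factory's `custom_models` schema; the field names (`model_display_name`, `provider`, `max_tokens`) are assumptions, so check Droid's documentation against your generated file:

```json
{
  "custom_models": [
    {
      "model_display_name": "qwen3-coder (Ollama)",
      "model": "qwen3-coder",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "provider": "generic-chat-completion-api",
      "max_tokens": 32000
    }
  ]
}
```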
## Multiple Models

Droid supports multiple models simultaneously; `ollama launch droid` can configure several at once.
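Each model becomes its own entry in the configuration. A sketch mixing a local and a cloud model (field names and the ollama.com base URL are assumptions):

```json
{
  "custom_models": [
    {
      "model_display_name": "qwen3-coder (local)",
      "model": "qwen3-coder",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama"
    },
    {
      "model_display_name": "qwen3-coder:480b (cloud)",
      "model": "qwen3-coder:480b-cloud",
      "base_url": "https://ollama.com/v1",
      "api_key": "your-ollama-api-key"
    }
  ]
}
```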
## Connecting to ollama.com

To use cloud models hosted on ollama.com, create an API key at ollama.com/settings/keys.
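One way to supply the key is the `OLLAMA_API_KEY` environment variable:

```shell
# Make the ollama.com API key available to Ollama and Droid
export OLLAMA_API_KEY=your-api-key
```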
## Usage Examples
### Terminal Mode
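Assuming Factory's CLI installs a `droid` binary (the binary name is an assumption), start it from your project root:

```shell
cd /path/to/project
droid
```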
### IDE Integration
- Install the Droid plugin for your IDE
- Configure Ollama models in Droid settings
- Select a model from the model picker
### Ask Droid to Make Changes
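With Droid running, describe the change in plain language. A hypothetical prompt:

```text
Add input validation to the signup form, and write a unit test covering the empty-email case.
```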
### Switch Models
In the Droid UI, use the model selector to switch between configured models.

## Troubleshooting
### Model Not Available

Ensure the model is pulled.

### Configuration Not Loading

Restart Droid to pick up config changes.

### Context Window Too Small

For local models, increase the context window.

### Connection Issues

Verify Ollama is running, and check that `base_url` in `~/.factory/config.json` matches your Ollama host.
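The checks above map onto a few commands; `ollama pull`, `ollama list`, and the `/api/version` endpoint are standard Ollama, while the model name is illustrative:

```shell
# Model not available: pull it and confirm it is listed
ollama pull qwen3-coder
ollama list

# Connection issues: confirm the Ollama server responds
curl http://localhost:11434/api/version
```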
## Advanced Configuration

### Custom Display Names

Set a display name in `~/.factory/config.json` to control how each model appears in Droid's model picker.
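For instance, assuming Factory's `model_display_name` field (the field name is an assumption):

```json
{
  "custom_models": [
    {
      "model_display_name": "My Local Coder",
      "model": "qwen3-coder",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama"
    }
  ]
}
```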
### Temperature and Parameters

Droid respects model-level parameters (for example, `temperature` set on the Ollama model).

### Environment Variables

Droid respects:

- `OLLAMA_HOST`: override the Ollama server URL
- `OLLAMA_API_KEY`: API key for ollama.com
## IDE Plugins

Droid provides plugins for:

- VS Code: install from the marketplace
- IntelliJ IDEA: install from JetBrains
- PyCharm: same as IntelliJ
- WebStorm: same as IntelliJ
## Backup Configuration

When using `ollama launch droid`, Ollama creates backups in `~/.ollama/backups/` before modifying your configuration.
## Learn More

- **Factory AI**: official Factory AI website
- **Droid Docs**: complete Droid documentation
- **OpenAI API**: Ollama's OpenAI-compatible API
- **Context Length**: configure model context windows