# LocalChat Documentation
LocalChat is a macOS app that runs AI models locally on your Mac. Your conversations stay private because everything happens on your device - no data is sent to external servers.
## What LocalChat does
LocalChat lets you chat with AI models the same way you'd use ChatGPT or Claude, but with one key difference: the AI runs on your computer, not in the cloud. This means:
- Complete privacy - Your conversations never leave your Mac
- Works offline - Use AI without an internet connection
- No subscriptions - One-time purchase, use forever
- No rate limits - Chat as much as you want
## How it works
1. You download LocalChat and install it on your Mac
2. During setup, you choose an AI model to download
3. The model runs locally using your Mac's processor and memory
4. You chat with the AI just like any other chat app
The AI model files are stored on your Mac and typically range from 2 GB to 8 GB, depending on the model. Larger models generally produce better responses but require more memory.
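As a rough rule of thumb (these are general figures for quantized models, not LocalChat-specific numbers), a model's file size, and roughly the memory needed to load it, is its parameter count times the bits stored per weight, divided by 8. A minimal sketch:

```python
def model_size_gb(params: float, bits_per_weight: int) -> float:
    """Approximate on-disk and in-memory size of a quantized model, in GB."""
    return params * bits_per_weight / 8 / 1e9

# A 7-billion-parameter model quantized to 4 bits per weight:
print(round(model_size_gb(7e9, 4), 1))  # -> 3.5
```

Actual memory use is somewhat higher once the conversation context is loaded, which is why the recommended RAM exceeds the model file size.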
## Supported models
LocalChat supports 300+ AI models from various providers:
- Llama - Meta's open-source models
- Mistral - Fast, efficient models from France
- Gemma - Google's lightweight models
- Qwen - Alibaba's multilingual models with vision support
- DeepSeek - Reasoning-focused models
You can download multiple models and switch between them depending on your task.
## System requirements
| Requirement | Minimum | Recommended |
|---|---|---|
| macOS | 13.0 (Ventura) | Latest version |
| Processor | Intel or Apple Silicon | Apple Silicon (M1/M2/M3/M4) |
| RAM | 8GB | 16GB+ |
| Storage | 10GB free | 50GB+ free |
Apple Silicon Macs get the best performance because LocalChat can use the GPU for faster responses.
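If you are unsure which chip or macOS version your Mac has, you can check with a short script using Python's standard library (a convenience sketch for verifying the requirements above, not part of LocalChat itself):

```python
import platform

def is_apple_silicon() -> bool:
    """True on Apple Silicon Macs, where the architecture reports 'arm64'."""
    return platform.machine() == "arm64"

def macos_version() -> tuple:
    """macOS release as a tuple, e.g. (13, 2); empty on other systems."""
    release = platform.mac_ver()[0]
    return tuple(int(part) for part in release.split(".")) if release else ()

if is_apple_silicon() and macos_version() >= (13, 0):
    print("This Mac meets the recommended LocalChat requirements")
```

You can also find the same information in Apple menu > About This Mac.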
## Key features
### Model management
Browse, download, and organize AI models from the built-in model gallery. No command line needed. See Models for details.
### Image analysis
Use vision-enabled models to analyze screenshots, photos, and diagrams. Extract information, describe content, or get explanations. See Image analysis.
### Document processing
Drag and drop PDFs or text files to analyze them with AI. Extract summaries, answer questions, or process the content locally. See Chat with Documents.
### Screenshot to code
Convert website designs and UI mockups into working HTML and CSS code using vision models. See Screenshot to code.
### Offline mode
After downloading your models, you can use LocalChat without any internet connection. Great for travel, privacy-sensitive work, or unreliable connections.
### Conversation history
Your chat history is stored locally on your Mac. Search previous conversations and pick up where you left off.
## Getting started
Ready to try LocalChat? Follow these guides:
- Installation - Download and install LocalChat
- Quick start - Set up and start your first conversation
- Models - Download and manage AI models
- Settings - Configure LocalChat to your preferences