Everything you need to know about DREA.
DREA (Discovery, Research & Evaluation Architecture) is a local-first AI platform that helps you search, analyze, and understand your documents without sending data to the cloud. It runs entirely on your Mac using local AI models.
DREA is a one-time purchase, not a subscription. You pay once and own it forever: no recurring fees, no per-seat pricing, no cloud costs. Updates are included.
DREA runs on any Apple Silicon Mac (M1 or later). The minimum is 16GB RAM for search-only mode. For full AI reasoning with the local LLM, 24GB+ is recommended. See the Hardware section for detailed recommendations.
No internet connection is required; DREA is designed for air-gapped operation. All AI models, embeddings, and vector storage run locally. Network access is used only if you explicitly enable the optional External AI feature.
Every query goes through a three-stage pipeline powered by the helper LLM (Qwen3-4B): the query is classified by type, expanded into related variants, and the retrieved candidates are reranked for relevance.
Results are displayed as grouped evidence cards organized by document, with the best excerpt highlighted and a relevance note explaining why it matched.
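The classify/expand/rerank flow can be sketched as follows. Everything here is illustrative: the function names, prompts, and the stubbed helper are invented for this example and are not DREA's actual API.

```python
# Sketch of a three-stage search pipeline: classify -> expand -> rerank.
# `helper_llm` stands in for the local Qwen3-4B model.

def helper_llm(prompt: str) -> str:
    # Stub: a real call would run the local model.
    if prompt.startswith("Classify"):
        return "factual-lookup"
    if prompt.startswith("Expand"):
        return "quarterly revenue; Q3 earnings; sales figures"
    return ""

def classify(query: str) -> str:
    # Stage 1: determine the query type (factual lookup, summary, ...).
    return helper_llm(f"Classify this query: {query}")

def expand(query: str) -> list[str]:
    # Stage 2: expand the query into related variants.
    expansions = helper_llm(f"Expand this query: {query}")
    return [query] + [e.strip() for e in expansions.split(";")]

def rerank(query: str, hits: list[str]) -> list[str]:
    # Stage 3: a real reranker would score each hit with the LLM;
    # here we simply keep the original order.
    return hits

def search(query: str, vector_search) -> list[str]:
    kind = classify(query)       # informs how results are presented
    hits = []
    for variant in expand(query):
        hits.extend(vector_search(variant))
    return rerank(query, hits)
```

In this toy version, a query such as "Q3 revenue" fans out into several variants, each variant is run against the vector store, and the combined hits are reranked before display.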
DREA ships with four local AI models, all open-source, including the Qwen3-4B helper LLM and the GME-Qwen2-VL multimodal embedding model.
DREA automatically detects tabular data from ingested spreadsheets (Excel, CSV) and renders it as formatted tables in search results — with proper column headers, row striping, and horizontal scrolling. No more raw pipe-delimited text.
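The idea can be shown with a minimal text renderer. This is only a sketch: DREA's real renderer produces styled result cards, not plain text, and `render_table` is an invented name.

```python
# Turn extracted spreadsheet rows into an aligned table with a header
# separator, instead of raw pipe-delimited text.

def render_table(rows: list[list[str]]) -> str:
    # Compute one width per column so cells line up.
    widths = [max(len(str(r[i])) for r in rows) for i in range(len(rows[0]))]

    def fmt(row):
        return " | ".join(str(c).ljust(w) for c, w in zip(row, widths))

    header, *body = rows
    separator = "-+-".join("-" * w for w in widths)
    return "\n".join([fmt(header), separator] + [fmt(r) for r in body])
```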
Multimodal search is built in: GME-Qwen2-VL creates a unified embedding space for text, images, and video frames. You can search with text queries and find relevant images, or search across all modalities simultaneously. Visual results display as side-by-side cards with thumbnail previews alongside extracted text.
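The mechanics of a unified embedding space are simple to sketch: because text, images, and video frames all map into the same vector space, one query ranks items of every modality by similarity. The tiny vectors and index layout below are invented for illustration.

```python
import math

# Cross-modal search over a shared embedding space. In DREA this space
# comes from GME-Qwen2-VL; here we fake tiny 2-D vectors.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cross_modal_search(query_vec, index):
    # `index` maps (modality, item_id) -> embedding. One similarity
    # scan ranks text, images, and video frames together.
    scored = [(cosine(query_vec, vec), key) for key, vec in index.items()]
    return [key for _, key in sorted(scored, reverse=True)]
```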
Your documents never leave your Mac unless you choose to send something. By default, DREA operates in full air-gap mode with network access blocked. The only exception is the optional External AI feature, which requires your explicit action and runs every query through a trust proxy that strips sensitive data before sending.
When you click "Ask Claude" (or another configured provider), DREA's trust proxy sanitizes the query before it leaves your Mac, stripping out sensitive data and sending only the cleaned version to the external service.
All of this is powered by the local helper LLM (Qwen3-4B) — no external service is involved in the sanitization process itself.
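A minimal sketch of query sanitization, assuming simple pattern-based redaction. DREA's actual trust proxy uses the local helper LLM, and its real rules are not documented here; the patterns and placeholder format below are invented.

```python
import re

# Replace each sensitive match with a typed placeholder so the
# external provider never sees the original value.

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def sanitize(query: str) -> str:
    for label, pattern in PATTERNS.items():
        query = pattern.sub(f"[{label}]", query)
    return query
```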
Everything is stored locally at ~/Library/Application Support/DREA/ on your Mac. This includes documents, vector embeddings, AI models, logs, and configuration. You can back up, move, or delete this folder at any time.
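Since everything lives in one folder, backing up is just archiving it. The DREA path below comes from this FAQ; the destination folder and the archive naming scheme are arbitrary choices for the example.

```python
import shutil
import time
from pathlib import Path

# The DREA data folder, as documented in this FAQ.
DREA_DATA = Path.home() / "Library/Application Support/DREA"

def backup_folder(src: Path, dest_dir: Path) -> Path:
    # Zip `src` into dest_dir/drea-backup-<timestamp>.zip and return
    # the path of the created archive.
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = shutil.make_archive(
        str(dest_dir / f"drea-backup-{stamp}"), "zip", src
    )
    return Path(archive)
```

Usage would look like `backup_folder(DREA_DATA, Path.home() / "Backups")`, assuming the destination folder exists.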
DREA supports a wide range of formats, including spreadsheets (Excel, CSV) and scanned documents.
Documents are processed via Unstructured.io with automatic format detection, OCR for scanned pages, and table extraction.
With ChromaDB (embedded), DREA comfortably handles thousands of documents. For larger collections (10,000+), you can switch to Qdrant in server mode. The practical limit depends on your available disk space and RAM.
This is macOS Gatekeeper. To open DREA: right-click (or Control-click) the DREA app in your Applications folder, choose "Open" from the menu, then click "Open" in the confirmation dialog.
You only need to do this once. Alternatively, go to System Settings > Privacy & Security and click "Open Anyway".
The helper LLM pipeline (classify + expand + rerank) adds a few seconds to each query in exchange for significantly better results. If speed matters more, you can disable the helper LLM in Settings; DREA then falls back to direct vector search without classification or reranking.
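The speed/quality trade-off amounts to a simple branch: with the helper disabled, the query goes straight to vector search. `run_query`, `helper`, and its methods are invented names for this sketch, not DREA's real configuration API.

```python
def run_query(query, vector_search, helper=None):
    # `helper` bundles the optional expand/rerank stages; when it is
    # None (helper LLM disabled), fall back to direct vector search:
    # faster, but without query expansion or reranking.
    if helper is None:
        return vector_search(query)
    variants = helper.expand(query)
    hits = [h for v in variants for h in vector_search(v)]
    return helper.rerank(query, hits)
```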
Your data is stored as standard files on your Mac. The vector database (ChromaDB) is at ~/Library/Application Support/DREA/data/chromadb/. You can also use the Log Export feature in Settings to create a zip archive of all logs and configuration for troubleshooting.