AnythingLLM by Mintplex Labs is one of the most popular open-source all-in-one AI applications for teams that want ChatGPT-like capabilities without sending data to external servers. It handles the entire RAG pipeline internally: drag-and-drop document ingestion for PDFs, DOCX, TXT, and more, automatic chunking with configurable overlap, vector storage via built-in LanceDB or external providers like Pinecone and Qdrant, and flexible LLM routing across 30+ providers including OpenAI, Anthropic, Ollama, and fully local models.
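To make the "chunking with configurable overlap" step concrete, here is a minimal sketch of overlap-based text chunking. This is an illustration of the general technique, not AnythingLLM's actual chunker, and the function name and default sizes are assumptions for the example:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks; consecutive chunks share `overlap` characters.

    Overlap preserves context across chunk boundaries so that a sentence cut
    in half by one chunk is still intact at the start of the next.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far the window advances each iteration
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

# With chunk_size=4 and overlap=2, each chunk repeats the last 2 characters
# of the previous one:
print(chunk_text("abcdefghij", chunk_size=4, overlap=2))
# → ['abcd', 'cdef', 'efgh', 'ghij', 'ij']
```

Each chunk is then embedded and written to the configured vector store; larger overlaps improve retrieval continuity at the cost of storing more duplicated text.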
The platform ships with built-in AI agents that can browse the web, execute code, and interact with external tools. A Community Hub offers extensions and plugins including custom agent skills and reusable system prompts. Multi-user support with role-based access control, workspace isolation, and white-labeling makes it suitable for team deployments. Native MCP compatibility means AnythingLLM workspaces can be exposed as tools for Claude and other MCP-enabled AI systems.
The desktop app runs entirely offline on Mac, Windows, and Linux with no signup required. For teams, cloud hosting starts at $25/month with 4GB storage, scaling to $99/month for large teams with enterprise features. Self-hosting via Docker is completely free. The full REST API enables programmatic workspace and chat management. With 54,000+ stars on GitHub, AnythingLLM is consistently recommended alongside Open WebUI as one of the top self-hosted AI solutions in Reddit's r/LocalLLaMA community.
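As a rough sketch of what programmatic chat management looks like, the snippet below constructs a request against a locally self-hosted instance. The endpoint path, payload fields, and default port are assumptions based on common REST conventions; consult the instance's built-in API documentation for the authoritative schema:

```python
import json
import urllib.request

# Assumed defaults for a local Docker deployment; adjust to your setup.
BASE_URL = "http://localhost:3001/api/v1"
API_KEY = "YOUR_API_KEY"  # issued from the AnythingLLM admin settings

def build_chat_request(workspace_slug: str, message: str) -> urllib.request.Request:
    """Construct (without sending) a chat request for a given workspace."""
    payload = json.dumps({"message": message, "mode": "chat"}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/workspace/{workspace_slug}/chat",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("docs", "Summarize the onboarding guide.")
print(req.full_url)  # → http://localhost:3001/api/v1/workspace/docs/chat
# urllib.request.urlopen(req) would send it and return the JSON response.
```

Sending the request with `urllib.request.urlopen(req)` runs the workspace's full RAG pipeline server-side, so the response is grounded in whatever documents that workspace has ingested.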