WeKnora is an enterprise-grade, LLM-powered framework developed by Tencent for deep document understanding, semantic retrieval, and context-aware answering built on the retrieval-augmented generation (RAG) paradigm. The platform handles complex document-processing workflows and supports more than ten formats, including PDF, Word, Excel, and images. It integrates with WeCom, Feishu, Slack, and Telegram, letting teams query knowledge bases directly from their communication tools, and it remains provider-agnostic, working with multiple LLM backends.
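The core RAG idea behind the platform can be illustrated with a toy sketch: score stored document chunks against a query, keep the most relevant ones, and assemble them into a grounded prompt for the LLM. This is a minimal illustration only, assuming bag-of-words cosine similarity as a stand-in for the real embedding-based retrieval; none of the function names come from WeKnora's API.

```python
from collections import Counter
import math

def tokenize(text: str) -> list[str]:
    return text.lower().split()

def cosine(a: list[str], b: list[str]) -> float:
    """Cosine similarity over term counts -- a stand-in for vector embeddings."""
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks most similar to the query."""
    q = tokenize(query)
    return sorted(chunks, key=lambda c: cosine(q, tokenize(c)), reverse=True)[:top_k]

def build_prompt(query: str, chunks: list[str]) -> str:
    """Assemble retrieved chunks into a grounded prompt for the LLM."""
    context = "\n".join(f"- {c}" for c in chunks)
    return f"Answer using only this context:\n{context}\nQuestion: {query}"

chunks = [
    "Refunds are issued within 14 days of purchase.",
    "The office is closed on public holidays.",
    "Support tickets are answered within one business day.",
]
top = retrieve("How long do refunds take?", chunks, top_k=1)
print(top[0])  # → Refunds are issued within 14 days of purchase.
```

A production system swaps the similarity function for dense embeddings stored in a vector database, which is where WeKnora's vector-database flexibility comes in.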
The platform operates in two primary modes serving different organizational needs. Quick Q&A mode uses a RAG pipeline to retrieve relevant document chunks and generate contextual answers with minimal latency. Intelligent Reasoning mode employs a ReACT agent engine that orchestrates knowledge retrieval, MCP tools, and web search to solve multi-step reasoning tasks autonomously. Together these modes cover both simple question answering and sophisticated agentic workflows, from customer-support automation to internal knowledge management.
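The ReACT pattern behind Intelligent Reasoning mode interleaves reasoning with tool use: the model proposes an action, the runtime executes the tool, and the observation is fed back until the model emits a final answer. The sketch below is a minimal, self-contained illustration of that loop; the tool functions and the scripted model are hypothetical stand-ins, not WeKnora internals.

```python
# Stand-in tools: in WeKnora these would be knowledge-base retrieval,
# MCP-registered tools, and web search.
def search_kb(query: str) -> str:
    return "WeKnora supports PDF, Word, Excel, and image inputs."

def web_search(query: str) -> str:
    return "RAG combines retrieval with generation."

TOOLS = {"search_kb": search_kb, "web_search": web_search}

def scripted_model(history: list[str]):
    """Stand-in for the LLM: requests one tool call, then answers."""
    if not any(step.startswith("Observation:") for step in history):
        return ("act", "search_kb", "supported formats")
    return ("finish", "Supported formats include PDF, Word, Excel, and images.")

def react_agent(question: str, model, max_steps: int = 5) -> str:
    """Thought -> action -> observation loop, bounded by max_steps."""
    history = [f"Question: {question}"]
    for _ in range(max_steps):
        decision = model(history)
        if decision[0] == "finish":
            return decision[1]
        _, tool, arg = decision
        observation = TOOLS[tool](arg)  # execute the chosen tool
        history.append(f"Observation: {observation}")
    return "No answer within step budget."

print(react_agent("What formats are supported?", scripted_model))
```

The step budget matters in practice: an agent that never converges should fail closed rather than loop, which is why agentic runtimes cap iterations.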
WeKnora prioritizes enterprise requirements: auto-syncing knowledge from multiple sources, customizable agents and knowledge bases, and flexibility in the choice of vector database. It supports multimodal inputs, including speech recognition (ASR) and image understanding, for diverse data types. As an MIT-licensed open-source project, it can be self-hosted or deployed in air-gapped environments, addressing security and compliance concerns in regulated industries. The combination of flexible deployment, comprehensive document handling, and agentic capabilities positions WeKnora as a robust foundation for enterprise knowledge systems.