LLM Deploy
open-webui/open-webui · Python
Self-hosted ChatGPT-style web UI for Ollama, OpenAI-compatible APIs, and local RAG.
GitHub stats
- Stars: 134,809
- Forks: 19,158
- Watchers: 588
- Open issues: 314

meta
- License: NOASSERTION
- Primary language: Python
- Last commit: 2026-04-24
- Stats fetched at: 2026-04-29
Open WebUI is a self-hosted web frontend that talks to Ollama and any OpenAI-compatible endpoint, giving you a ChatGPT-like experience on your own infrastructure. It bundles multi-user auth, conversation history, RAG over uploaded docs, web search, image generation hooks, and MCP/OpenAPI tool calling. A typical install is a single Docker command pointed at your Ollama or vLLM backend, and it runs fine on a single workstation or behind a reverse proxy for a team.
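A minimal sketch of that one-command install, assuming Ollama is already running on the host machine; the host port (3000) and volume name are illustrative choices, not requirements:

```shell
# Run Open WebUI in Docker, pointed at an Ollama server on the Docker host.
# Port mapping (3000 -> container's 8080) and the volume name are arbitrary.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
# Then browse to http://localhost:3000 and create the first (admin) account.
```

To front an OpenAI-compatible backend such as vLLM instead, the same pattern applies with the backend's base URL and API key passed as environment variables.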
Editor's verdict
The default pick if you want a polished ChatGPT-clone UI in front of local models: the feature set is well ahead of LibreChat or Lobe Chat for the Ollama crowd, and RAG, tools, and multi-user support work out of the box. The trade-off is that it's a heavy Python + SvelteKit app with its own database and opinions; if you only need a chat box for one person, ollama-webui-lite or Ollama's CLI is lighter. Also avoid it if you need to deeply customize the agent loop; it's a UI, not an agent framework.