User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
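A TTS server that advertises itself as an OpenAI replacement typically exposes the same `/v1/audio/speech` route, so switching over is often just a base-URL change in the client. A minimal sketch, assuming a server on localhost; the port, API key, model, and voice names here are placeholders, not taken from the project:

```python
# Hypothetical client call to a self-hosted, OpenAI-compatible TTS endpoint.
# base_url, api_key, model, and voice are assumptions; check the server's docs.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:5050/v1", api_key="unused")

response = client.audio.speech.create(
    model="tts-1",   # name the server maps onto its own engine
    voice="alloy",   # available voices depend on the server
    input="Hello from a self-hosted text-to-speech endpoint.",
)
response.write_to_file("speech.mp3")
```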
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
This repository provides resources and guidelines for integrating Open WebUI with Langfuse, so you can monitor and manage AI model usage statistics.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
A Docker Compose setup to run a local ChatGPT-like application using Ollama, Ollama Web UI, Mistral NeMo, and DeepSeek R1.
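A minimal sketch of what such a Compose file can look like; the image tags, ports, and volume names below are assumptions, not taken from the repo:

```yaml
# Hypothetical docker-compose.yml: Ollama plus a web UI on one host.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded models
    ports:
      - "11434:11434"               # Ollama's default API port

  webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # UI served on http://localhost:3000
    depends_on:
      - ollama

volumes:
  ollama-data:
```

The models themselves are then pulled through the running container, e.g. `docker compose exec ollama ollama pull mistral-nemo` and likewise `ollama pull deepseek-r1`.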
LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
Ollama with Let's Encrypt Using Docker Compose
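One common way to wire this up, shown as a sketch only (this repo may use certbot or another ACME client; the domain and email are placeholders), is the nginx-proxy / acme-companion pair, which watches Docker and provisions Let's Encrypt certificates automatically:

```yaml
# Hypothetical sketch: automatic Let's Encrypt certificates
# via nginx-proxy + acme-companion.
services:
  proxy:
    image: nginxproxy/nginx-proxy
    ports:
      - "80:80"      # needed for the ACME HTTP-01 challenge
      - "443:443"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
      - certs:/etc/nginx/certs
      - html:/usr/share/nginx/html

  acme:
    image: nginxproxy/acme-companion
    environment:
      - DEFAULT_EMAIL=admin@example.com          # placeholder
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - certs:/etc/nginx/certs
      - html:/usr/share/nginx/html
      - acme:/etc/acme.sh

  ollama:
    image: ollama/ollama
    environment:
      - VIRTUAL_HOST=ollama.example.com          # placeholder domain
      - VIRTUAL_PORT=11434
      - LETSENCRYPT_HOST=ollama.example.com

volumes:
  certs:
  html:
  acme:
```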
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
An excellent local AI chat client: cross-platform and compatible with any large model that supports the Ollama or OpenAI API. Local deployment protects your data privacy, and it can serve as both an Ollama client and an OpenAI client.
A privacy-first, self-hosted AI chat UI powered by Ollama and open-source models like Mistral, LLaMA 3, and Phi. Fast, lightweight, and simple to deploy (Docker-ready), with no GPU or complex setup required. Spin up your own ChatGPT alternative in minutes with a beautiful TailwindCSS UI and Go Fiber backend. Supports real-time streaming.
Simple web UI for Ollama
AI model deployment on Synology NAS and macOS 🧠🐳
This Docker Compose setup provides an isolated application with Ollama, Open WebUI, and an Nginx reverse proxy to enable secure HTTPS access. Since Open WebUI does not support SSL natively, Nginx acts as a reverse proxy and handles SSL termination.
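A rough sketch of that layout (image names, cert paths, and the internal port are assumptions); only Nginx publishes a port, so the UI is reachable solely over HTTPS:

```yaml
# Hypothetical compose file: Nginx terminates TLS, Open WebUI stays internal.
services:
  ollama:
    image: ollama/ollama
    # no published ports: reachable only on the internal Docker network

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    # listens on 8080 inside the network; Nginx proxies to it

  nginx:
    image: nginx:alpine
    ports:
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
      - ./certs:/etc/nginx/certs:ro   # e.g. fullchain.pem / privkey.pem
    depends_on:
      - open-webui
```

The mounted nginx.conf would then hold a standard `server` block: `listen 443 ssl;`, the certificate paths, and `proxy_pass http://open-webui:8080;`, plus the `Upgrade`/`Connection` headers that Open WebUI's websocket connections need.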
Web Client For Ollama - Llama LLM
A minimal interface in pure HTML/CSS for talking with Ollama, focused on keeping the code readable.