locAI is a Docker Compose project that lets you easily deploy a local AI stack using Open WebUI, Ollama and Caddy.
# LocAI - Open WebUI Deployment
A containerized deployment of Open WebUI with Ollama, featuring a complete stack for running local AI models.
## Features
- Open WebUI: Web interface for interacting with AI models
- Ollama: Backend for running language models locally
- PostgreSQL: Database for persistent storage
- Redis: In-memory data structure store for WebSocket support
- Caddy: Web server for HTTPS and reverse proxy
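
These services are wired together in `docker-compose.yml`. The excerpt below is a simplified sketch of how a stack like this is typically composed; image tags, internal ports, credentials and environment variables are illustrative, so refer to the actual `docker-compose.yml` in this repository for the authoritative definitions.

```yaml
# Illustrative sketch only - not the repository's actual docker-compose.yml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG:-main}
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # point the UI at the Ollama service
      - DATABASE_URL=postgresql://webui:webui@postgres:5432/webui   # placeholder credentials
    depends_on: [ollama, postgres, redis]
    volumes:
      - ./open-webui:/app/backend/data

  ollama:
    image: ollama/ollama
    volumes:
      - ./ollama:/root/.ollama                # models and Ollama configuration

  postgres:
    image: postgres:16                        # database credentials/env omitted for brevity
    volumes:
      - ./postgres/data:/var/lib/postgresql/data

  redis:
    image: redis:alpine

  caddy:
    image: caddy:latest
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./caddy/Caddyfile:/etc/caddy/Caddyfile
```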
## Quick Start
1. Clone the repository

   ```bash
   git clone https://github.com/yourusername/open-webui.git
   cd open-webui
   ```

2. Set up environment variables

   ```bash
   cp .env.example .env
   # Edit .env with your settings
   # Generate a secret key with: openssl rand -hex 32
   ```

3. Start the services

   ```bash
   docker compose up -d
   ```

4. Access the UI

   - Open your browser and navigate to `https://localhost`
   - For a custom domain setup, update the `DOMAIN` variable in your `.env` file
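
After the stack is up, you will typically want to pull at least one model. This can be done from the Open WebUI admin settings, or directly through the Ollama container as sketched below (the service name `ollama` and the model name are assumptions; adjust them to your `docker-compose.yml` and preferred model):

```bash
# Pull a model inside the running Ollama container (service name assumed to be "ollama")
docker compose exec ollama ollama pull llama3.2

# Confirm the model is available locally
docker compose exec ollama ollama list
```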
## Configuration
- GPU Support: Set `WEBUI_DOCKER_TAG=cuda` in your `.env` file for NVIDIA GPU support
  - Note: NVIDIA GPU acceleration requires nvidia-container-toolkit to be installed on the host system
- Models: Place models in `~/.ollama/models` or configure `OLLAMA_MODELS_PATH` in `.env`
- Persistence: Data is stored in subdirectories:
  - `./open-webui`: Open WebUI data
  - `./postgres/data`: PostgreSQL database files
  - `./ollama`: Ollama configuration and models
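
For reference, a `.env` along these lines would cover the variables mentioned above. The values are placeholders, and the exact variable names (the secret key in particular) should be taken from the repository's `.env.example`:

```bash
# Example values only - copy .env.example and adjust rather than using these verbatim
DOMAIN=localhost                      # site address served by Caddy
WEBUI_DOCKER_TAG=main                 # set to "cuda" for NVIDIA GPU support
OLLAMA_MODELS_PATH=~/.ollama/models   # host directory for downloaded models
WEBUI_SECRET_KEY=replace-with-output-of-openssl-rand-hex-32   # variable name assumed; see .env.example
```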
## Development
To customize the deployment:
- Modify the `docker-compose.yml` to add or adjust services
- Update the `Caddyfile` to change reverse proxy settings (see the sketch below)
- Contribute to the Open WebUI project on GitHub: open-webui
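
As a rough illustration of the reverse proxy settings, a minimal `Caddyfile` for a stack like this might look as follows; it assumes the Open WebUI service is named `open-webui` and listens on its default port 8080, and uses Caddy's internal CA for the self-signed local certificate. The repository's actual `Caddyfile` may differ:

```
# Illustrative sketch only
{$DOMAIN:localhost} {
    tls internal                     # self-signed certificate from Caddy's internal CA
    reverse_proxy open-webui:8080    # forward traffic to the Open WebUI container
}
```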
## Troubleshooting
- Container Startup Issues: Check logs with `docker compose logs -f`
- Model Download Failures: Ensure proper network connectivity and sufficient disk space
- GPU Access Problems: Verify that nvidia-container-toolkit is properly installed and configured
- Connection Issues: Make sure all required ports are open and not blocked by firewalls
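
A few commands that often help when working through the issues above; the service names (`open-webui`, `ollama`) are assumptions and should match those in your `docker-compose.yml`:

```bash
# Show the state of every service in the stack
docker compose ps

# Follow logs for a single service instead of the whole stack
docker compose logs -f open-webui

# Check that the GPU is visible inside the Ollama container (requires nvidia-container-toolkit)
docker compose exec ollama nvidia-smi

# Check available disk space for model downloads
df -h
```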
## Security Notes
- This deployment includes self-signed certificates for local usage
- For production use, set up proper certificates or use a trusted certificate provider
- The default setup is intended for local network usage; additional security measures are needed for public-facing deployments
## License
This deployment configuration is provided under the MIT License. Open WebUI and other components have their own respective licenses.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.