Sunday, February 15, 2026

Run Ollama and Open WebUI in a single Docker container and make both accessible from outside the host

docker run -d \
  --name open-webui \
  --gpus all \
  --restart always \
  --pull always \
  -p 11434:11434 \
  -p 3000:8080 \
  -e OLLAMA_HOST=0.0.0.0 \
  -e OLLAMA_FLASH_ATTENTION=1 \
  -e OLLAMA_KEEP_ALIVE=-1 \
  -e OLLAMA_KV_CACHE_TYPE=q4_0 \
  -e OLLAMA_BASE_URLS=http://0.0.0.0:11434 \
  -v /data/ollama/.ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:ollama
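For reference, the same setup can be sketched as a docker-compose.yml file. This is an assumed equivalent of the docker run command above, not something from the original post; the top-level volume declaration and the GPU `deploy` block follow standard Compose syntax for the NVIDIA runtime.

```yaml
# Hypothetical docker-compose equivalent of the docker run command above.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:ollama
    container_name: open-webui
    restart: always
    pull_policy: always
    ports:
      - "11434:11434"   # Ollama API, reachable from outside the host
      - "3000:8080"     # Open WebUI
    environment:
      - OLLAMA_HOST=0.0.0.0          # bind Ollama to all interfaces
      - OLLAMA_FLASH_ATTENTION=1
      - OLLAMA_KEEP_ALIVE=-1         # keep models loaded indefinitely
      - OLLAMA_KV_CACHE_TYPE=q4_0    # quantized KV cache to save VRAM
      - OLLAMA_BASE_URLS=http://0.0.0.0:11434
    volumes:
      - /data/ollama/.ollama:/root/.ollama   # model storage on the host
      - open-webui:/app/backend/data         # Open WebUI app data
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]

volumes:
  open-webui:
```

Start it with `docker compose up -d`; the web UI is then served on port 3000 and the Ollama API on port 11434.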

