Decoder.sh

Use Your Self-Hosted LLM Anywhere with Ollama Web UI

Description

Take your self-hosted Ollama models to the next level with Ollama Web UI, which adds a polished interface plus features like chat history, voice input, and user management. We'll also explore how to access this interface, and the models that power it, from your phone using ngrok.

Code

# Run the Open WebUI container in the background.
# The volume keeps your chats and settings across restarts, and the
# extra host entry lets the container reach Ollama running on the host.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
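
# If you prefer Docker Compose, the same container can be expressed as a
# docker-compose.yml. This is a sketch of the equivalent configuration
# (not covered in the video), started with `docker compose up -d`:
#
# services:
#   open-webui:
#     image: ghcr.io/open-webui/open-webui:main
#     ports:
#       - "3000:8080"
#     extra_hosts:
#       - "host.docker.internal:host-gateway"
#     volumes:
#       - open-webui:/app/backend/data
#     restart: always
#
# volumes:
#   open-webui: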
# ngrok setup (Homebrew install is macOS only; other platforms have
# their own installers on ngrok's download page)
brew install ngrok/ngrok/ngrok

# Authenticate with the token from your ngrok dashboard
ngrok config add-authtoken [YOUR-TOKEN-HERE]

# Tunnel the local Web UI to a public ngrok URL you can open on your phone
ngrok http http://localhost:3000
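
# The ngrok URL above is publicly reachable, so anyone who finds it can
# hit your Web UI's login page. One way to gate the tunnel itself is
# ngrok's built-in HTTP basic auth; a sketch, where the username and
# password below are placeholders you should replace:

ngrok http --basic-auth "user:a-strong-password" http://localhost:3000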