Run Local LLMs with Ollama and Open WebUI on Docker (GPU-Ready Guide for Ubuntu 22.04/24.04)
September 01, 2025