How to Run Local LLMs on Ubuntu with Ollama and Open WebUI (GPU-Accelerated)
August 25, 2025
Tags: Docker, Linux, local LLMs, NVIDIA, Ollama, Open WebUI, self-hosted AI, Ubuntu