To manage containers without `sudo`, your user must be in the `docker` group. If you skip this step, you will need to prefix every Docker command with `sudo`.
Open a new terminal and test Docker access. In the terminal, run:
```bash
docker ps
```
If you see a permission denied error (something like `permission denied while trying to connect to the Docker daemon socket`), add your user to the docker group:
```bash
sudo usermod -aG docker $USER
```
> **Warning**: After running usermod, you must log out and log back in to start a new
> session with updated group permissions.
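Once you have started a fresh session, you can confirm the group change took effect before retrying `docker ps`:

```shell
# Group membership only appears in sessions started after usermod ran
groups | grep -q docker && echo "docker group active" || echo "log out and back in first"
```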
## Step 2. Verify Docker setup and pull container
This step confirms Docker is working and downloads the Open WebUI container image with integrated Ollama. The download runs on your DGX Spark and may take several minutes depending on network speed.
Open a new terminal and pull the image:
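```shell
docker pull ghcr.io/open-webui/open-webui:ollama
```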
## Step 3. Start the Open WebUI container
Next, start the container. Once it is running, Open WebUI is accessible at `http://localhost:8080` from your local web browser.
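The exact run command is not shown in this section. A sketch consistent with the container name, port, image, and volume names used elsewhere in this guide (the `--restart always` policy and the mount paths inside the container are assumptions; adjust them if your setup differs):

```shell
docker run -d \
  --gpus=all \
  -p 8080:8080 \
  -v open-webui:/app/backend/data \
  -v open-webui-ollama:/root/.ollama \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama
```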
Application data will be stored in the `open-webui` volume and model data will be stored in the `open-webui-ollama` volume.
## Step 4. Create administrator account
This step sets up the initial administrator account for Open WebUI. This is a local account that you will use to access the Open WebUI interface.
In the Open WebUI interface, click the "Get Started" button at the bottom of the screen.
Fill out the administrator account creation form with your preferred credentials.
Click the registration button to create your account and access the main interface.
## Step 5. Download and configure a model
This step downloads a language model through Ollama and configures it for use in
Open WebUI. The download happens on your DGX Spark device and may take several minutes.
Click on the "Select a model" dropdown in the top left corner of the Open WebUI interface.
Type `gpt-oss:20b` in the search field.
Click the "Pull 'gpt-oss:20b' from Ollama.com" button that appears.
Wait for the model download to complete. You can monitor progress in the interface.
Once complete, select "gpt-oss:20b" from the model dropdown.
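As an alternative to the search dropdown, the model can usually be pulled from the command line through the bundled Ollama. This assumes the container is named `open-webui` as in the earlier steps and that the `ollama` binary is on the container's PATH:

```shell
# Pull the model inside the running Open WebUI container
docker exec -it open-webui ollama pull gpt-oss:20b
```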
## Step 6. Test the model
This step verifies that the complete setup is working properly by testing model
inference through the web interface.
In the chat textarea at the bottom of the Open WebUI interface, enter:
```
Write me a haiku about GPUs
```
Press Enter to send the message and wait for the model's response.
## Step 7. Troubleshooting
Common issues and their solutions.
| Symptom | Cause | Fix |
|---------|-------|-----|
| Permission denied on `docker ps` | User not in the docker group | Complete Step 1, including logging out and back in, or prefix commands with `sudo` |
| Model download fails | Network connectivity issues | Check internet connection, retry download |
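If a problem isn't covered above, the container logs are often the quickest diagnostic (assuming the container name `open-webui` used in this guide):

```shell
# Show the most recent log output from the Open WebUI container
docker logs --tail 50 open-webui
```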
## Step 8. Clean up
> **Warning**: These commands will permanently delete all Open WebUI data and downloaded models.
Stop and remove the Open WebUI container:
```bash
docker stop open-webui
docker rm open-webui
```
Remove the downloaded images:
```bash
docker rmi ghcr.io/open-webui/open-webui:ollama
```
Remove persistent data volumes:
```bash
docker volume rm open-webui open-webui-ollama
```
To roll back the permission change from Step 1, remove your user from the docker group: `sudo deluser $USER docker`
## Step 9. Next steps
Try downloading different models from the Ollama library at https://ollama.com/library.
You can monitor GPU and memory usage through the DGX Dashboard available in NVIDIA Sync as you try different models.
If Open WebUI reports an update is available, you can update the container image by running:
```bash
docker pull ghcr.io/open-webui/open-webui:ollama
```
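Note that pulling a newer image does not update an already-running container; you need to recreate it. A sketch that reuses the settings from the earlier start step (data in the named volumes is preserved; the run flags are assumptions if your setup differs):

```shell
# Stop and remove the old container, then start one from the new image
docker stop open-webui
docker rm open-webui
docker run -d --gpus=all -p 8080:8080 \
  -v open-webui:/app/backend/data \
  -v open-webui-ollama:/root/.ollama \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:ollama
```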
## Setup Open WebUI on Remote Spark with NVIDIA Sync
> **Note**: If you haven't already installed NVIDIA Sync, [learn how here.](/spark/connect-to-your-spark/sync)
## Step 1. Configure Docker permissions
To manage containers through NVIDIA Sync, you must be able to run Docker commands without `sudo`.
Open the Terminal app from NVIDIA Sync to start an interactive SSH session and test Docker access. In the terminal, run:
```bash
docker ps
```
If you see a permission denied error (something like `permission denied while trying to connect to the Docker daemon socket`), add your user to the docker group:
```bash
sudo usermod -aG docker $USER
```
> **Warning**: After running usermod, you must close the terminal window completely to start a new
> session with updated group permissions.
## Step 2. Verify Docker setup and pull container
This step confirms Docker is working properly and downloads the Open WebUI container
image. This runs on the DGX Spark device and may take several minutes depending on network speed.
Open a new Terminal app from NVIDIA Sync and pull the Open WebUI container image with integrated Ollama:
```bash
docker pull ghcr.io/open-webui/open-webui:ollama
```
Once the container image is downloaded, continue on to set up NVIDIA Sync.
## Step 3. Open NVIDIA Sync Settings
Click on the NVIDIA Sync icon in your system tray or taskbar to open the main application window.
Click the gear icon in the top right corner to open the Settings window.
Click on the "Custom" tab to access Custom Ports configuration.