chore: Regenerate all playbooks

This commit is contained in:
GitLab CI 2025-11-08 04:59:56 +00:00
parent 3fa1feaeda
commit 0cb12fc751
4 changed files with 7 additions and 7 deletions

@@ -74,7 +74,7 @@ The other goes over an example of machine learning algorithms including UMAP and
```
If you are remotely accessing your DGX-Spark, make sure to forward the necessary port so you can access the notebook in your local browser. Use the instruction below for port forwarding:
```bash
ssh -N -L YYYY:localhost:XXXX username@remote_host
```
- `YYYY`: The local port you want to use (e.g. 8888)
- `XXXX`: The port you specified when starting Jupyter Notebook on the remote machine (e.g. 8888)
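As a concrete sketch (the ports and login are placeholders, not values from the playbook), the command can be assembled from the two values above:

```bash
# Build the port-forwarding command from the two placeholder values.
LOCAL_PORT=8888              # YYYY: the port on your local machine
REMOTE_PORT=8888             # XXXX: the port Jupyter uses on the remote machine
REMOTE=username@remote_host  # placeholder login, as above
echo "ssh -N -L ${LOCAL_PORT}:localhost:${REMOTE_PORT} ${REMOTE}"
```

With both ports set to 8888, this prints the exact command to run: `ssh -N -L 8888:localhost:8888 username@remote_host`.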

@@ -46,7 +46,7 @@ The setup includes:
* **Risks**:
* Docker permission issues may require user group changes and session restart
* The recipe would require hyperparameter tuning and a high-quality dataset for the best results
* **Rollback**: Stop and remove Docker containers, delete downloaded models if needed.
## Instructions

@@ -64,7 +64,7 @@ All required assets can be found [here on GitHub](https://github.com/NVIDIA/dgx-
* **Risks:**
* Package dependency conflicts in Python environment
* Performance validation may require architecture-specific optimizations
* **Rollback:** Container environments provide isolation; remove containers and restart to reset state.
## Instructions

@@ -113,10 +113,10 @@ Ollama server running on port 11434. This configuration runs on your local machi
1. Click the "Add New" button
2. Fill out the form with these values:
   - **Name**: `Ollama Server`
   - **Port**: `11434`
   - **Auto open in browser**: Leave unchecked (this is an API, not a web interface)
   - **Start Script**: Leave empty
3. Click "Add"
The new Ollama Server entry should now appear in your NVIDIA Sync custom apps list.
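To confirm the entry points at a live server, a quick probe of the Ollama REST API can help. This is a sketch, assuming Ollama is running locally on its default port; `/api/version` is a standard Ollama endpoint:

```bash
# Probe the Ollama API on its default port; prints a status line either way.
OLLAMA_PORT=11434
if curl -sf "http://localhost:${OLLAMA_PORT}/api/version" >/dev/null 2>&1; then
  echo "Ollama server reachable on port ${OLLAMA_PORT}"
else
  echo "no response on port ${OLLAMA_PORT}"
fi
```

If the second message appears, start the server (e.g. `ollama serve`) before relying on the NVIDIA Sync entry.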