dgx-spark-playbooks/community/litguard/config.yaml
prashantkul 78213ac8a8 Add LitGuard playbook: prompt injection detection on DGX Spark
LitServe-based prompt injection detection server with a React monitoring
dashboard. Serves HuggingFace classification models behind an
OpenAI-compatible API with real-time metrics and GPU acceleration.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-10 21:36:27 -07:00

models:
  - name: deberta-injection
    hf_model: deepset/deberta-v3-base-injection
    device: cuda:0
    batch_size: 32
  - name: protectai-injection
    hf_model: protectai/deberta-v3-base-prompt-injection-v2
    device: cuda:0
    batch_size: 32
port: 8234
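
A minimal sketch of how each `models` entry might map onto a HuggingFace `transformers.pipeline` call for classification. This is an assumption about the playbook's internals (the function name `pipeline_kwargs` and the inlined config are illustrative, not taken from the repo); it only shows how the fields line up with standard pipeline parameters.

```python
# Hypothetical mapping from a config.yaml model spec to the keyword
# arguments of transformers.pipeline. The spec dicts below mirror the
# YAML above; the helper name is an assumption, not the playbook's API.

MODELS = [
    {"name": "deberta-injection",
     "hf_model": "deepset/deberta-v3-base-injection",
     "device": "cuda:0",
     "batch_size": 32},
    {"name": "protectai-injection",
     "hf_model": "protectai/deberta-v3-base-prompt-injection-v2",
     "device": "cuda:0",
     "batch_size": 32},
]


def pipeline_kwargs(spec: dict) -> dict:
    """Translate one config entry into transformers.pipeline kwargs.

    transformers.pipeline accepts task, model (a Hub id), device
    (e.g. "cuda:0"), and batch_size, so the config fields map directly.
    """
    return {
        "task": "text-classification",
        "model": spec["hf_model"],
        "device": spec["device"],
        "batch_size": spec["batch_size"],
    }


for spec in MODELS:
    print(spec["name"], "->", pipeline_kwargs(spec)["model"])
```

In actual use, the server would presumably call `transformers.pipeline(**pipeline_kwargs(spec))` once per entry at startup and route incoming requests to the named classifier.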