docs: add LiteLLM + AWS Bedrock integration guide

Add comprehensive documentation for integrating AWS Bedrock Claude models via LiteLLM proxy. Enables enterprise users to leverage existing AWS contracts while maintaining OpenAI-compatible API interface.

Changes:
- README.md: Add quickstart section for LiteLLM + Bedrock setup
- docs/wiki/LiteLLM-Bedrock-Setup.md: New comprehensive guide covering single/multi-account setup, Claude Code integration, troubleshooting, and auto-start configuration
- docs/wiki/AI-Providers.md: Update OpenAI-compatible section to reference LiteLLM
- docs/wiki/Home.md: Add LiteLLM guide to navigation and feature list

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Shaun Arman 2026-03-31 12:13:26 -05:00
parent d489338bc4
commit 2f2becd4f2
4 changed files with 438 additions and 3 deletions


@@ -129,6 +129,7 @@ Launch the app and go to **Settings → AI Providers** to add a provider:
| Mistral | `https://api.mistral.ai/v1` | Requires API key |
| Ollama (local) | `http://localhost:11434` | No key needed — fully offline |
| Azure OpenAI | `https://<resource>.openai.azure.com/openai/deployments/<deployment>` | Requires API key |
| **AWS Bedrock (via LiteLLM)** | `http://localhost:8000/v1` | See [LiteLLM + AWS Bedrock](#litellm--aws-bedrock-setup) below |

For offline use, install [Ollama](https://ollama.com) and pull a model:
```bash
@@ -138,6 +139,41 @@ ollama pull llama3.1:8b # Better quality (≥16 GB RAM)
Or use **Settings → Ollama** to pull models directly from within the app.
### LiteLLM + AWS Bedrock Setup
To use Claude via AWS Bedrock (ideal for enterprise environments with existing AWS contracts):
1. **Install LiteLLM:**

   ```bash
   pip install 'litellm[proxy]'
   ```

2. **Create a config file** at `~/.litellm/config.yaml`:

   ```yaml
   model_list:
     - model_name: bedrock-claude
       litellm_params:
         model: bedrock/us.anthropic.claude-sonnet-4-6
         aws_region_name: us-east-1
         # Optionally specify aws_profile_name if not using the default

   general_settings:
     master_key: sk-your-secure-key  # Any value for API auth
   ```

3. **Start the LiteLLM proxy:**

   ```bash
   nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 &
   ```

4. **Configure in TFTSR:**
   - Provider: **OpenAI** (OpenAI-compatible)
   - Base URL: `http://localhost:8000/v1`
   - API Key: `sk-your-secure-key` (from the config)
   - Model: `bedrock-claude`
For detailed setup including multiple AWS accounts and Claude Code integration, see the [LiteLLM + Bedrock wiki page](https://gogs.tftsr.com/sarman/tftsr-devops_investigation/wiki/LiteLLM-Bedrock-Setup).

---
## Triage Workflow


@@ -19,7 +19,7 @@ pub trait Provider {
### 1. OpenAI-Compatible
-Covers: OpenAI, Azure OpenAI, LM Studio, vLLM, and any OpenAI-API-compatible endpoint.
+Covers: OpenAI, Azure OpenAI, LM Studio, vLLM, **LiteLLM (AWS Bedrock)**, and any OpenAI-API-compatible endpoint.
| Field | Value |
|-------|-------|
@@ -30,7 +30,9 @@ Covers: OpenAI, Azure OpenAI, LM Studio, vLLM, and any OpenAI-API-compatible end
**Models:** `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
-**Custom endpoint:** Set `config.base_url` to any OpenAI-compatible API (e.g., LM Studio at `http://localhost:1234/v1`).
+**Custom endpoint:** Set `config.base_url` to any OpenAI-compatible API:
+- LM Studio: `http://localhost:1234/v1`
+- **LiteLLM (AWS Bedrock):** `http://localhost:8000/v1` — see [LiteLLM + Bedrock Setup](LiteLLM-Bedrock-Setup) for the full configuration guide
---


@@ -12,6 +12,7 @@
| [Development Setup](wiki/Development-Setup) | Prerequisites, commands, environment |
| [Database](wiki/Database) | Schema, migrations, encryption |
| [AI Providers](wiki/AI-Providers) | Supported providers and configuration |
| [LiteLLM + Bedrock Setup](wiki/LiteLLM-Bedrock-Setup) | AWS Bedrock integration via LiteLLM proxy |
| [PII Detection](wiki/PII-Detection) | Patterns, redaction flow, security |
| [IPC Commands](wiki/IPC-Commands) | Full list of Tauri backend commands |
| [CI/CD Pipeline](wiki/CICD-Pipeline) | Gitea Actions setup, multi-platform builds, act_runner config |
@@ -23,7 +24,7 @@
- **5-Whys AI Triage** — Interactive guided root cause analysis via multi-turn AI chat
- **PII Auto-Redaction** — Detects and redacts sensitive data before any AI send
-- **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, local Ollama (fully offline)
+- **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, AWS Bedrock (via LiteLLM), local Ollama (fully offline)
- **SQLCipher AES-256** — All issue history encrypted at rest
- **RCA + Post-Mortem Generation** — Auto-populated Markdown templates, exportable as MD/PDF
- **Ollama Management** — Hardware detection, model recommendations, in-app model management


@@ -0,0 +1,396 @@
# LiteLLM + AWS Bedrock Setup
This guide covers how to use **Claude via AWS Bedrock** with TFTSR through the LiteLLM proxy, providing an OpenAI-compatible API gateway.
## Why LiteLLM + Bedrock?
- **Enterprise AWS contracts** — Use existing AWS Bedrock credits instead of direct Anthropic API
- **Multiple AWS accounts** — Run personal and business Bedrock accounts simultaneously
- **OpenAI-compatible API** — Works with any tool expecting OpenAI's API format
- **Claude Code integration** — Reuse the same AWS credentials used by Claude Code CLI
---
## Prerequisites
- **AWS account** with Bedrock access and Claude models enabled
- **AWS credentials** configured (either default profile or named profile)
- **Python 3.8+** for LiteLLM installation
---
## Installation
### 1. Install LiteLLM
```bash
pip install 'litellm[proxy]'
```
Verify installation:
```bash
litellm --version
```
---
## Basic Setup — Single AWS Account
### 1. Create Configuration File
Create `~/.litellm/config.yaml`:
```yaml
model_list:
  - model_name: bedrock-claude
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-6
      aws_region_name: us-east-1
      # Uses default AWS credentials from ~/.aws/credentials

general_settings:
  master_key: sk-1234  # Any value — used for API authentication
```
### 2. Start LiteLLM Proxy
```bash
# Run in background
nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 &
# Verify it's running
ps aux | grep litellm
```
### 3. Test the Connection
```bash
curl http://localhost:8000/v1/chat/completions \
-H "Authorization: Bearer sk-1234" \
-H "Content-Type: application/json" \
-d '{
"model": "bedrock-claude",
"messages": [{"role": "user", "content": "Hello"}],
"max_tokens": 50
}'
```
Expected response:
```json
{
  "id": "chatcmpl-...",
  "model": "bedrock-claude",
  "choices": [{
    "message": {
      "role": "assistant",
      "content": "Hello! How can I help you today?"
    }
  }]
}
```
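The same request can be issued from Python's standard library. A minimal sketch, assuming the proxy is running on `localhost:8000` with `master_key` `sk-1234`; uncomment the last two lines to actually send it:

```python
import json
import urllib.request

# Build the same request the curl command above sends.
def build_request(model: str, prompt: str, max_tokens: int = 50) -> urllib.request.Request:
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        "http://localhost:8000/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer sk-1234",
            "Content-Type": "application/json",
        },
    )

req = build_request("bedrock-claude", "Hello")
# with urllib.request.urlopen(req) as resp:                      # requires the proxy to be up
#     print(json.load(resp)["choices"][0]["message"]["content"])
```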
### 4. Configure TFTSR
In **Settings → AI Providers → Add Provider**:
| Field | Value |
|-------|-------|
| Provider Name | OpenAI |
| Base URL | `http://localhost:8000/v1` |
| API Key | `sk-1234` (or your master_key from config) |
| Model | `bedrock-claude` |
| Display Name | `Bedrock Claude` |
---
## Advanced Setup — Multiple AWS Accounts
If you have **personal** and **business** Bedrock accounts, you can run both through the same LiteLLM instance.
### 1. Configure AWS Profiles
Ensure you have AWS profiles set up in `~/.aws/credentials`:
```ini
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

[ClaudeCodeLP]
aws_access_key_id = AKIA...
aws_secret_access_key = ...
```
Or, if you use a credential process (as Claude Code does):
```ini
# ~/.aws/config
[profile ClaudeCodeLP]
credential_process = /Users/${USER}/claude-code-with-bedrock/credential-process --profile ClaudeCodeLP
region = us-east-1
```
### 2. Update Configuration File
Edit `~/.litellm/config.yaml`:
```yaml
model_list:
  - model_name: bedrock-personal
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-6
      aws_region_name: us-east-1
      # Uses default AWS credentials
  - model_name: bedrock-business
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0
      aws_region_name: us-east-1
      aws_profile_name: ClaudeCodeLP  # Named profile for business account

general_settings:
  master_key: sk-1234
```
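Before restarting, you can sanity-check the layout above. A quick sketch using a plain dict that mirrors the YAML, so it runs without PyYAML installed:

```python
# Mirrors ~/.litellm/config.yaml as a plain dict for a quick structural check.
config = {
    "model_list": [
        {"model_name": "bedrock-personal",
         "litellm_params": {"model": "bedrock/us.anthropic.claude-sonnet-4-6",
                            "aws_region_name": "us-east-1"}},
        {"model_name": "bedrock-business",
         "litellm_params": {"model": "bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0",
                            "aws_region_name": "us-east-1",
                            "aws_profile_name": "ClaudeCodeLP"}},
    ],
    "general_settings": {"master_key": "sk-1234"},
}

names = [entry["model_name"] for entry in config["model_list"]]
assert len(names) == len(set(names)), "model_name values must be unique"
for entry in config["model_list"]:
    params = entry["litellm_params"]
    assert params["model"].startswith("bedrock/"), "expected a bedrock/ model route"
    assert "aws_region_name" in params
print(names)
```

LiteLLM routes requests by `model_name`, so duplicate names would silently shadow each other; that is the main thing worth checking by hand.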
### 3. Restart LiteLLM
```bash
# Find and stop existing process
pkill -f "litellm --config"
# Start with new config
nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 &
```
### 4. Verify Both Models
```bash
# List available models
curl -s http://localhost:8000/v1/models \
-H "Authorization: Bearer sk-1234" | python3 -m json.tool
# Test personal account
curl -s http://localhost:8000/v1/chat/completions \
-H "Authorization: Bearer sk-1234" \
-H "Content-Type: application/json" \
-d '{"model": "bedrock-personal", "messages": [{"role": "user", "content": "test"}]}'
# Test business account
curl -s http://localhost:8000/v1/chat/completions \
-H "Authorization: Bearer sk-1234" \
-H "Content-Type: application/json" \
-d '{"model": "bedrock-business", "messages": [{"role": "user", "content": "test"}]}'
```
### 5. Configure in TFTSR
Add both models as separate providers:

**Provider 1 — Personal:**

- Provider: OpenAI
- Base URL: `http://localhost:8000/v1`
- API Key: `sk-1234`
- Model: `bedrock-personal`
- Display Name: `Bedrock (Personal)`

**Provider 2 — Business:**

- Provider: OpenAI
- Base URL: `http://localhost:8000/v1`
- API Key: `sk-1234`
- Model: `bedrock-business`
- Display Name: `Bedrock (Business)`
---
## Claude Code Integration
If you're using [Claude Code](https://claude.ai/claude-code) with AWS Bedrock, you can reuse the same AWS credentials.
### 1. Check Claude Code Settings
Read your Claude Code configuration:
```bash
cat ~/.claude/settings.json
```
Look for:
- `AWS_PROFILE` environment variable (e.g., `ClaudeCodeLP`)
- `awsAuthRefresh` credential process path
- `AWS_REGION` setting
### 2. Use Same Profile in LiteLLM
In `~/.litellm/config.yaml`, add a model using the same profile:
```yaml
model_list:
  - model_name: claude-code-bedrock
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0
      aws_region_name: us-east-1
      aws_profile_name: ClaudeCodeLP  # Same profile as Claude Code
```
Now both Claude Code and TFTSR use the same Bedrock account without duplicate credential management.

---
## Available Claude Models on Bedrock
| Model ID | Name | Context | Best For |
|----------|------|---------|----------|
| `us.anthropic.claude-sonnet-4-6` | Claude Sonnet 4.6 | 200k tokens | Most tasks, best quality |
| `us.anthropic.claude-sonnet-4-5-20250929-v1:0` | Claude Sonnet 4.5 | 200k tokens | High performance |
| `us.anthropic.claude-opus-4-6` | Claude Opus 4.6 | 200k tokens | Complex reasoning |
| `us.anthropic.claude-haiku-4-5-20251001` | Claude Haiku 4.5 | 200k tokens | Speed + cost optimization |
Check your AWS Bedrock console for available models in your region.

---
## Troubleshooting
### Port Already in Use
If port 8000 is occupied:
```bash
# Find what's using the port
lsof -i :8000
# Use a different port
litellm --config ~/.litellm/config.yaml --port 8080
```
Update the Base URL in TFTSR to match: `http://localhost:8080/v1`
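If you'd rather script the check, here is a small stdlib sketch that reports whether a local port is already taken (a hypothetical helper, not part of LiteLLM):

```python
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True when nothing is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 only when something accepted the connection
        return s.connect_ex((host, port)) != 0

for candidate in (8000, 8080):
    print(candidate, "free" if port_is_free(candidate) else "in use")
```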
### AWS Credentials Not Found
```bash
# Verify AWS CLI works
aws bedrock list-foundation-models --region us-east-1
# Test specific profile
aws bedrock list-foundation-models --region us-east-1 --profile ClaudeCodeLP
```
If the AWS CLI fails, fix your credentials before debugging LiteLLM.
### Model Not Available
Error: `Model not found` or `Access denied`
1. Check [AWS Bedrock Console](https://console.aws.amazon.com/bedrock/) → Model Access
2. Request access to Claude models if not enabled
3. Verify model ID matches exactly (case-sensitive)
### LiteLLM Won't Start
Check logs:
```bash
cat ~/.litellm/litellm.log
```
Common issues:
- Missing Python dependencies: `pip install 'litellm[proxy]' --upgrade`
- YAML syntax error: Validate with `python3 -c "import yaml; yaml.safe_load(open('${HOME}/.litellm/config.yaml'))"`
---
## Auto-Start LiteLLM on Boot
### macOS — LaunchAgent
Create `~/Library/LaunchAgents/com.litellm.proxy.plist`:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.litellm.proxy</string>
  <key>ProgramArguments</key>
  <array>
    <string>/opt/homebrew/bin/litellm</string>
    <string>--config</string>
    <string>/Users/${USER}/.litellm/config.yaml</string>
    <string>--port</string>
    <string>8000</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
  <key>StandardOutPath</key>
  <string>/Users/${USER}/.litellm/litellm.log</string>
  <key>StandardErrorPath</key>
  <string>/Users/${USER}/.litellm/litellm-error.log</string>
</dict>
</plist>
```
Replace `${USER}` with your actual username (launchd does not perform shell expansion in plists), then load it:
```bash
launchctl load ~/Library/LaunchAgents/com.litellm.proxy.plist
```
### Linux — systemd
Create `/etc/systemd/system/litellm.service`:
```ini
[Unit]
Description=LiteLLM Proxy
After=network.target

[Service]
Type=simple
User=${USER}
ExecStart=/usr/local/bin/litellm --config /home/${USER}/.litellm/config.yaml --port 8000
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```
Replace `${USER}` with your actual username (systemd does not perform shell expansion in unit files), then enable and start:
```bash
sudo systemctl enable litellm
sudo systemctl start litellm
sudo systemctl status litellm
```
---
## Cost Comparison
| Provider | Model | Input (per 1M tokens) | Output (per 1M tokens) |
|----------|-------|----------------------|----------------------|
| **Anthropic Direct** | Claude Sonnet 4 | $3.00 | $15.00 |
| **AWS Bedrock** | Claude Sonnet 4 | $3.00 | $15.00 |
Pricing is identical, but Bedrock provides:
- AWS consolidated billing
- AWS Credits applicability
- Integration with AWS services (S3, Lambda, etc.)
- Enterprise support contracts
---
## Security Notes
1. **Master Key** — The `master_key` in config is required but doesn't need to be complex since LiteLLM runs locally
2. **AWS Credentials** — Never commit `.aws/credentials` or credential process scripts to git
3. **Local Only** — LiteLLM proxy should only bind to `127.0.0.1` (localhost) — never expose to network
4. **Audit Logs** — TFTSR logs all AI requests with SHA-256 hashes in the audit table
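The audit-hash idea in note 4 can be illustrated in a few lines. This is a sketch of the general technique, not TFTSR's actual implementation, and the sample prompt is made up:

```python
import hashlib

def audit_hash(prompt: str) -> str:
    """Digest a prompt so an audit log can prove what was sent without storing the text."""
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

digest = audit_hash("Summarize ticket #1423")
print(len(digest), digest[:12])  # always 64 hex characters, deterministic per prompt
```

Storing only the digest keeps the log tamper-evident while ensuring redacted PII never lands in it.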
---
## References
- [LiteLLM Documentation](https://docs.litellm.ai/)
- [AWS Bedrock Claude Models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-claude.html)
- [Claude Code on Bedrock](https://docs.anthropic.com/claude/docs/claude-code)