docs: add wiki source files and CI auto-sync pipeline

- Add docs/wiki/ with 11 wiki pages (Home, Architecture, Database,
  AI-Providers, PII-Detection, IPC-Commands, CICD-Pipeline,
  Security-Model, Integrations, Development-Setup, Troubleshooting)
- Add wiki-sync step to .woodpecker/test.yml: syncs docs/wiki/*.md to
  the Gogs wiki git repo on every push to master
- Add Wiki Maintenance section to CLAUDE.md: code→wiki file mapping
  so Claude and contributors know which wiki page to update per change

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
Shaun Arman 2026-03-15 13:45:30 -05:00
parent 808500b7bd
commit 52f464d8bd
13 changed files with 1753 additions and 0 deletions

.woodpecker/test.yml

@@ -37,3 +37,21 @@ pipeline:
    commands:
      - npm ci
      - npm run test:run
  wiki-sync:
    image: alpine/git
    secrets:
      - GOGS_TOKEN
    when:
      branch: master
      event: push
    commands:
      - git config --global user.email "ci@tftsr.com"
      - git config --global user.name "TFTSR CI"
      - git clone "https://${GOGS_TOKEN}@gogs.tftsr.com/sarman/tftsr-devops_investigation.wiki.git" /tmp/wiki
      - cp docs/wiki/*.md /tmp/wiki/
      - cd /tmp/wiki
      - git add -A
      - if git diff --cached --quiet; then echo "Wiki up to date, nothing to push."; exit 0; fi
      - git commit -m "docs: sync wiki from ${CI_COMMIT_SHA} on ${CI_COMMIT_BRANCH}"
      - git push

CLAUDE.md

@@ -167,3 +167,33 @@ Known issues with Woodpecker 0.15.4 + Gogs 0.14:
- Gogs 0.14 has no OAuth2 provider support, blocking upgrade to Woodpecker 2.x
Gogs token quirk: the `sha1` value returned by `POST /api/v1/users/{user}/tokens` is the **actual bearer token**. The `sha1` and `sha256` columns in the Gogs DB are hashes of that token, not the token itself.
---
## Wiki Maintenance
The project wiki lives at `https://gogs.tftsr.com/sarman/tftsr-devops_investigation/wiki`.
**Source of truth**: `docs/wiki/*.md` in this repo. The `wiki-sync` CI step (in `.woodpecker/test.yml`) automatically pushes any changes to the Gogs wiki on every push to master.
**When making code changes, update the corresponding wiki file in `docs/wiki/` before committing:**
| Changed area | Wiki file to update |
|---|---|
| New/changed Tauri commands (`commands/*.rs`, `tauriCommands.ts`) | `docs/wiki/IPC-Commands.md` |
| DB schema or migrations (`db/migrations.rs`, `db/models.rs`) | `docs/wiki/Database.md` |
| New/changed AI provider (`ai/*.rs`) | `docs/wiki/AI-Providers.md` |
| PII patterns or detection logic (`pii/`) | `docs/wiki/PII-Detection.md` |
| CI/CD pipeline changes (`.woodpecker/*.yml`) | `docs/wiki/CICD-Pipeline.md` |
| Rust architecture or module layout (`lib.rs`, `state.rs`) | `docs/wiki/Architecture.md` |
| Security-relevant changes (capabilities, audit, Stronghold) | `docs/wiki/Security-Model.md` |
| Dev setup, prerequisites, build commands | `docs/wiki/Development-Setup.md` |
| Integration stubs or v0.2 progress (`integrations/`) | `docs/wiki/Integrations.md` |
| Recurring bugs and fixes | `docs/wiki/Troubleshooting.md` |
To manually push wiki changes without waiting for CI:
```bash
cd /tmp/tftsr-wiki # local clone of the wiki git repo
# edit *.md files, then:
git add -A && git commit -m "docs: ..." && git push
```

docs/wiki/AI-Providers.md (new file, 120 lines)

@@ -0,0 +1,120 @@
# AI Providers
TFTSR supports 5 AI providers, selectable per-session. API keys are stored in the Stronghold encrypted vault.
## Provider Factory
`ai/provider.rs::create_provider(config)` dispatches on `config.name` to the matching implementation. Adding a provider requires implementing the `Provider` trait and adding a match arm.
```rust
pub trait Provider {
    async fn chat(&self, messages: Vec<Message>, config: &ProviderConfig) -> Result<ChatResponse>;
    fn name(&self) -> &str;
}
```
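A compilable sketch of that dispatch, with the trait reduced to its synchronous `name()` method and only two illustrative arms (the real trait is async and its `chat` takes a `ProviderConfig`):

```rust
// Sketch only: shows the factory shape, not the real provider structs.
trait Provider {
    fn name(&self) -> &str;
}

struct OpenAi;
struct Ollama;

impl Provider for OpenAi {
    fn name(&self) -> &str { "openai" }
}
impl Provider for Ollama {
    fn name(&self) -> &str { "ollama" }
}

fn create_provider(name: &str) -> Result<Box<dyn Provider>, String> {
    match name {
        "openai" => Ok(Box::new(OpenAi)),
        "ollama" => Ok(Box::new(Ollama)),
        other => Err(format!("unknown provider: {other}")),
    }
}
```

Adding a provider is then one new struct, one `impl`, and one match arm.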
---
## Supported Providers
### 1. OpenAI-Compatible
Covers: OpenAI, Azure OpenAI, LM Studio, vLLM, and any OpenAI-API-compatible endpoint.
| Field | Value |
|-------|-------|
| `config.name` | `"openai"` |
| Default URL | `https://api.openai.com/v1/chat/completions` |
| Auth | `Authorization: Bearer <api_key>` |
| Max tokens | 4096 |
**Models:** `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
**Custom endpoint:** Set `config.base_url` to any OpenAI-compatible API (e.g., LM Studio at `http://localhost:1234/v1`).
---
### 2. Anthropic Claude
| Field | Value |
|-------|-------|
| `config.name` | `"anthropic"` |
| URL | `https://api.anthropic.com/v1/messages` |
| Auth | `x-api-key: <api_key>` + `anthropic-version: 2023-06-01` |
| Max tokens | 4096 |
**Models:** `claude-sonnet-4-20250514`, `claude-haiku-4-20250414`, `claude-3-5-sonnet-20241022`
---
### 3. Google Gemini
| Field | Value |
|-------|-------|
| `config.name` | `"gemini"` |
| URL | `https://generativelanguage.googleapis.com/v1beta/models/{model}:generateContent` |
| Auth | API key as `?key=` query parameter |
| Max tokens | 4096 |
**Models:** `gemini-2.0-flash`, `gemini-2.0-pro`, `gemini-1.5-pro`, `gemini-1.5-flash`
---
### 4. Mistral AI
| Field | Value |
|-------|-------|
| `config.name` | `"mistral"` |
| Default URL | `https://api.mistral.ai/v1/chat/completions` |
| Auth | `Authorization: Bearer <api_key>` |
| Max tokens | 4096 |
**Models:** `mistral-large-latest`, `mistral-medium-latest`, `mistral-small-latest`, `open-mistral-nemo`
Uses OpenAI-compatible request/response format.
---
### 5. Ollama (Local / Offline)
| Field | Value |
|-------|-------|
| `config.name` | `"ollama"` |
| Default URL | `http://localhost:11434/api/chat` |
| Auth | None |
| Max tokens | No limit enforced |
**Models:** Any model pulled locally — `llama3.1`, `llama3`, `mistral`, `codellama`, `phi3`, etc.
Fully offline. Responses include `eval_count` / `prompt_eval_count` token stats.
**Custom URL:** Change the Ollama URL in Settings → AI Providers → Ollama (stored in `settingsStore.ollama_url`).
---
## Domain System Prompts
Each triage conversation is pre-loaded with a domain-specific expert system prompt from `src/lib/domainPrompts.ts`.
| Domain | Key areas covered |
|--------|------------------|
| **Linux** | systemd, filesystem, memory, networking, kernel, performance |
| **Windows** | Event Viewer, Active Directory, IIS, Group Policy, clustering |
| **Network** | DNS, firewalls, load balancers, BGP/OSPF, Layer 2, VPN |
| **Kubernetes** | Pod failures, service mesh, ingress, storage, Helm |
| **Databases** | Connection pools, slow queries, indexes, replication, MongoDB/Redis |
| **Virtualization** | vMotion, storage (VMFS), HA, snapshots; KVM/QEMU |
| **Hardware** | RAID, SMART data, ECC memory errors, thermal, BIOS/firmware |
| **Observability** | Prometheus/Grafana, ELK/OpenSearch, tracing, SLO/SLI burn rates |
The domain prompt is injected as the first `system` role message in every new conversation.
---
## Adding a New Provider
1. Create `src-tauri/src/ai/{name}.rs` implementing the `Provider` trait
2. Add a match arm in `ai/provider.rs::create_provider()`
3. Add the model list in `commands/ai.rs::list_providers()`
4. Add the TypeScript type in `src/lib/tauriCommands.ts`
5. Add a UI entry in `src/pages/Settings/AIProviders.tsx`

docs/wiki/Architecture.md (new file, 213 lines)

@@ -0,0 +1,213 @@
# Architecture
## Overview
TFTSR uses a Tauri 2.x architecture: a Rust backend runs natively, and a React/TypeScript frontend runs in an embedded WebView. Communication between them happens exclusively via typed IPC (`invoke()`).
```
┌─────────────────────────────────────────┐
│             WebView (React)             │
│   pages/ → stores/ → tauriCommands.ts   │
└───────────────────┬─────────────────────┘
                    │ invoke() / IPC
┌───────────────────▼─────────────────────┐
│          Rust Backend (Tauri)           │
│  commands/ → ai/ → pii/ → db/ → docs/   │
└─────────┬─────────────────────┬─────────┘
          │                     │
      SQLCipher             reqwest
         DB                (AI APIs)
```
## Backend — Rust
**Entry point:** `src-tauri/src/lib.rs`. `run()` initialises tracing, opens the DB, registers Tauri plugins, and calls `generate_handler![]` with all IPC commands.
### Shared State
```rust
pub struct AppState {
    pub db: Arc<Mutex<rusqlite::Connection>>,
    pub settings: Arc<Mutex<AppSettings>>,
    pub app_data_dir: PathBuf, // ~/.local/share/tftsr on Linux
}
```
All command handlers receive `State<'_, AppState>` as a Tauri-injected parameter. The Mutex must be **released before any `.await`** — `std::sync::MutexGuard` is not `Send`, so holding it across an await point makes the command's future non-`Send` and the build fails.
### Module Layout
| Path | Responsibility |
|------|---------------|
| `lib.rs` | App entry, tracing init, DB setup, plugin registration, command handler list |
| `state.rs` | `AppState` struct |
| `commands/db.rs` | Issue CRUD, 5-Whys entries, timeline events |
| `commands/ai.rs` | `analyze_logs`, `chat_message`, `list_providers` |
| `commands/analysis.rs` | Log file upload, PII detection, redaction |
| `commands/docs.rs` | RCA and post-mortem generation, document export |
| `commands/system.rs` | Ollama management, hardware probe, settings, audit log |
| `commands/integrations.rs` | Confluence / ServiceNow / ADO — v0.2 stubs |
| `ai/provider.rs` | `Provider` trait + `create_provider()` factory |
| `pii/detector.rs` | Multi-pattern PII scanner with overlap resolution |
| `db/migrations.rs` | Versioned schema (10 migrations in `_migrations` table) |
| `db/models.rs` | All DB types — see `IssueDetail` note below |
| `docs/rca.rs` + `docs/postmortem.rs` | Markdown template builders |
| `audit/log.rs` | `write_audit_event()` — called before every external send |
### Directory Structure
```
src-tauri/src/
├── lib.rs
├── main.rs
├── state.rs
├── ai/
│   ├── provider.rs       # Provider trait + factory
│   ├── openai.rs
│   ├── anthropic.rs
│   ├── gemini.rs
│   ├── mistral.rs
│   └── ollama.rs
├── commands/
│   ├── db.rs
│   ├── ai.rs
│   ├── analysis.rs
│   ├── docs.rs
│   ├── system.rs
│   └── integrations.rs
├── pii/
│   ├── patterns.rs
│   ├── detector.rs
│   └── redactor.rs
├── db/
│   ├── connection.rs
│   ├── migrations.rs
│   └── models.rs
├── docs/
│   ├── rca.rs
│   ├── postmortem.rs
│   └── exporter.rs
├── audit/
│   └── log.rs
├── ollama/
│   ├── installer.rs
│   ├── manager.rs
│   ├── recommender.rs
│   └── hardware.rs
└── integrations/
    ├── confluence.rs
    ├── servicenow.rs
    └── azuredevops.rs
```
## Frontend — React/TypeScript
**IPC layer:** All Tauri `invoke()` calls are in `src/lib/tauriCommands.ts`. Every command has a typed wrapper. This is the single source of truth for the frontend API surface.
### Stores (Zustand)
| Store | Persistence | Contents |
|-------|------------|----------|
| `sessionStore.ts` | Not persisted (ephemeral) | currentIssue, messages, piiSpans, approvedRedactions, whyLevel (0–5), loading state |
| `settingsStore.ts` | `localStorage` as `"tftsr-settings"` | AI providers, theme, Ollama URL, active provider |
| `historyStore.ts` | Not persisted (cache) | Past issues list, search query |
### Page Flow
```
NewIssue → createIssueCmd → startSession(detail.issue) → navigate /issue/:id/triage
LogUpload → uploadLogFileCmd → detectPiiCmd → applyRedactionsCmd
Triage → chatMessageCmd loop → auto-detect why levels 1–5
Resolution → getIssueCmd → mark 5-Whys steps done
RCA → generateRcaCmd → DocEditor → exportDocumentCmd
```
### Directory Structure
```
src/
├── main.tsx
├── App.tsx
├── components/
│   ├── ChatWindow.tsx
│   ├── TriageProgress.tsx
│   ├── PiiDiffViewer.tsx
│   ├── DocEditor.tsx
│   ├── HardwareReport.tsx
│   ├── ModelSelector.tsx
│   └── ui/index.tsx        # Custom components (Card, Button, Input, etc.)
├── pages/
│   ├── Dashboard/
│   ├── NewIssue/
│   ├── LogUpload/
│   ├── Triage/
│   ├── Resolution/
│   ├── RCA/
│   ├── Postmortem/
│   ├── History/
│   └── Settings/
│       ├── AIProviders.tsx
│       ├── Ollama.tsx
│       ├── Integrations.tsx
│       └── Security.tsx
├── stores/
│   ├── sessionStore.ts
│   ├── settingsStore.ts
│   └── historyStore.ts
└── lib/
    ├── tauriCommands.ts
    └── domainPrompts.ts
```
## Key Type: IssueDetail
`get_issue()` returns a **nested** struct, not flat. Always use `detail.issue.*`:
```rust
pub struct IssueDetail {
    pub issue: Issue,                          // Base fields (title, severity, etc.)
    pub log_files: Vec<LogFile>,
    pub resolution_steps: Vec<ResolutionStep>, // 5-Whys entries
    pub conversations: Vec<AiConversation>,
}
```
Use `detail.issue.title`, **not** `detail.title`.
## Application Startup Sequence
```
1. Initialize tracing (RUST_LOG controls level)
2. Determine data directory (~/.local/share/tftsr or TFTSR_DATA_DIR)
3. Open / create SQLite database (run migrations)
4. Create AppState (db + settings + app_data_dir)
5. Register Tauri plugins (stronghold, dialog, fs, shell, http, cli, updater)
6. Register all 39 IPC command handlers
7. Start WebView with React app
```
## Data Flow
```
User Input
[New Issue] ──── UUID assigned, stored in DB
[Upload Log] ─── File read, SHA-256 hash computed, path stored
[Detect PII] ─── 13 regex patterns applied, overlaps resolved
[Review PII] ─── User approves/rejects each span
[Apply Redactions] ─ Text rewritten, audit event logged
[AI Chat] ──────── Domain system prompt injected
Redacted text sent to provider
Auto-detect why level from response
[5-Whys] ───────── Answers stored as resolution_steps
[Generate RCA] ─── Markdown from template + answers
[Export] ────────── MD or PDF to user-chosen directory
```

docs/wiki/CICD-Pipeline.md (new file, 158 lines)

@@ -0,0 +1,158 @@
# CI/CD Pipeline
## Infrastructure
| Component | URL | Notes |
|-----------|-----|-------|
| Gogs | `http://172.0.0.29:3000` / `https://gogs.tftsr.com` | Git server, version 0.14 |
| Woodpecker CI (direct) | `http://172.0.0.29:8084` | v0.15.4 |
| Woodpecker CI (proxy) | `http://172.0.0.29:8085` | nginx with custom login page |
| PostgreSQL (Gogs DB) | Container: `gogs_postgres_db` | DB: `gogsdb`, User: `gogs` |
---
## Test Pipeline (`.woodpecker/test.yml`)
**Triggers:** Every push and pull request to any branch.
```
Pipeline steps:
1. rust-fmt-check → cargo fmt --check
2. rust-clippy → cargo clippy -- -D warnings
3. rust-tests → cargo test
4. frontend-typecheck → npx tsc --noEmit
5. frontend-tests → npm run test:run (Vitest)
```
**Docker images used:**
- `rust:1.88-slim` — Rust steps (minimum for cookie_store + time + darling)
- `node:22-alpine` — Frontend steps
**System dependencies installed in CI (Rust steps):**
```
libwebkit2gtk-4.1-dev, libssl-dev, libgtk-3-dev, libsoup-3.0-dev,
librsvg2-dev, libglib2.0-dev
```
**Pipeline YAML format (Woodpecker 0.15.4 — legacy MAP format):**
```yaml
clone:
  git:
    image: woodpeckerci/plugin-git
    network_mode: gogs_default   # requires repo_trusted=1
    environment:
      - CI_REPO_CLONE_URL=http://gogs_app:3000/sarman/tftsr-devops_investigation.git

pipeline:
  step-name:                     # KEY = step name (MAP, not list!)
    image: rust:1.88-slim
    commands:
      - cargo test
```
> ⚠️ **Do NOT** use the newer `steps:` list format — Woodpecker 0.15.4 uses the Drone-legacy map format.
---
## Release Pipeline (`.woodpecker/release.yml`)
**Triggers:** Git tags matching `v*`
```
Pipeline steps:
1. build-linux-amd64 → cargo tauri build (x86_64-unknown-linux-gnu)
2. build-linux-arm64 → cargo tauri build (aarch64-unknown-linux-gnu, cross-compile)
3. upload-release → Create Gogs release + upload artifacts via API
```
**Artifacts per platform:**
- Linux amd64: `.deb`, `.rpm`, `.AppImage`
- Linux arm64: `.deb`, `.AppImage`
**Gogs Release API:**
```bash
# Create release
POST $API/repos/sarman/tftsr-devops_investigation/releases
Authorization: token $GOGS_TOKEN
# Upload artifact
POST $API/repos/sarman/tftsr-devops_investigation/releases/{id}/assets
```
The `GOGS_TOKEN` is stored as a Woodpecker secret.
---
## Webhook Configuration
**Hook ID:** 6 (in Gogs)
**Events:** `create`, `push`, `pull_request`
**URL:** `http://172.0.0.29:8084/hook?access_token=<JWT>`
**JWT signing:**
- Algorithm: HS256
- Secret: `repo_hash` value from Woodpecker DB (`dK8zFWtAu67qfKd3Et6N8LptqTmedumJ`)
- Payload: `{"text":"sarman/tftsr-devops_investigation","type":"hook"}`
> ⚠️ JWT has an `iat` claim. If it's stale, regenerate it.
---
## Woodpecker DB State
SQLite at `/docker_mounts/woodpecker/data/woodpecker.sqlite` (on host `172.0.0.29`).
Key values:
```sql
-- User
SELECT user_token FROM users WHERE user_login='sarman';
-- Should be: [REDACTED-ROTATED]
-- Repo
SELECT repo_active, repo_trusted, repo_config_path, repo_hash
FROM repos WHERE repo_full_name='sarman/tftsr-devops_investigation';
-- repo_active=1, repo_trusted=1
-- repo_config_path='.woodpecker/test.yml'
-- repo_hash='dK8zFWtAu67qfKd3Et6N8LptqTmedumJ'
```
---
## Known Issues & Fixes
### Webhook JWT Must Use `?access_token=`
`token.ParseRequest()` in Woodpecker 0.15.4 does **not** read `?token=` URL params. Use `?access_token=<JWT>` instead.
### JWT Signed with `repo_hash` (Not User Hash)
Hook JWT must be signed with the `repo_hash` value, not the user's hash.
### Directory-Based Config Not Supported
Woodpecker 0.15.4 only supports a **single config file**. Set `repo_config_path = .woodpecker/test.yml` in the Woodpecker DB. The `.woodpecker/` directory approach requires v2.x+.
### Step Containers Network Isolation
Pipeline step containers run on the default Docker bridge and cannot resolve `gogs_app` hostname. Fix: set `network_mode: gogs_default` in the clone section (requires `repo_trusted=1`).
### Empty Clone URL Bug
Woodpecker 0.15.4's `go-gogs-client` `PayloadRepo` struct lacks `CloneURL`/`SSHURL` fields, so `build_remote` is always empty from Gogs push payloads. Fix: override the clone URL via `CI_REPO_CLONE_URL` environment variable.
### Gogs Token Authentication
The `sha1` field in Gogs token create API response **is** the actual bearer token (not a hash). Use it directly:
```
Authorization: token <sha1_from_create_response>
```
### Gogs SPA Login Field Mismatch
Gogs 0.14 SPA login form uses `login=` field; the Gogs backend reads `username=`. A custom login page is served by nginx at `/login`.
### Gogs OAuth2 Limitation
Gogs 0.14 has no OAuth2 provider support, blocking upgrade to Woodpecker 2.x.
---
## Gogs PostgreSQL Access
```bash
docker exec gogs_postgres_db psql -U gogs -d gogsdb -c "SELECT id, lower_name FROM repository;"
```
> Database name is `gogsdb`, not `gogs`.

docs/wiki/Database.md (new file, 228 lines)

@@ -0,0 +1,228 @@
# Database
## Overview
TFTSR uses **SQLite** via `rusqlite` with the `bundled-sqlcipher` feature for AES-256 encryption in production. 10 versioned migrations are tracked in the `_migrations` table.
**DB file location:** `{app_data_dir}/tftsr.db`
---
## Encryption
| Build type | Encryption | Key |
|-----------|-----------|-----|
| Debug (`debug_assertions`) | None (plain SQLite) | — |
| Release | SQLCipher AES-256 | `TFTSR_DB_KEY` env var |
**SQLCipher settings (production):**
- Cipher: AES-256-CBC
- Page size: 4096 bytes
- KDF: PBKDF2-HMAC-SHA512, 256,000 iterations
- HMAC: HMAC-SHA512
```rust
// Simplified init logic
pub fn init_db(data_dir: &Path) -> anyhow::Result<Connection> {
    let db_path = data_dir.join("tftsr.db");
    let key = env::var("TFTSR_DB_KEY")
        .unwrap_or_else(|_| "dev-key-change-in-prod".to_string());
    let conn = if cfg!(debug_assertions) {
        Connection::open(&db_path)?        // plain SQLite
    } else {
        open_encrypted_db(&db_path, &key)? // SQLCipher AES-256
    };
    run_migrations(&conn)?;
    Ok(conn)
}
```
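Stripped of SQL, the bookkeeping `run_migrations` performs against the `_migrations` table is a set difference over an ordered migration list; a pure-logic sketch (the names and SQL here are illustrative, not the project's real list):

```rust
use std::collections::HashSet;

// Given the full ordered migration list and the names already recorded
// in `_migrations`, return the (name, sql) pairs still to apply, in order.
fn pending<'a>(
    all: &'a [(&'a str, &'a str)],
    applied: &HashSet<&str>,
) -> Vec<(&'a str, &'a str)> {
    all.iter()
        .copied()
        .filter(|(name, _)| !applied.contains(name))
        .collect()
}
```

Each returned pair would be executed in a transaction and its name inserted into `_migrations` alongside an `applied_at` timestamp.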
---
## Schema (10 Migrations)
### 001 — issues
```sql
CREATE TABLE issues (
  id TEXT PRIMARY KEY,
  title TEXT NOT NULL,
  description TEXT,
  severity TEXT NOT NULL,   -- 'critical', 'high', 'medium', 'low'
  status TEXT NOT NULL,     -- 'open', 'investigating', 'resolved', 'closed'
  category TEXT,
  source TEXT,
  created_at TEXT NOT NULL, -- 'YYYY-MM-DD HH:MM:SS'
  updated_at TEXT NOT NULL,
  resolved_at TEXT,         -- nullable
  assigned_to TEXT,
  tags TEXT                 -- JSON array stored as TEXT
);
```
### 002 — log_files
```sql
CREATE TABLE log_files (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  file_name TEXT NOT NULL,
  file_path TEXT NOT NULL,
  file_size INTEGER,
  mime_type TEXT,
  content_hash TEXT,        -- SHA-256 hex of original content
  uploaded_at TEXT NOT NULL,
  redacted INTEGER DEFAULT 0 -- boolean: 0/1
);
```
### 003 — pii_spans
```sql
CREATE TABLE pii_spans (
  id TEXT PRIMARY KEY,
  log_file_id TEXT NOT NULL REFERENCES log_files(id) ON DELETE CASCADE,
  pii_type TEXT NOT NULL,
  start_offset INTEGER NOT NULL,
  end_offset INTEGER NOT NULL,
  original_value TEXT NOT NULL,
  replacement TEXT NOT NULL
);
```
### 004 — ai_conversations
```sql
CREATE TABLE ai_conversations (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  provider TEXT NOT NULL,
  model TEXT NOT NULL,
  created_at TEXT NOT NULL,
  title TEXT
);
```
### 005 — ai_messages
```sql
CREATE TABLE ai_messages (
  id TEXT PRIMARY KEY,
  conversation_id TEXT NOT NULL REFERENCES ai_conversations(id) ON DELETE CASCADE,
  role TEXT NOT NULL CHECK(role IN ('system', 'user', 'assistant')),
  content TEXT NOT NULL,
  token_count INTEGER DEFAULT 0,
  created_at TEXT NOT NULL
);
```
### 006 — resolution_steps (5-Whys)
```sql
CREATE TABLE resolution_steps (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  step_order INTEGER NOT NULL, -- 1–5
  why_question TEXT NOT NULL,
  answer TEXT,
  evidence TEXT,
  created_at TEXT NOT NULL
);
```
### 007 — documents
```sql
CREATE TABLE documents (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  doc_type TEXT NOT NULL,      -- 'rca', 'postmortem'
  title TEXT NOT NULL,
  content_md TEXT NOT NULL,
  created_at INTEGER NOT NULL, -- milliseconds since epoch
  updated_at INTEGER NOT NULL
);
```
> **Note:** `documents` uses INTEGER milliseconds; `issues` and `log_files` use TEXT timestamps.
### 008 — audit_log
```sql
CREATE TABLE audit_log (
  id TEXT PRIMARY KEY,
  timestamp TEXT NOT NULL DEFAULT (datetime('now')),
  action TEXT NOT NULL,      -- e.g., 'ai_send', 'publish_to_confluence'
  entity_type TEXT NOT NULL, -- e.g., 'issue', 'document'
  entity_id TEXT NOT NULL,
  user_id TEXT DEFAULT 'local',
  details TEXT               -- JSON with hashes, log_file_ids, etc.
);
```
### 009 — settings
```sql
CREATE TABLE settings (
  key TEXT PRIMARY KEY,
  value TEXT NOT NULL,
  updated_at TEXT NOT NULL
);
```
### 010 — issues_fts (Full-Text Search)
```sql
CREATE VIRTUAL TABLE issues_fts USING fts5(
  id UNINDEXED,
  title,
  description,
  content='issues',
  content_rowid='rowid'
);
```
---
## Key Design Notes
- All primary keys are **UUID v7** (time-sortable)
- Boolean flags stored as `INTEGER` (`0`/`1`)
- JSON arrays (e.g., `tags`) stored as `TEXT`
- `issues` / `log_files` timestamps: `TEXT` (`YYYY-MM-DD HH:MM:SS`)
- `documents` timestamps: `INTEGER` (milliseconds since epoch)
- All foreign keys with `ON DELETE CASCADE`
- Migration history tracked in `_migrations` table (name + applied_at)
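Why UUID v7 is time-sortable falls out of its bit layout (RFC 9562): the top 48 bits are the Unix timestamp in milliseconds, so plain integer (and hence lexicographic) order is creation order. A std-only sketch of that layout, using a fixed "random" tail for determinism (a real generator fills those bits from a CSPRNG):

```rust
// Layout (MSB first): 48-bit unix_ts_ms | 4-bit version (0b0111) |
// 12-bit rand_a | 2-bit variant (0b10) | 62-bit rand_b.
fn uuid_v7(ms: u64, random: u128) -> u128 {
    let ts = ((ms as u128) & 0xFFFF_FFFF_FFFF) << 80;        // top 48 bits
    let ver = 0x7u128 << 76;                                 // version 7
    let var = 0x2u128 << 62;                                 // RFC 4122 variant
    let rand_mask = (0xFFFu128 << 64) | ((1u128 << 62) - 1); // rand_a + rand_b
    ts | ver | var | (random & rand_mask)
}
```

Because the timestamp occupies the most significant bits, two IDs generated a millisecond apart compare in creation order regardless of their random tails.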
---
## Rust Model Types
Key structs in `db/models.rs`:
```rust
pub struct Issue {
    pub id: String,
    pub title: String,
    pub description: Option<String>,
    pub severity: String,
    pub status: String,
    // ...
}

// Nested — returned by get_issue()
pub struct IssueDetail {
    pub issue: Issue,
    pub log_files: Vec<LogFile>,
    pub resolution_steps: Vec<ResolutionStep>,
    pub conversations: Vec<AiConversation>,
}

pub struct AuditEntry {
    pub id: String,
    pub timestamp: String,
    pub action: String,      // NOT event_type
    pub entity_type: String, // NOT destination
    pub entity_id: String,   // NOT status
    pub user_id: String,
    pub details: Option<String>,
}
```

docs/wiki/Development-Setup.md (new file, 175 lines)

@@ -0,0 +1,175 @@
# Development Setup
## Prerequisites
### System (Linux/Fedora)
```bash
sudo dnf install -y glib2-devel gtk3-devel webkit2gtk4.1-devel \
libsoup3-devel openssl-devel librsvg2-devel
```
### Rust
```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source ~/.cargo/env
```
Minimum required version: **Rust 1.88** (needed by `cookie_store`, `time`, `darling`).
### Node.js
Node **v22** required. Install via nvm or system package manager.
### Project Dependencies
```bash
npm install --legacy-peer-deps
```
---
## Environment Variables
| Variable | Default | Purpose |
|----------|---------|---------|
| `TFTSR_DATA_DIR` | Platform data dir | Override DB location |
| `TFTSR_DB_KEY` | `dev-key-change-in-prod` | DB encryption key (required in production) |
| `RUST_LOG` | `info` | Tracing verbosity: `debug`, `info`, `warn`, `error` |
Application data is stored at:
- **Linux:** `~/.local/share/tftsr/`
- **macOS:** `~/Library/Application Support/tftsr/`
- **Windows:** `%APPDATA%\tftsr\`
---
## Development Commands
### Start Full Dev Environment
```bash
source ~/.cargo/env
cargo tauri dev
```
Hot reload: Vite (frontend at `localhost:1420`) + Tauri (Rust recompiles on save).
### Frontend Only
```bash
npm run dev
# → http://localhost:1420
```
---
## Testing
```bash
# Rust unit tests
cargo test --manifest-path src-tauri/Cargo.toml
# Run a single test module
cargo test --manifest-path src-tauri/Cargo.toml pii::detector
# Run a single test by name
cargo test --manifest-path src-tauri/Cargo.toml test_detect_ipv4
# Frontend tests (single run)
npm run test:run
# Frontend tests (watch mode)
npm run test
# Frontend coverage report
npm run test:coverage
# TypeScript type check
npx tsc --noEmit
```
Current test status: **13/13 frontend tests passing**, Rust tests passing.
---
## Linting & Formatting
```bash
# Rust format check
cargo fmt --manifest-path src-tauri/Cargo.toml --check
# Auto-format
cargo fmt --manifest-path src-tauri/Cargo.toml
# Rust lints (all warnings as errors)
cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings
# Quick Rust type check (no linking)
cargo check --manifest-path src-tauri/Cargo.toml
```
---
## Production Build
```bash
cargo tauri build
# → src-tauri/target/release/bundle/
# Outputs: .deb, .rpm, .AppImage (Linux)
```
Release builds enable **SQLCipher AES-256** encryption. Set `TFTSR_DB_KEY` before building.
---
## Rust Design Patterns
### Mutex Release Before Await
`MutexGuard` is not `Send`. Always release the lock before any `.await`:
```rust
// ✅ CORRECT — release lock before await
let value = {
    let db = state.db.lock().map_err(|e| e.to_string())?;
    db.query_row(...)?
}; // ← lock released here
some_async_call().await?;

// ❌ WRONG — compile error: MutexGuard not Send across await
let db = state.db.lock()?;
let result = some_async_call().await?; // ERROR
```
### Database Queries (Lifetime Issue)
Use `conn.prepare().and_then(...)` pattern:
```rust
// ✅ CORRECT
let rows = conn.prepare("SELECT ...")
    .and_then(|mut stmt| stmt.query_map(params![], |row| { ... })?.collect())?;

// ❌ causes lifetime issues in async context
let mut stmt = conn.prepare("SELECT ...")?;
let rows = stmt.query_map(...)?;
```
### Command Handler Pattern
```rust
#[tauri::command]
pub async fn my_command(
    param: String,
    state: State<'_, AppState>,
) -> Result<ResponseType, String> {
    let result = {
        let db = state.db.lock().map_err(|e| e.to_string())?;
        db.query_row("SELECT ...", params![param], |row| { ... })
            .map_err(|e| e.to_string())?
    };
    Ok(result)
}
```

docs/wiki/Home.md (new file, 53 lines)

@@ -0,0 +1,53 @@
# TFTSR — IT Triage & RCA Desktop Application
**TFTSR** is a secure desktop application for guided IT incident triage, root cause analysis (RCA), and post-mortem documentation. Built with Tauri 2.x (Rust + WebView) and React 18.
## Quick Navigation
| Topic | Description |
|-------|-------------|
| [Architecture](wiki/Architecture) | Backend, frontend, and data flow |
| [Development Setup](wiki/Development-Setup) | Prerequisites, commands, environment |
| [Database](wiki/Database) | Schema, migrations, encryption |
| [AI Providers](wiki/AI-Providers) | Supported providers and configuration |
| [PII Detection](wiki/PII-Detection) | Patterns, redaction flow, security |
| [IPC Commands](wiki/IPC-Commands) | Full list of Tauri backend commands |
| [CI/CD Pipeline](wiki/CICD-Pipeline) | Woodpecker CI + Gogs setup |
| [Security Model](wiki/Security-Model) | Encryption, audit trail, capabilities |
| [Integrations](wiki/Integrations) | Confluence, ServiceNow, Azure DevOps (v0.2) |
| [Troubleshooting](wiki/Troubleshooting) | Known issues and fixes |
## Key Features
- **5-Whys AI Triage** — Interactive guided root cause analysis via multi-turn AI chat
- **PII Auto-Redaction** — Detects and redacts sensitive data before any AI send
- **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, local Ollama (fully offline)
- **SQLCipher AES-256** — All issue history encrypted at rest
- **RCA + Post-Mortem Generation** — Auto-populated Markdown templates, exportable as MD/PDF
- **Ollama Management** — Hardware detection, model recommendations, in-app model management
- **Audit Trail** — Every external data send logged with SHA-256 hash
- **Domain-Specific Prompts** — 8 IT domains: Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability
## Project Status
| Phase | Status |
|-------|--------|
| Phases 1–8 (Core) | ✅ Complete |
| Phase 9 (History/Search FTS) | 🔄 Partially integrated |
| Phase 10 (Integrations) | 🕐 v0.2 stubs only |
| Phase 11 (CLI) | 🕐 Planned |
| Phase 12 (Release packaging) | 🔄 Linux done; macOS/Windows pending |
## Tech Stack
| Layer | Technology |
|-------|-----------|
| Desktop framework | Tauri 2.x |
| Backend | Rust (async/await, tokio) |
| Frontend | React 18 + TypeScript + Vite |
| Styling | Tailwind CSS + custom components |
| Database | rusqlite + SQLCipher (AES-256) |
| Secret storage | tauri-plugin-stronghold |
| State | Zustand |
| Testing | Vitest (frontend) + `#[cfg(test)]` (Rust) |
| CI/CD | Woodpecker CI v0.15.4 + Gogs |

docs/wiki/IPC-Commands.md (new file, 234 lines)

@@ -0,0 +1,234 @@
# IPC Commands
All backend commands are typed wrappers in `src/lib/tauriCommands.ts`. The Rust handlers live in `src-tauri/src/commands/`.
---
## Database Commands
### `create_issue`
```typescript
createIssueCmd(title: string, description: string, severity: string, category: string) → Issue
```
Creates a new issue. Generates UUID v7. Returns the created `Issue`.
### `get_issue`
```typescript
getIssueCmd(issueId: string) → IssueDetail
```
Returns a **nested** `IssueDetail` — use `detail.issue.title`, not `detail.title`.
```typescript
interface IssueDetail {
  issue: Issue;
  log_files: LogFile[];
  resolution_steps: ResolutionStep[];
  conversations: AiConversation[];
}
```
### `list_issues`
```typescript
listIssuesCmd(query: IssueListQuery) → IssueSummary[]
```
Paginated list. Supports filter by status, severity, category; sort by created_at/updated_at.
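One common way such optional filters compose into a query is by collecting clauses and parameters in parallel; a hypothetical sketch, not the actual `commands/db.rs` code:

```rust
// Build a WHERE fragment from optional status/severity filters.
// Clauses use `?` placeholders; params line up positionally.
fn build_where(status: Option<&str>, severity: Option<&str>) -> (String, Vec<String>) {
    let mut clauses: Vec<&str> = Vec::new();
    let mut params: Vec<String> = Vec::new();
    if let Some(s) = status {
        clauses.push("status = ?");
        params.push(s.to_string());
    }
    if let Some(s) = severity {
        clauses.push("severity = ?");
        params.push(s.to_string());
    }
    let sql = if clauses.is_empty() {
        String::new()
    } else {
        format!("WHERE {}", clauses.join(" AND "))
    };
    (sql, params)
}
```

Keeping clauses and parameters in lockstep avoids string-interpolating user input into SQL.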
### `update_issue`
```typescript
updateIssueCmd(issueId: string, updates: Partial<IssueUpdate>) → IssueDetail
```
Partial update. Only provided fields are changed.
### `delete_issue`
```typescript
deleteIssueCmd(issueId: string) → void
```
Cascades: deletes log_files, pii_spans, conversations, messages, resolution_steps, documents.
### `search_issues`
```typescript
searchIssuesCmd(query: string) → IssueSummary[]
```
Full-text search via FTS5 virtual table on title + description.
### `add_five_why`
```typescript
addFiveWhyCmd(issueId: string, whyNumber: number, question: string, answer?: string) → FiveWhyEntry
```
Adds a 5-Whys entry (steps 1–5). `whyNumber` maps to `step_order`.
### `update_five_why`
```typescript
updateFiveWhyCmd(entryId: string, answer: string) → void
```
Sets or updates the answer for an existing 5-Whys entry.
### `add_timeline_event`
```typescript
addTimelineEventCmd(issueId: string, eventType: string, description: string) → TimelineEvent
```
Records a timestamped event in the issue timeline.
---
## Analysis / PII Commands
### `upload_log_file`
```typescript
uploadLogFileCmd(issueId: string, filePath: string) → LogFile
```
Reads the file from disk, computes SHA-256, stores metadata in DB. Returns `LogFile` record.
### `detect_pii`
```typescript
detectPiiCmd(logFileId: string) → PiiDetectionResult
```
Runs 13 PII patterns on the file content. Returns non-overlapping `PiiSpan[]`.
```typescript
interface PiiDetectionResult {
  log_file_id: string;
  spans: PiiSpan[];
  total_found: number;
}
```
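The overlap resolution can be pictured as a greedy sweep: sort candidate spans by start offset (longer match first on ties), then drop any span that overlaps one already kept. A simplified std-only sketch of that idea, not the detector's actual code:

```rust
#[derive(Clone)]
struct Span {
    start: usize,
    end: usize,
    pii_type: &'static str,
}

// Keep a non-overlapping subset: earliest start wins; on equal starts,
// the longer match wins (e.g. a full email beats the hostname inside it).
fn resolve_overlaps(mut spans: Vec<Span>) -> Vec<Span> {
    spans.sort_by(|a, b| a.start.cmp(&b.start).then(b.end.cmp(&a.end)));
    let mut kept: Vec<Span> = Vec::new();
    for s in spans {
        if kept.last().map_or(true, |k| s.start >= k.end) {
            kept.push(s);
        }
    }
    kept
}
```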
### `apply_redactions`
```typescript
applyRedactionsCmd(logFileId: string, approvedSpanIds: string[]) → RedactedLogFile
```
Rewrites file content with approved redactions. Records SHA-256 in audit log. Returns redacted content path.
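Offset bookkeeping is the subtle part of rewriting: replacing one span shifts the offsets of everything after it, so redactions are easiest applied back-to-front. A minimal sketch of that idea (not the actual `pii/redactor.rs` code):

```rust
struct Redaction {
    start: usize, // byte offset into the original text
    end: usize,
    replacement: String,
}

// Apply approved redactions from the end of the text backwards so the
// byte offsets of earlier spans stay valid after each replacement.
fn apply_redactions(text: &str, mut spans: Vec<Redaction>) -> String {
    spans.sort_by(|a, b| b.start.cmp(&a.start));
    let mut out = text.to_string();
    for r in spans {
        out.replace_range(r.start..r.end, &r.replacement);
    }
    out
}
```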
---
## AI Commands
### `analyze_logs`
```typescript
analyzeLogsCmd(issueId: string, logFileIds: string[], providerConfig: ProviderConfig) → AnalysisResult
```
Sends selected (redacted) log files to the AI provider with an analysis prompt.
### `chat_message`
```typescript
chatMessageCmd(issueId: string, message: string, providerConfig: ProviderConfig) → ChatResponse
```
Sends a message in the ongoing triage conversation. Domain system prompt is injected automatically on first message. AI response is parsed for why-level indicators (1–5).
### `list_providers`
```typescript
listProvidersCmd() → ProviderInfo[]
```
Returns the list of supported providers with their available models and configuration schema.
---
## Document Commands
### `generate_rca`
```typescript
generateRcaCmd(issueId: string) → Document
```
Builds an RCA Markdown document from the issue data, 5-Whys answers, and timeline.
### `generate_postmortem`
```typescript
generatePostmortemCmd(issueId: string) → Document
```
Builds a blameless post-mortem Markdown document.
### `update_document`
```typescript
updateDocumentCmd(docId: string, contentMd: string) → Document
```
Saves edited Markdown content back to the database.
### `export_document`
```typescript
exportDocumentCmd(docId: string, format: 'md' | 'pdf', outputDir: string) → string
```
Exports document to file. Returns the absolute path of the written file. PDF generation uses `printpdf`.
---
## System / Ollama Commands
### `check_ollama_installed`
```typescript
checkOllamaInstalledCmd() → OllamaStatus
```
Checks if Ollama is running on the configured URL (default: `localhost:11434`).
### `get_ollama_install_guide`
```typescript
getOllamaInstallGuideCmd(platform: string) → InstallGuide
```
Returns platform-specific install instructions for Ollama.
### `list_ollama_models`
```typescript
listOllamaModelsCmd() → OllamaModel[]
```
Lists all locally available Ollama models.
### `pull_ollama_model`
```typescript
pullOllamaModelCmd(modelName: string) → void
```
Downloads a model from the Ollama registry. Streams progress.
### `delete_ollama_model`
```typescript
deleteOllamaModelCmd(modelName: string) → void
```
Removes a model from local storage.
### `detect_hardware`
```typescript
detectHardwareCmd() → HardwareInfo
```
Probes CPU, RAM, GPU. Returns hardware specifications.
### `recommend_models`
```typescript
recommendModelsCmd() → ModelRecommendation[]
```
Returns model recommendations based on detected hardware.
```typescript
interface ModelRecommendation {
name: string;
size: string; // e.g., "2.0 GB" — a String, not a number
reason: string;
}
```
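Because `size` is a display string rather than a number, a frontend that wants to sort recommendations (e.g. smallest-first for low-RAM machines) needs a parse step. A hypothetical helper, not part of the shipped API:

```typescript
// Parse display sizes like "2.0 GB" into bytes (decimal units assumed).
function sizeToBytes(size: string): number {
  const m = size.trim().match(/^([\d.]+)\s*(KB|MB|GB|TB)$/i);
  if (!m) return NaN;
  const units: Record<string, number> = { KB: 1e3, MB: 1e6, GB: 1e9, TB: 1e12 };
  return parseFloat(m[1]) * units[m[2].toUpperCase()];
}

// Sort smallest-first without mutating the input array.
function sortBySize<T extends { size: string }>(recs: T[]): T[] {
  return [...recs].sort((a, b) => sizeToBytes(a.size) - sizeToBytes(b.size));
}
```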
### `get_settings`
```typescript
getSettingsCmd() → AppSettings
```
Reads app settings from the `settings` table.
### `update_settings`
```typescript
updateSettingsCmd(partial: Partial<AppSettings>) → AppSettings
```
Merges partial settings and persists to DB.
### `get_audit_log`
```typescript
getAuditLogCmd(filter: AuditLogFilter) → AuditEntry[]
```
Returns audit log entries. Filter by action, entity_type, date range.
---
## Integration Commands (v0.2 Stubs)
All 6 integration commands currently return `"not yet available"` errors.
| Command | Purpose |
|---------|---------|
| `test_confluence_connection` | Verify Confluence credentials |
| `publish_to_confluence` | Publish RCA/postmortem to Confluence space |
| `test_servicenow_connection` | Verify ServiceNow credentials |
| `create_servicenow_incident` | Create incident from issue |
| `test_azuredevops_connection` | Verify Azure DevOps credentials |
| `create_azuredevops_workitem` | Create work item from issue |
`docs/wiki/Integrations.md`
# Integrations
> **Status: All integrations are v0.2 stubs.** They are implemented as placeholder commands that return `"not yet available"` errors. The authentication framework and command signatures are finalized, but the actual API calls are not yet implemented.
---
## Confluence
**Purpose:** Publish RCA and post-mortem documents to a Confluence space.
**Commands:**
- `test_confluence_connection(base_url, credentials)` — Verify credentials
- `publish_to_confluence(doc_id, space_key, parent_page_id?)` — Create/update page
**Planned implementation:**
- Confluence REST API v2: `POST /wiki/rest/api/content`
- Auth: Basic auth (email + API token) or OAuth2
- Page format: Convert Markdown → Confluence storage format (XHTML-like)
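As a rough illustration of the Markdown → storage-format step, a minimal line-based converter might look like the following. This is a sketch under assumptions: a real implementation would use a proper Markdown AST library, and Confluence's storage format has many constructs beyond headings and paragraphs.

```typescript
// Minimal Markdown → XHTML-like storage format sketch (headings + paragraphs
// only). HTML-escapes content; everything else falls through as <p> text.
function mdToStorage(md: string): string {
  const esc = (s: string) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  return md
    .split("\n")
    .map((line) => {
      const h = line.match(/^(#{1,6})\s+(.*)$/);
      if (h) return `<h${h[1].length}>${esc(h[2])}</h${h[1].length}>`;
      if (line.trim() === "") return "";
      return `<p>${esc(line)}</p>`;
    })
    .join("");
}
```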
**Configuration (Settings → Integrations → Confluence):**
```
Base URL: https://yourorg.atlassian.net
Email: user@example.com
API Token: (stored in Stronghold)
Space Key: PROJ
```
---
## ServiceNow
**Purpose:** Create incident records in ServiceNow from TFTSR issues.
**Commands:**
- `test_servicenow_connection(instance_url, credentials)` — Verify credentials
- `create_servicenow_incident(issue_id, config)` — Create incident
**Planned implementation:**
- ServiceNow Table API: `POST /api/now/table/incident`
- Auth: Basic auth or OAuth2 bearer token
- Field mapping: TFTSR severity → ServiceNow priority (P1=Critical, P2=High, etc.)
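A sketch of the planned field mapping. The numeric priority values and the payload field names are assumptions based on common ServiceNow conventions, not a finalized contract:

```typescript
// Hypothetical severity → priority mapping (P1=Critical, P2=High, etc.).
const SEVERITY_TO_PRIORITY: Record<string, { value: number; label: string }> = {
  P1: { value: 1, label: "Critical" },
  P2: { value: 2, label: "High" },
  P3: { value: 3, label: "Moderate" },
  P4: { value: 4, label: "Low" },
};

// Build the body for POST /api/now/table/incident (field names assumed).
function toIncidentPayload(issue: { title: string; severity: string }) {
  const p = SEVERITY_TO_PRIORITY[issue.severity] ?? { value: 4, label: "Low" };
  return { short_description: issue.title, priority: p.value };
}
```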
**Configuration:**
```
Instance URL: https://yourorg.service-now.com
Username: admin
Password: (stored in Stronghold)
```
---
## Azure DevOps
**Purpose:** Create work items (bugs/incidents) in Azure DevOps from TFTSR issues.
**Commands:**
- `test_azuredevops_connection(org_url, credentials)` — Verify credentials
- `create_azuredevops_workitem(issue_id, project, config)` — Create work item
**Planned implementation:**
- Azure DevOps REST API: `POST /{organization}/{project}/_apis/wit/workitems/${type}`
- Auth: Personal Access Token (PAT) via Basic auth header
- Work item type: Bug or Incident
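PAT auth over Basic follows the usual Azure DevOps convention: an empty username with the PAT as the password, i.e. `base64(":" + pat)`. A small sketch of building that header:

```typescript
import { Buffer } from "node:buffer";

// Azure DevOps PAT → Authorization header value.
// Basic auth with an empty username and the PAT as the password.
function patAuthHeader(pat: string): string {
  return `Basic ${Buffer.from(`:${pat}`).toString("base64")}`;
}
```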
**Configuration:**
```
Organization URL: https://dev.azure.com/yourorg
Personal Access Token: (stored in Stronghold)
Project: MyProject
Work Item Type: Bug
```
---
## v0.2 Roadmap
Integration implementation order (planned):
1. **Confluence** — Most commonly requested; Markdown-to-Confluence conversion library needed
2. **Azure DevOps** — Clean REST API, straightforward PAT auth
3. **ServiceNow** — More complex field mapping; may require customer-specific configuration
Each integration will also require:
- Audit log entry on every publish action
- PII check on document content before external publish
- Connection test UI in Settings → Integrations
---
## Adding an Integration
1. Implement the logic in `src-tauri/src/integrations/{name}.rs`
2. Remove the stub `Err("not yet available")` return in `commands/integrations.rs`
3. Add the new API endpoint to the Tauri CSP `connect-src`
4. Add Stronghold secret key for the API credentials
5. Wire up the Settings UI in `src/pages/Settings/Integrations.tsx`
6. Add audit log call before the external API request
`docs/wiki/PII-Detection.md`
# PII Detection
## Overview
Before any text is sent to an AI provider, TFTSR scans it for personally identifiable information (PII). Users must review and approve each detected span before the redacted text is transmitted.
## Detection Flow
```
1. Upload log file
2. detect_pii(log_file_id)
→ Scans content with 13 regex patterns
→ Resolves overlapping matches (longest wins)
→ Returns Vec<PiiSpan> with byte offsets + replacements
3. User reviews spans in PiiDiffViewer (before/after diff)
→ Approves or rejects each span
4. apply_redactions(log_file_id, approved_span_ids)
→ Rewrites text with replacements (iterates spans in REVERSE order to preserve offsets)
→ Records SHA-256 hash of redacted text in audit_log
5. Redacted text safe to send to AI
```
## Detection Patterns (13 Types)
| Type | Replacement | Pattern notes |
|------|-------------|---------------|
| `UrlWithCredentials` | `[URL]` | `scheme://user:pass@host` |
| `BearerToken` | `[Bearer]` | Case-insensitive `bearer` keyword + token chars |
| `ApiKey` | `[ApiKey]` | `api_key=`, `apikey=`, `access_token=` + 16+ char value |
| `Password` | `[Password]` | `password=`, `passwd=`, `pwd=` + non-whitespace value |
| `Ssn` | `[SSN]` | `\b\d{3}-\d{2}-\d{4}\b` |
| `CreditCard` | `[CreditCard]` | Visa/MC/Amex Luhn-format numbers |
| `Email` | `[Email]` | RFC-compliant email addresses |
| `MacAddress` | `[MAC]` | `XX:XX:XX:XX:XX:XX` and `XX-XX-XX-XX-XX-XX` |
| `Ipv6` | `[IPv6]` | Full and compressed IPv6 addresses |
| `Ipv4` | `[IPv4]` | Standard dotted-quad notation |
| `PhoneNumber` | `[Phone]` | US and international phone formats |
| `Hostname` | _(patterns.rs)_ | Configurable hostname patterns |
| `UrlCredentials` | _(covered by UrlWithCredentials)_ | |
## Overlap Resolution
When two patterns match overlapping text, the **longer match wins**:
```rust
let mut filtered: Vec<PiiSpan> = Vec::new();
for span in sorted_by_start {
if let Some(last) = filtered.last() {
if span.start < last.end {
// Overlap: keep the longer span
if span.end - span.start > last.end - last.start {
filtered.pop();
filtered.push(span);
}
continue;
}
}
filtered.push(span);
}
```
## PiiSpan Struct
```rust
pub struct PiiSpan {
pub id: String, // UUID v7
pub pii_type: PiiType,
pub start: usize, // byte offset in original text
pub end: usize,
pub original_value: String,
pub replacement: String, // e.g., "[IPv4]"
}
```
## Redaction Algorithm
Spans are applied in **reverse order** to preserve byte offsets:
```rust
let mut redacted = original.to_string();
for span in approved_spans.iter().rev() { // reverse!
redacted.replace_range(span.start..span.end, &span.replacement);
}
```
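The same trick, shown in TypeScript with a concrete worked example (illustrative only — the shipped code is the Rust above, and Rust offsets are bytes while JS string indices are UTF-16 code units):

```typescript
interface Span { start: number; end: number; replacement: string }

// Applying spans back-to-front means every earlier span's offsets are still
// valid when a replacement changes the string length.
function applyRedactions(original: string, spans: Span[]): string {
  let out = original;
  for (const s of [...spans].sort((a, b) => b.start - a.start)) {
    out = out.slice(0, s.start) + s.replacement + out.slice(s.end);
  }
  return out;
}
```

Applying in forward order instead would shift the second span's offsets as soon as the first replacement changed the length, redacting the wrong characters.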
## Audit Logging
Every redaction and every AI send is logged:
```rust
write_audit_event(
&conn,
"ai_send", // action
"issue", // entity_type
&issue_id, // entity_id
&json!({
"log_file_ids": [...],
"redacted_hash": sha256_hex, // SHA-256 of redacted text
"provider": provider_name,
}).to_string(),
)?;
```
## Security Guarantees
- PII detection runs **locally** — original text never leaves the machine
- Only the redacted text is sent to AI providers
- The SHA-256 hash in the audit log allows integrity verification
- If redaction is skipped (no PII detected), the audit log still records the send
`docs/wiki/Security-Model.md`
# Security Model
## Threat Model Summary
TFTSR handles sensitive IT incident data including log files that may contain credentials, PII, and internal infrastructure details. The security model addresses:
1. **Data at rest** — Database encryption
2. **Data in transit** — PII redaction before AI send, TLS for all outbound requests
3. **Secret storage** — API keys in Stronghold vault
4. **Audit trail** — Complete log of every external data transmission
5. **Least privilege** — Minimal Tauri capabilities
---
## Database Encryption (SQLCipher AES-256)
Production builds use SQLCipher:
- **Cipher:** AES-256-CBC
- **KDF:** PBKDF2-HMAC-SHA512, 256,000 iterations
- **HMAC:** HMAC-SHA512
- **Page size:** 4096 bytes
- **Key source:** `TFTSR_DB_KEY` environment variable
Debug builds use plain SQLite (no encryption) for developer convenience.
> ⚠️ **Never** use the default key (`dev-key-change-in-prod`) in a production environment.
---
## API Key Storage (Stronghold)
AI provider API keys are stored in `tauri-plugin-stronghold` — an encrypted vault backed by the [IOTA Stronghold](https://github.com/iotaledger/stronghold.rs) library.
The vault is initialized with a password-derived key using Argon2. API keys are never written to disk in plaintext or to the SQLite database.
---
## PII Redaction
**Mandatory path:** No text can be sent to an AI provider without going through the PII detection and user-approval flow.
```
log file → detect_pii() → user approves spans → apply_redactions() → AI provider
```
- Original text **never leaves the machine**
- Only the redacted version is transmitted
- The SHA-256 hash of the redacted text is recorded in the audit log for integrity verification
- See [PII Detection](PII-Detection) for the full list of detected patterns
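Verifying that hash later is straightforward: recompute SHA-256 over the redacted text and compare it with the hex digest stored in the audit entry's `details` JSON. A sketch (`verifyRedactedHash` is a hypothetical helper):

```typescript
import { createHash } from "node:crypto";

// Integrity check: does the redacted text still match the audited digest?
function verifyRedactedHash(redactedText: string, auditedHex: string): boolean {
  const actual = createHash("sha256").update(redactedText, "utf8").digest("hex");
  return actual === auditedHex.toLowerCase();
}
```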
---
## Audit Log
Every external data transmission is recorded:
```rust
write_audit_event(
&conn,
action, // "ai_send", "publish_to_confluence", etc.
entity_type, // "issue", "document"
entity_id, // UUID of the related record
details, // JSON: provider, model, hashes, log_file_ids
)?;
```
The audit log is stored in the encrypted SQLite database. It cannot be deleted through the UI.
**Audit entry fields:**
- `action` — what was done
- `entity_type` — type of record involved
- `entity_id` — UUID of that record
- `user_id` — always `"local"` (single-user app)
- `details` — JSON blob with hashes and metadata
- `timestamp` — UTC datetime
---
## Tauri Capabilities (Least Privilege)
Defined in `src-tauri/capabilities/default.json`:
| Plugin | Permissions granted |
|--------|-------------------|
| `dialog` | `allow-open`, `allow-save` |
| `fs` | `read-text`, `write-text`, `read`, `write`, `mkdir` — scoped to app dir and temp |
| `shell` | `allow-execute` — for running system commands |
| `http` | default — connect only to approved origins |
---
## Content Security Policy
```
default-src 'self';
style-src 'self' 'unsafe-inline';
img-src 'self' data: asset: https:;
connect-src 'self'
http://localhost:11434
https://api.openai.com
https://api.anthropic.com
https://api.mistral.ai
https://generativelanguage.googleapis.com;
```
HTTP is blocked by default. Only whitelisted HTTPS endpoints (and localhost for Ollama) are reachable.
---
## TLS
All outbound HTTP requests use `reqwest` with default TLS settings (TLS 1.2+ required). Certificate verification is enabled. No custom trust anchors are added.
---
## Security Checklist for New Features
- [ ] Does it send data externally? → Add audit log entry
- [ ] Does it handle user-provided text? → Run PII detection first
- [ ] Does it store secrets? → Use Stronghold, not the SQLite DB
- [ ] Does it need filesystem access? → Scope the fs capability
- [ ] Does it need a new HTTP endpoint? → Add to CSP `connect-src`
# Troubleshooting
## CI/CD
### Builds Not Triggering After Push
**Cause:** Woodpecker 0.15.4 `token.ParseRequest()` does not read `?token=` URL params.
**Fix:** Webhook URL must use `?access_token=<JWT>` (not `?token=`).
```
http://172.0.0.29:8084/hook?access_token=<JWT>
```
Regenerate the JWT if it's stale (JWT has an `iat` claim):
```bash
# JWT payload: {"text":"sarman/tftsr-devops_investigation","type":"hook"}
# Signed with: repo_hash (dK8zFWtAu67qfKd3Et6N8LptqTmedumJ)
```
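Regenerating the token by hand can be sketched as a standard HS256 JWT over the payload shown above, signed with the repo hash. This assumes Woodpecker 0.15.4's hook JWTs are plain HS256 and that `text`/`type` are the only required claims (real tokens may also carry `iat`):

```typescript
import { createHmac } from "node:crypto";
import { Buffer } from "node:buffer";

// base64url without padding, as JWTs require.
const b64url = (s: string) =>
  Buffer.from(s).toString("base64").replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");

function hookJwt(repoFullName: string, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = b64url(JSON.stringify({ text: repoFullName, type: "hook" }));
  const sig = createHmac("sha256", secret)
    .update(`${header}.${payload}`)
    .digest("base64")
    .replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
  return `${header}.${payload}.${sig}`;
}
```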
---
### Pipeline Step Can't Reach Gogs
**Cause:** Step containers run on the default Docker bridge, not on `gogs_default` network.
**Fix:** Use `network_mode: gogs_default` in the clone section and ensure `repo_trusted=1`:
```bash
docker exec woodpecker_db sqlite3 /data/woodpecker.sqlite \
"UPDATE repos SET repo_trusted=1 WHERE repo_full_name='sarman/tftsr-devops_investigation';"
```
---
### Woodpecker Login Fails
**Cause:** Gogs 0.14 SPA login form uses `login=` field; backend reads `username=`.
**Fix:** Use the nginx proxy at `http://172.0.0.29:8085/login` which serves a corrected login form.
---
### Empty Clone URL in Pipeline
**Cause:** Woodpecker 0.15.4 `go-gogs-client` `PayloadRepo` struct is missing `CloneURL`.
**Fix:** Override with `CI_REPO_CLONE_URL` environment variable in the clone section:
```yaml
clone:
git:
environment:
- CI_REPO_CLONE_URL=http://gogs_app:3000/sarman/tftsr-devops_investigation.git
```
---
## Rust Compilation
### `MutexGuard` Not `Send` Across Await
**Error:**
```
error[E0277]: `MutexGuard<'_, Connection>` cannot be sent between threads safely
```
**Fix:** Release the mutex lock before any `.await` point:
```rust
// ✅ Correct
let result = {
let db = state.db.lock().map_err(|e| e.to_string())?;
db.query_row(...)?
}; // lock dropped here
async_fn().await?;
```
---
### Clippy Lints Fail in CI
Common lint fixes:
```rust
// uninlined_format_args
format!("{}", x) → format!("{x}")
// manual_range_contains
x >= a && x < b → (a..b).contains(&x)
// push_str single char
s.push_str("a") → s.push('a')
```
Run locally: `cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings`
---
### `cargo tauri dev` Fails — Missing System Libraries
**Fix (Fedora/RHEL):**
```bash
sudo dnf install -y glib2-devel gtk3-devel webkit2gtk4.1-devel \
libsoup3-devel openssl-devel librsvg2-devel
```
---
## Database
### DB Won't Open in Production
**Symptom:** App fails to start with SQLCipher error.
**Check:**
1. `TFTSR_DB_KEY` env var is set
2. Key matches what was used when DB was created
3. File isn't corrupted (try `file tftsr.db` — should say `SQLite 3.x database`)
**Warning:** Changing the key requires re-encrypting the database:
```bash
sqlite3 tftsr.db "ATTACH 'new.db' AS newdb KEY 'new-key'; \
SELECT sqlcipher_export('newdb'); DETACH DATABASE newdb;"
```
---
### Migration Fails to Run
Check which migrations have already been applied:
```sql
SELECT name, applied_at FROM _migrations ORDER BY id;
```
If a migration is partially applied, the DB may be in an inconsistent state. Restore from backup or recreate.
---
## Frontend
### TypeScript Errors After Pulling
Run a fresh type check:
```bash
npx tsc --noEmit
```
Ensure `tauriCommands.ts` matches the Rust command signatures exactly (especially `IssueDetail` nesting).
---
### `IssueDetail` Field Access Errors
The `get_issue` command returns a **nested** struct:
```typescript
// ✅ Correct
const title = detail.issue.title;
const severity = detail.issue.severity;
// ❌ Wrong — these fields don't exist at the top level
const title = detail.title;
```
---
### Vitest Tests Fail
```bash
npm run test:run
```
Common causes:
- Mocked `invoke()` return type doesn't match updated command signature
- `sessionStore` state not reset between tests (call `store.reset()` in `beforeEach`)
---
## Gogs
### Token Authentication
The `sha1` field from the Gogs token create API **is** the bearer token — use it directly:
```bash
curl -H "Authorization: token <sha1_value>" https://gogs.tftsr.com/api/v1/user
```
Do not confuse with the `sha1` column in the `access_token` table, which stores `sha1(token)[:40]`.
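The stored-hash relationship can be sketched like this (a full SHA-1 hex digest is already exactly 40 characters, so the `[:40]` truncation is effectively a no-op safeguard):

```typescript
import { createHash } from "node:crypto";

// What the Gogs `access_token.sha1` column stores for a given bearer token:
// hex SHA-1 of the token, truncated to 40 chars.
function tokenDbColumn(bearerToken: string): string {
  return createHash("sha1").update(bearerToken, "utf8").digest("hex").slice(0, 40);
}
```

Useful when auditing the DB: hash a token you hold and compare against the column to find its row.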
### PostgreSQL Access
```bash
docker exec gogs_postgres_db psql -U gogs -d gogsdb -c "SELECT id, lower_name, is_private FROM repository;"
```
Database is named `gogsdb`, not `gogs`.