Compare commits

...

10 Commits

Author SHA1 Message Date
Shaun Arman
0d8970a911 feat: add image attachment support with PII detection
- Add image_attachments table to database schema (migration 013)
- Implement image upload, list, delete, and clipboard paste commands
- Add image file PII detection with user approval workflow
- Register image attachment commands in Tauri IPC
- Update TypeScript types and frontend components
- Add unit tests for image attachment functionality
- Update README and wiki documentation
2026-04-08 19:25:12 -05:00
7112fbc0c1 Merge pull request 'feat(ci): add persistent pre-baked Docker builder images' (#21) from feat/persistent-ci-builders into master
Some checks failed
Build CI Docker Images / linux-amd64 (push) Failing after 1s
Build CI Docker Images / windows-cross (push) Failing after 3s
Build CI Docker Images / linux-arm64 (push) Failing after 1s
Auto Tag / autotag (push) Successful in 54s
Auto Tag / wiki-sync (push) Successful in 55s
Auto Tag / build-windows-amd64 (push) Has been cancelled
Auto Tag / build-linux-arm64 (push) Has been cancelled
Auto Tag / build-linux-amd64 (push) Has been cancelled
Auto Tag / build-macos-arm64 (push) Has been cancelled
Reviewed-on: #21
2026-04-06 02:15:36 +00:00
Shaun Arman
9b388e736d feat(ci): add persistent pre-baked Docker builder images
Some checks are pending
Test / rust-clippy (pull_request) Waiting to run
Test / rust-tests (pull_request) Waiting to run
Test / frontend-typecheck (pull_request) Waiting to run
Test / frontend-tests (pull_request) Waiting to run
Test / rust-fmt-check (pull_request) Successful in 3m57s
Add three Dockerfiles under .docker/ and a build-images.yml workflow that
pushes them to the local Gitea container registry (172.0.0.29:3000).

Each image pre-installs all system deps, Node.js 22, and the Rust cross-
compilation target so release builds can skip apt-get entirely:

  trcaa-linux-amd64:rust1.88-node22   — webkit2gtk, gtk3, all Tauri deps
  trcaa-windows-cross:rust1.88-node22 — mingw-w64, nsis, Windows target
  trcaa-linux-arm64:rust1.88-node22   — arm64 multiarch dev libs, Rust 1.88

build-images.yml triggers automatically when .docker/ changes on master
and supports workflow_dispatch for manual/first-time builds.

auto-tag.yml is NOT changed in this commit — switch it to use the new
images in the follow-up PR (after images are pushed to the registry).

One-time server setup required before first use:
  echo '{"insecure-registries":["172.0.0.29:3000"]}' \
    | sudo tee /etc/docker/daemon.json && sudo systemctl restart docker
2026-04-05 21:07:17 -05:00
ca9eec46d1 Merge pull request 'feat(ui): UI fixes, theme toggle, PII persistence, Ollama install instructions' (#20) from feat/ui-fixes-ollama-bundle-theme into master
Some checks failed
Auto Tag / autotag (push) Successful in 1m20s
Auto Tag / wiki-sync (push) Successful in 1m19s
Auto Tag / build-macos-arm64 (push) Successful in 6m24s
Auto Tag / build-linux-arm64 (push) Has been cancelled
Auto Tag / build-linux-amd64 (push) Has been cancelled
Auto Tag / build-windows-amd64 (push) Has been cancelled
Reviewed-on: #20
2026-04-06 01:54:36 +00:00
Shaun Arman
72625d590b refactor(ollama): remove download/install buttons — show plain install instructions only
Some checks failed
Test / rust-tests (pull_request) Has been cancelled
Test / frontend-typecheck (pull_request) Has been cancelled
Test / rust-clippy (pull_request) Has been cancelled
Test / frontend-tests (pull_request) Has been cancelled
Test / rust-fmt-check (pull_request) Has been cancelled
2026-04-05 20:53:57 -05:00
Shaun Arman
1be4c48690 fix(ci): remove all Ollama bundle download steps — use UI download button instead 2026-04-05 20:53:57 -05:00
Shaun Arman
ff69cf6b11 fix(ci): skip Ollama download on macOS build — runner has no access to GitHub binary assets 2026-04-05 20:53:57 -05:00
2ba0eaf97f Merge pull request 'feat(ui): fix model dropdown, auth prefill, PII persistence, theme toggle, Ollama bundle' (#19) from feat/ui-fixes-ollama-bundle-theme into master
Some checks failed
Auto Tag / autotag (push) Successful in 54s
Auto Tag / wiki-sync (push) Successful in 55s
Auto Tag / build-macos-arm64 (push) Failing after 19s
Auto Tag / build-windows-amd64 (push) Failing after 7m36s
Auto Tag / build-linux-arm64 (push) Has been cancelled
Auto Tag / build-linux-amd64 (push) Has been cancelled
Reviewed-on: #19
2026-04-06 01:12:34 +00:00
Shaun Arman
69b749bc62 style: apply cargo fmt to install_ollama_from_bundle
All checks were successful
Test / frontend-typecheck (pull_request) Successful in 1m49s
Test / frontend-tests (pull_request) Successful in 1m46s
Test / rust-fmt-check (pull_request) Successful in 5m36s
Test / rust-clippy (pull_request) Successful in 27m7s
Test / rust-tests (pull_request) Successful in 28m12s
2026-04-05 19:41:59 -05:00
Shaun Arman
b4b9f2a477 fix(security): add path canonicalization and actionable permission error in install_ollama_from_bundle
Some checks failed
Test / rust-fmt-check (pull_request) Has been cancelled
Test / frontend-tests (pull_request) Has been cancelled
Test / frontend-typecheck (pull_request) Has been cancelled
Test / rust-tests (pull_request) Has been cancelled
Test / rust-clippy (pull_request) Has been cancelled
2026-04-05 19:34:47 -05:00
31 changed files with 2047 additions and 225 deletions

View File

@ -0,0 +1,24 @@
# Pre-baked builder for Linux amd64 Tauri releases.
# All system dependencies are installed once here; CI jobs skip apt-get entirely.
# Rebuild when: Rust toolchain version changes, webkit2gtk/gtk major version changes,
# or Node.js major version changes. Tag format: rust<VER>-node<VER>
FROM rust:1.88-slim
RUN apt-get update -qq \
&& apt-get install -y -qq --no-install-recommends \
libwebkit2gtk-4.1-dev \
libssl-dev \
libgtk-3-dev \
libayatana-appindicator3-dev \
librsvg2-dev \
patchelf \
pkg-config \
curl \
perl \
jq \
git \
&& curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
RUN rustup target add x86_64-unknown-linux-gnu

View File

@ -0,0 +1,45 @@
# Pre-baked cross-compiler for Linux arm64 Tauri releases (runs on Linux amd64).
# Bakes in: amd64 cross-toolchain, arm64 multiarch dev libs, Node.js, and Rust.
# This image takes ~15 min to build but is only rebuilt when deps change.
# Rebuild when: Rust toolchain version, webkit2gtk/gtk major version, or Node.js changes.
# Tag format: rust<VER>-node<VER>
FROM ubuntu:22.04
ARG DEBIAN_FRONTEND=noninteractive
# Step 1: amd64 host tools and cross-compiler
RUN apt-get update -qq \
&& apt-get install -y -qq --no-install-recommends \
curl git gcc g++ make patchelf pkg-config perl jq \
gcc-aarch64-linux-gnu g++-aarch64-linux-gnu \
&& rm -rf /var/lib/apt/lists/*
# Step 2: Enable arm64 multiarch. Ubuntu uses ports.ubuntu.com for arm64 to avoid
# binary-all index conflicts with the amd64 archive.ubuntu.com mirror.
RUN dpkg --add-architecture arm64 \
&& sed -i 's|^deb http://archive.ubuntu.com|deb [arch=amd64] http://archive.ubuntu.com|g' /etc/apt/sources.list \
&& sed -i 's|^deb http://security.ubuntu.com|deb [arch=amd64] http://security.ubuntu.com|g' /etc/apt/sources.list \
&& printf '%s\n' \
'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy main restricted universe multiverse' \
'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-updates main restricted universe multiverse' \
'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-security main restricted universe multiverse' \
> /etc/apt/sources.list.d/arm64-ports.list \
&& apt-get update -qq \
&& apt-get install -y -qq --no-install-recommends \
libwebkit2gtk-4.1-dev:arm64 \
libssl-dev:arm64 \
libgtk-3-dev:arm64 \
librsvg2-dev:arm64 \
&& rm -rf /var/lib/apt/lists/*
# Step 3: Node.js 22
RUN curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
# Step 4: Rust 1.88 with arm64 cross-compilation target
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
--default-toolchain 1.88.0 --profile minimal --no-modify-path \
&& /root/.cargo/bin/rustup target add aarch64-unknown-linux-gnu
ENV PATH="/root/.cargo/bin:${PATH}"
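The `[arch=amd64]` pin in Step 2 is the subtle part: archive.ubuntu.com hosts no arm64 package indexes, so once arm64 multiarch is enabled, unpinned `deb` lines would make `apt-get update` fail. A throwaway sketch of the rewrite, run against a scratch file rather than `/etc/apt` (paths here are illustrative):

```shell
# Try the [arch=amd64] pin from Step 2 on a scratch copy of a sources.list
# line. The pin stops apt from requesting arm64 indexes from a mirror that
# does not publish them; arm64 packages then come only from ports.ubuntu.com.
srcs=$(mktemp)
echo 'deb http://archive.ubuntu.com/ubuntu jammy main restricted' > "$srcs"
sed -i 's|^deb http://archive.ubuntu.com|deb [arch=amd64] http://archive.ubuntu.com|g' "$srcs"
cat "$srcs"
```

After the pin, the only arm64 sources are the `ports.ubuntu.com` entries the Dockerfile writes to `/etc/apt/sources.list.d/arm64-ports.list`.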

View File

@ -0,0 +1,20 @@
# Pre-baked cross-compiler for Windows amd64 Tauri releases (runs on Linux amd64).
# All MinGW and Node.js dependencies are installed once here; CI jobs skip apt-get entirely.
# Rebuild when: Rust toolchain version changes or Node.js major version changes.
# Tag format: rust<VER>-node<VER>
FROM rust:1.88-slim
RUN apt-get update -qq \
&& apt-get install -y -qq --no-install-recommends \
mingw-w64 \
curl \
nsis \
perl \
make \
jq \
git \
&& curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
RUN rustup target add x86_64-pc-windows-gnu

View File

@ -149,24 +149,6 @@ jobs:
pkg-config curl perl jq
curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
apt-get install -y nodejs
- name: Download Ollama
run: |
OLLAMA_VER=$(curl -fsSL https://api.github.com/repos/ollama/ollama/releases/latest \
| grep '"tag_name"' | cut -d'"' -f4)
mkdir -p src-tauri/resources/ollama /tmp/ollama-extract
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/ollama-linux-amd64.tgz" \
-o /tmp/ollama.tgz
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/sha256sums.txt" \
-o /tmp/ollama-sha256sums.txt
EXPECTED=$(awk '$2 == "ollama-linux-amd64.tgz" {print $1}' /tmp/ollama-sha256sums.txt)
if [ -z "$EXPECTED" ]; then echo "ERROR: SHA256 entry not found"; exit 1; fi
ACTUAL=$(sha256sum /tmp/ollama.tgz | awk '{print $1}')
if [ "$EXPECTED" != "$ACTUAL" ]; then echo "ERROR: SHA256 mismatch. Expected: $EXPECTED Got: $ACTUAL"; exit 1; fi
tar -xzf /tmp/ollama.tgz -C /tmp/ollama-extract/
cp "$(find /tmp/ollama-extract -name 'ollama' -type f | head -1)" src-tauri/resources/ollama/ollama
chmod +x src-tauri/resources/ollama/ollama
rm -rf /tmp/ollama.tgz /tmp/ollama-extract /tmp/ollama-sha256sums.txt
echo "Bundled Ollama ${OLLAMA_VER} (checksum verified)"
- name: Build
run: |
npm ci --legacy-peer-deps
@ -247,25 +229,9 @@ jobs:
git checkout FETCH_HEAD
- name: Install dependencies
run: |
apt-get update -qq && apt-get install -y -qq mingw-w64 curl nsis perl make jq unzip
apt-get update -qq && apt-get install -y -qq mingw-w64 curl nsis perl make jq
curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
apt-get install -y nodejs
- name: Download Ollama
run: |
OLLAMA_VER=$(curl -fsSL https://api.github.com/repos/ollama/ollama/releases/latest \
| grep '"tag_name"' | cut -d'"' -f4)
mkdir -p src-tauri/resources/ollama
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/ollama-windows-amd64.zip" \
-o /tmp/ollama-win.zip
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/sha256sums.txt" \
-o /tmp/ollama-sha256sums.txt
EXPECTED=$(awk '$2 == "ollama-windows-amd64.zip" {print $1}' /tmp/ollama-sha256sums.txt)
if [ -z "$EXPECTED" ]; then echo "ERROR: SHA256 entry not found"; exit 1; fi
ACTUAL=$(sha256sum /tmp/ollama-win.zip | awk '{print $1}')
if [ "$EXPECTED" != "$ACTUAL" ]; then echo "ERROR: SHA256 mismatch. Expected: $EXPECTED Got: $ACTUAL"; exit 1; fi
unzip -jo /tmp/ollama-win.zip 'ollama.exe' -d src-tauri/resources/ollama/
rm /tmp/ollama-win.zip /tmp/ollama-sha256sums.txt
echo "Bundled Ollama ${OLLAMA_VER} for Windows (checksum verified)"
- name: Build
env:
CC_x86_64_pc_windows_gnu: x86_64-w64-mingw32-gcc
@ -347,22 +313,6 @@ jobs:
git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
git fetch --depth=1 origin "$GITHUB_SHA"
git checkout FETCH_HEAD
- name: Download Ollama
run: |
OLLAMA_VER=$(curl -fsSL https://api.github.com/repos/ollama/ollama/releases/latest \
| python3 -c "import sys,json; print(json.load(sys.stdin)['tag_name'])")
mkdir -p src-tauri/resources/ollama
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/ollama-darwin" \
-o src-tauri/resources/ollama/ollama
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/sha256sums.txt" \
-o /tmp/ollama-sha256sums.txt
EXPECTED=$(awk '$2 == "ollama-darwin" {print $1}' /tmp/ollama-sha256sums.txt)
if [ -z "$EXPECTED" ]; then echo "ERROR: SHA256 entry not found"; exit 1; fi
ACTUAL=$(shasum -a 256 src-tauri/resources/ollama/ollama | awk '{print $1}')
if [ "$EXPECTED" != "$ACTUAL" ]; then echo "ERROR: SHA256 mismatch. Expected: $EXPECTED Got: $ACTUAL"; exit 1; fi
chmod +x src-tauri/resources/ollama/ollama
rm /tmp/ollama-sha256sums.txt
echo "Bundled Ollama ${OLLAMA_VER} for macOS (checksum verified)"
- name: Build
env:
MACOSX_DEPLOYMENT_TARGET: "11.0"
@ -489,24 +439,6 @@ jobs:
# source "$HOME/.cargo/env" in the Build step handles PATH — no GITHUB_PATH needed
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
--default-toolchain 1.88.0 --profile minimal --no-modify-path
- name: Download Ollama
run: |
OLLAMA_VER=$(curl -fsSL https://api.github.com/repos/ollama/ollama/releases/latest \
| grep '"tag_name"' | cut -d'"' -f4)
mkdir -p src-tauri/resources/ollama /tmp/ollama-extract
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/ollama-linux-arm64.tgz" \
-o /tmp/ollama.tgz
curl -fsSL "https://github.com/ollama/ollama/releases/download/${OLLAMA_VER}/sha256sums.txt" \
-o /tmp/ollama-sha256sums.txt
EXPECTED=$(awk '$2 == "ollama-linux-arm64.tgz" {print $1}' /tmp/ollama-sha256sums.txt)
if [ -z "$EXPECTED" ]; then echo "ERROR: SHA256 entry not found"; exit 1; fi
ACTUAL=$(sha256sum /tmp/ollama.tgz | awk '{print $1}')
if [ "$EXPECTED" != "$ACTUAL" ]; then echo "ERROR: SHA256 mismatch. Expected: $EXPECTED Got: $ACTUAL"; exit 1; fi
tar -xzf /tmp/ollama.tgz -C /tmp/ollama-extract/
cp "$(find /tmp/ollama-extract -name 'ollama' -type f | head -1)" src-tauri/resources/ollama/ollama
chmod +x src-tauri/resources/ollama/ollama
rm -rf /tmp/ollama.tgz /tmp/ollama-extract /tmp/ollama-sha256sums.txt
echo "Bundled Ollama ${OLLAMA_VER} (checksum verified)"
- name: Build
env:
CC_aarch64_unknown_linux_gnu: aarch64-linux-gnu-gcc
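All of the removed "Download Ollama" steps shared the same verify-then-extract pattern. A minimal standalone rehearsal of just the checksum gate, using a synthetic artifact so it runs without network access (file names below are placeholders, not real release assets):

```shell
# Self-contained rehearsal of the SHA-256 gate used by the removed steps.
# A synthetic artifact and sums file stand in for the real Ollama release.
set -e
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/artifact.tgz"
# sha256sum emits "<hash>  <name>", the same format upstream sha256sums.txt uses
( cd "$tmp" && sha256sum artifact.tgz > sha256sums.txt )
EXPECTED=$(awk '$2 == "artifact.tgz" {print $1}' "$tmp/sha256sums.txt")
[ -n "$EXPECTED" ] || { echo "ERROR: SHA256 entry not found"; exit 1; }
ACTUAL=$(sha256sum "$tmp/artifact.tgz" | awk '{print $1}')
[ "$EXPECTED" = "$ACTUAL" ] || { echo "ERROR: SHA256 mismatch"; exit 1; }
echo "checksum verified"
```

The `awk '$2 == name'` lookup is what makes a missing sums entry a hard failure rather than a silent skip.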

View File

@ -0,0 +1,107 @@
name: Build CI Docker Images
# Rebuilds the pre-baked builder images and pushes them to the local Gitea
# container registry (172.0.0.29:3000).
#
# WHEN TO RUN:
# - Automatically: whenever a Dockerfile under .docker/ changes on master.
# - Manually: via workflow_dispatch (e.g. first-time setup, forced rebuild).
#
# ONE-TIME SERVER PREREQUISITE (run once on 172.0.0.29 before first use):
# echo '{"insecure-registries":["172.0.0.29:3000"]}' \
# | sudo tee /etc/docker/daemon.json
# sudo systemctl restart docker
#
# Images produced:
# 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
# 172.0.0.29:3000/sarman/trcaa-windows-cross:rust1.88-node22
# 172.0.0.29:3000/sarman/trcaa-linux-arm64:rust1.88-node22
on:
push:
branches:
- master
paths:
- '.docker/**'
workflow_dispatch:
concurrency:
group: build-ci-images
cancel-in-progress: false
env:
REGISTRY: 172.0.0.29:3000
REGISTRY_USER: sarman
jobs:
linux-amd64:
runs-on: linux-amd64
container:
image: docker:24-cli
options: -v /var/run/docker.sock:/var/run/docker.sock
steps:
- name: Checkout
run: |
apk add --no-cache git
git init
git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
git fetch --depth=1 origin "$GITHUB_SHA"
git checkout FETCH_HEAD
- name: Build and push linux-amd64 builder
env:
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
run: |
echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
docker build \
-t $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22 \
-f .docker/Dockerfile.linux-amd64 .
docker push $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22
echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22"
windows-cross:
runs-on: linux-amd64
container:
image: docker:24-cli
options: -v /var/run/docker.sock:/var/run/docker.sock
steps:
- name: Checkout
run: |
apk add --no-cache git
git init
git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
git fetch --depth=1 origin "$GITHUB_SHA"
git checkout FETCH_HEAD
- name: Build and push windows-cross builder
env:
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
run: |
echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
docker build \
-t $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22 \
-f .docker/Dockerfile.windows-cross .
docker push $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22
echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22"
linux-arm64:
runs-on: linux-amd64
container:
image: docker:24-cli
options: -v /var/run/docker.sock:/var/run/docker.sock
steps:
- name: Checkout
run: |
apk add --no-cache git
git init
git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
git fetch --depth=1 origin "$GITHUB_SHA"
git checkout FETCH_HEAD
- name: Build and push linux-arm64 builder
env:
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
run: |
echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
docker build \
-t $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22 \
-f .docker/Dockerfile.linux-arm64 .
docker push $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22
echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22"

AGENTS.md — new file, 141 lines
View File

@ -0,0 +1,141 @@
# AGENTS.md
## Commands & Tools
### Development
- **Full dev server**: `cargo tauri dev` (requires `source ~/.cargo/env` first)
- **Frontend only**: `npm run dev` (Vite at localhost:1420)
- **Production build**: `cargo tauri build` → output in `src-tauri/target/release/bundle/`
### Testing & Verification
**Order matters:**
1. `cargo fmt --manifest-path src-tauri/Cargo.toml --check`
2. `cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings`
3. `cargo test --manifest-path src-tauri/Cargo.toml`
4. `npx tsc --noEmit`
5. `npm run test:run`
**Single Rust test**: `cargo test --manifest-path src-tauri/Cargo.toml pii::detector`
**Single Rust test by name**: `cargo test --manifest-path src-tauri/Cargo.toml test_detect_ipv4`
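The ordered checks above can be wrapped in a small fail-fast script. This is a sketch, not part of the repo: `run()` is a helper invented here, and `DRY_RUN` defaults to 1 so the sketch prints the sequence even without the toolchain installed.

```shell
# Run the five checks in CI order, aborting on the first failure.
# DRY_RUN defaults to 1 (print only); set DRY_RUN=0 inside the repo
# to actually execute the checks.
set -e
DRY_RUN=${DRY_RUN:-1}
run() { echo "+ $*"; [ "${DRY_RUN}" = 1 ] || "$@"; }
run cargo fmt --manifest-path src-tauri/Cargo.toml --check
run cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings
run cargo test --manifest-path src-tauri/Cargo.toml
run npx tsc --noEmit
run npm run test:run
```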
---
## Architecture Highlights
### Rust Backend (Tauri 2)
- **State**: `AppState` wraps `Mutex<Connection>` + `Mutex<AppSettings>` — lock inside `{ }` blocks and **release before `.await`**
- **IPC entry point**: `src-tauri/src/lib.rs` — `run()` registers all handlers in `generate_handler![]`
- **AI providers**: `ai/provider.rs::create_provider()` dispatches on `provider_type` (or `name` fallback)
- **PII before AI**: Every external send must call `apply_redactions()` and log SHA-256 hash via `audit::log::write_audit_event()`
- **DB encryption**: `debug_assertions` → plain SQLite; release → SQLCipher. Keys from `TFTSR_DB_KEY` / `TFTSR_ENCRYPTION_KEY` env vars
### Frontend (React + TypeScript)
- **IPC layer**: `src/lib/tauriCommands.ts` — single source of truth for typed `invoke()` wrappers
- **Stores** (Zustand):
- `sessionStore.ts`: Ephemeral triage session (not persisted)
- `settingsStore.ts`: AI providers, theme, Ollama URL — persisted to `localStorage` as `"tftsr-settings"`
- `historyStore.ts`: Read-only cache of past issues
- **Domain prompts**: `src/lib/domainPrompts.ts` — 8 IT domains injected as first message in triage conversations
### Key Data Types
- **IssueDetail** (Rust): Nested struct — `detail.issue.title`, NOT `detail.title`
- **IssueDetail** (TS): Mirrors Rust — use `issue.title`, `issue.status`, etc.
- **PII spans**: `PiiDetector::detect()` returns non-overlapping spans (longest wins on overlap), applies in reverse order
---
## CI/CD
**Branch protection**: `master` requires PR + `sarman` approval + all 5 CI checks green
**Gitea Actions workflows** (`.gitea/workflows/`):
- `test.yml`: rustfmt · clippy · cargo test (64) · tsc · vitest (13) — every push/PR
- `auto-tag.yml`: Auto-tag + multi-platform release build on push to `master`
**Runners**:
- `amd64-docker-runner`: linux/amd64 + windows/amd64 builds
- `arm64-native-runner`: native linux/arm64 builds
**CI test binary requirement**: `npm run test:e2e` needs `TAURI_BINARY_PATH=./src-tauri/target/release/tftsr`
---
## Database & Settings
**DB path**: `~/.local/share/tftsr/tftsr.db` (Linux), override via `TFTSR_DATA_DIR`
**SQLite schema**: `db/migrations.rs` tracks 10 migrations; schema in `_migrations` table
**Environment variables**:
- `TFTSR_DATA_DIR`: DB location override
- `TFTSR_DB_KEY`: SQLCipher encryption key (required in release builds)
- `TFTSR_ENCRYPTION_KEY`: API key encryption (required in release builds)
- `RUST_LOG`: tracing level (`debug`, `info`, `warn`, `error`)
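A launch-environment sketch for a release build. The keys are generated inline purely for illustration (hex format is an assumption here), and the data dir is an example path; real deployments should load stable keys from a secret store, since a SQLCipher DB cannot be reopened under a different key.

```shell
# Example release environment. Keys generated on the spot for illustration
# only; production keys must come from a secret manager and stay stable.
export TFTSR_DATA_DIR="/var/lib/tftsr"   # example override, not the default
export TFTSR_DB_KEY=$(head -c 32 /dev/urandom | od -An -tx1 | tr -d ' \n')
export TFTSR_ENCRYPTION_KEY=$(head -c 32 /dev/urandom | od -An -tx1 | tr -d ' \n')
export RUST_LOG=info
echo "${#TFTSR_DB_KEY}"   # 32 random bytes → 64 hex chars
```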
---
## PII Detection & Redaction
**Patterns detected**: IPv4/IPv6, emails, tokens, passwords, SSNs, credit cards
**Flow**:
1. `detect_pii(log_file_id)` → `Vec<PiiSpan>` (sorted, non-overlapping)
2. UI shows diff viewer for approval
3. `apply_redactions(log_file_id, approved_span_ids)` → creates redacted file
4. **Mandatory**: SHA-256 hash of redacted content logged via `audit::log::write_audit_event()` before any AI send
---
## AI Providers
**Supported**: OpenAI (compatible), Anthropic, Google Gemini, Mistral, Ollama (local)
**Adding a provider**:
1. Implement `Provider` trait in `ai/*.rs`
2. Add match arm in `ai/provider.rs::create_provider()`
**Ollama defaults**: `http://localhost:11434`, default model `llama3.2:3b` (≥8 GB RAM) or `llama3.1:8b` (≥16 GB RAM)
---
## Wiki Maintenance
**Source of truth**: `docs/wiki/` — automated sync to Gitea wiki on every push to `master`
**Update these files when changing code**:
| Area | Wiki file |
|------|-----------|
| Tauri commands | `IPC-Commands.md` |
| DB schema/migrations | `Database.md` |
| AI providers | `AI-Providers.md` |
| PII detection | `PII-Detection.md` |
| CI/CD changes | `CICD-Pipeline.md` |
| Architecture | `Architecture.md` |
| Security changes | `Security-Model.md` |
| Dev setup | `Development-Setup.md` |
---
## Common Gotchas
1. **Mutex deadlock**: Never hold `MutexGuard` across `.await` — release before async calls
2. **PII enforcement**: Redaction + hash audit is mandatory before any network send
3. **Testing order**: Run `cargo check`/`cargo fmt`/`cargo clippy` before `cargo test` (order matters for CI)
4. **DB encryption**: Release builds require `TFTSR_DB_KEY` and `TFTSR_ENCRYPTION_KEY` or will fail at runtime
5. **Type mismatch**: `get_issue()` returns `IssueDetail` with nested `issue` field in both Rust and TS
6. **Integration stubs**: `integrations/` modules are v0.2 stubs — not functionally complete
7. **Rust version**: 1.88+ required (for `cookie_store`, `time`, `darling`)
8. **Linux deps**: Must install webkit2gtk4.1 + libsoup3 + openssl before `cargo build`
---
## Prerequisites (Linux/Fedora)
```bash
sudo dnf install -y glib2-devel gtk3-devel webkit2gtk4.1-devel \
libsoup3-devel openssl-devel librsvg2-devel
```
**Rust**: `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh && source ~/.cargo/env`
**Node**: 22+ (via package manager)

View File

@ -18,6 +18,7 @@ Built with **Tauri 2** (Rust + WebView), **React 18**, **TypeScript**, and **SQL
- **Ollama Management** — Hardware detection, model recommendations, pull/delete models in-app
- **Audit Trail** — Every external data send logged with SHA-256 hash
- **Domain System Prompts** — Pre-built expert context for 8 IT domains (Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability)
- **Image Attachments** — Upload and manage image files with PII detection and mandatory user approval
- **Integrations** *(v0.2, coming soon)* — Confluence, ServiceNow, Azure DevOps
---

View File

@ -46,10 +46,11 @@ All command handlers receive `State<'_, AppState>` as a Tauri-injected parameter
| `commands/analysis.rs` | Log file upload, PII detection, redaction |
| `commands/docs.rs` | RCA and post-mortem generation, document export |
| `commands/system.rs` | Ollama management, hardware probe, settings, audit log |
| `commands/image.rs` | Image attachment upload, list, delete, paste |
| `commands/integrations.rs` | Confluence / ServiceNow / ADO — v0.2 stubs |
| `ai/provider.rs` | `Provider` trait + `create_provider()` factory |
| `pii/detector.rs` | Multi-pattern PII scanner with overlap resolution |
| `db/migrations.rs` | Versioned schema (10 migrations in `_migrations` table) |
| `db/migrations.rs` | Versioned schema (12 migrations in `_migrations` table) |
| `db/models.rs` | All DB types — see `IssueDetail` note below |
| `docs/rca.rs` + `docs/postmortem.rs` | Markdown template builders |
| `audit/log.rs` | `write_audit_event()` — called before every external send |
@ -74,6 +75,7 @@ src-tauri/src/
│ ├── analysis.rs
│ ├── docs.rs
│ ├── system.rs
│ ├── image.rs
│ └── integrations.rs
├── pii/
│ ├── patterns.rs
@ -186,6 +188,15 @@ Use `detail.issue.title`, **not** `detail.title`.
7. Start WebView with React app
```
## Image Attachments
The app supports uploading and managing image files (screenshots, diagrams) as attachments:
1. **Upload** via `uploadImageAttachmentCmd()` or `uploadPasteImageCmd()` (clipboard paste)
2. **PII detection** runs automatically on upload
3. **User approval** required before image is stored
4. **Database storage** in `image_attachments` table with SHA-256 hash
## Data Flow
```

View File

@ -2,7 +2,7 @@
## Overview
TFTSR uses **SQLite** via `rusqlite` with the `bundled-sqlcipher` feature for AES-256 encryption in production. 11 versioned migrations are tracked in the `_migrations` table.
TFTSR uses **SQLite** via `rusqlite` with the `bundled-sqlcipher` feature for AES-256 encryption in production. 12 versioned migrations are tracked in the `_migrations` table.
**DB file location:** `{app_data_dir}/tftsr.db`
@ -211,6 +211,29 @@ CREATE TABLE integration_config (
);
```
### 012 — image_attachments (v0.2.7+)
```sql
CREATE TABLE image_attachments (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
file_name TEXT NOT NULL,
file_path TEXT NOT NULL DEFAULT '',
file_size INTEGER NOT NULL DEFAULT 0,
mime_type TEXT NOT NULL DEFAULT 'image/png',
upload_hash TEXT NOT NULL DEFAULT '',
uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
pii_warning_acknowledged INTEGER NOT NULL DEFAULT 1,
is_paste INTEGER NOT NULL DEFAULT 0
);
```
**Features:**
- Image file metadata stored in database
- `upload_hash`: SHA-256 hash of file content (for deduplication)
- `pii_warning_acknowledged`: User confirmation that PII may be present
- `is_paste`: Flag for screenshots copied from clipboard
**Encryption:**
- OAuth2 tokens encrypted with AES-256-GCM
- Key derived from `TFTSR_DB_KEY` environment variable

View File

@ -32,12 +32,14 @@
- **Ollama Management** — Hardware detection, model recommendations, in-app model management
- **Audit Trail** — Every external data send logged with SHA-256 hash
- **Domain-Specific Prompts** — 8 IT domains: Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability
- **Image Attachments** — Upload and manage image files with PII detection and mandatory user approval
## Releases
| Version | Status | Highlights |
|---------|--------|-----------|
| v0.2.6 | 🚀 Latest | MSI GenAI support, OAuth2 shell permissions, user ID tracking |
| v0.2.5 | Released | Image attachments with PII detection and approval workflow |
| v0.2.3 | Released | Confluence/ServiceNow/ADO REST API clients (19 TDD tests) |
| v0.1.1 | Released | Core application with PII detection, RCA generation |

View File

@ -99,6 +99,34 @@ Rewrites file content with approved redactions. Records SHA-256 in audit log. Re
---
## Image Attachment Commands
### `upload_image_attachment`
```typescript
uploadImageAttachmentCmd(issueId: string, filePath: string, piiWarningAcknowledged: boolean) → ImageAttachment
```
Uploads an image file. Computes SHA-256, stores metadata in DB. Returns `ImageAttachment` record.
### `list_image_attachments`
```typescript
listImageAttachmentsCmd(issueId: string) → ImageAttachment[]
```
Lists all image attachments for an issue.
### `delete_image_attachment`
```typescript
deleteImageAttachmentCmd(imageId: string) → void
```
Deletes an image attachment from disk and database.
### `upload_paste_image`
```typescript
uploadPasteImageCmd(issueId: string, base64Data: string, fileName: string, piiWarningAcknowledged: boolean) → ImageAttachment
```
Uploads an image from clipboard paste (base64). Returns `ImageAttachment` record.
---
## AI Commands
### `analyze_logs`
@ -218,16 +246,6 @@ getAuditLogCmd(filter: AuditLogFilter) → AuditEntry[]
```
Returns audit log entries. Filter by action, entity_type, date range.
### `install_ollama_from_bundle`
```typescript
installOllamaFromBundleCmd() → string
```
Copies the Ollama binary bundled inside the app resources to the system install path:
- **Linux/macOS**: `/usr/local/bin/ollama` (requires write permission — user may need to run app with elevated privileges or `sudo`)
- **Windows**: `%LOCALAPPDATA%\Programs\Ollama\ollama.exe`
Returns a success message with the install path. Errors if the bundled binary is not present in the app resources (i.e., the app was built without an Ollama bundle step in CI).
---
## Integration Commands

View File

@ -0,0 +1,97 @@
# KohakuHub Deployment Summary
## Description
Deployed KohakuHub (a self-hosted HuggingFace-compatible model hub) on the existing Docker infrastructure at `172.0.0.29`, with NGINX reverse proxy at `172.0.0.30` and CoreDNS at `172.0.0.29`.
**Stack deployed:**
| Service | Image | Port(s) | Purpose |
|----------------------|---------------------------|--------------------------|------------------------|
| hub-ui | nginx:alpine | 28080:80 | Vue.js frontend SPA |
| hub-api | built from source | 127.0.0.1:48888:48888 | Python/FastAPI backend |
| minio | quay.io/minio/minio | 29001:9000, 29000:29000 | S3-compatible storage |
| lakefs | built from ./docker/lakefs| 127.0.0.1:28000:28000 | Git-style versioning |
| kohakuhub-postgres | postgres:15 | 127.0.0.1:25432:5432 | Metadata database |
**FQDNs:**
- `ai-hub.tftsr.com` → NGINX → `172.0.0.29:28080` (web UI + API proxy)
- `ai-hub-files.tftsr.com` → NGINX → `172.0.0.29:29001` (MinIO S3 file access)
All data stored locally under `/docker_mounts/kohakuhub/`.
---
## Acceptance Criteria
- [x] All 5 containers start and remain healthy
- [x] `https://ai-hub.tftsr.com` serves the KohakuHub Vue.js SPA
- [x] `https://ai-hub-files.tftsr.com` proxies to MinIO S3 API
- [x] DNS resolves both FQDNs to `172.0.0.30` (NGINX proxy)
- [x] hub-api connects to postgres and runs migrations on startup
- [x] MinIO `hub-storage` bucket is created automatically
- [x] LakeFS initializes with S3 blockstore backend
- [x] No port conflicts with existing stack services
- [x] Postgres container uses unique name (`kohakuhub-postgres`) to avoid conflict with `gogs_postgres_db`
- [x] API and LakeFS ports bound to `127.0.0.1` only (not externally exposed)
---
## Work Implemented
### Phase 1 — Docker Host (`172.0.0.29`)
1. **Downloaded KohakuHub source** via GitHub archive tarball (git not installed on host) to `/docker_mounts/kohakuhub/`
2. **Generated secrets** using `openssl rand -hex 32/16` for SESSION_SECRET, ADMIN_TOKEN, DB_KEY, LAKEFS_KEY, DB_PASS
3. **Created `/docker_mounts/kohakuhub/.env`** with `UID=1000` / `GID=1000` for LakeFS container user mapping
4. **Created `/docker_mounts/kohakuhub/docker-compose.yml`** with production configuration:
- Absolute volume paths under `/docker_mounts/kohakuhub/`
- Secrets substituted in-place
- Postgres container renamed to `kohakuhub-postgres`
- API/LakeFS/Postgres bound to `127.0.0.1` only
5. **Built Vue.js frontends** using `docker run node:22-alpine` (Node.js not installed on host):
- `src/kohaku-hub-ui/dist/` — main SPA
- `src/kohaku-hub-admin/dist/` — admin portal
6. **Started stack** with `docker compose up -d --build`; hub-api recovered from initial postgres race condition on its own
7. **Verified** all containers `Up`, API docs at `:48888/docs` returning HTTP 200, `hub-storage` bucket auto-created
### Phase 2 — NGINX Proxy (`172.0.0.30`)
1. **Created `/etc/nginx/conf.d/ai-hub.conf`** — proxies `ai-hub.tftsr.com` → `172.0.0.29:28080` with `client_max_body_size 100G`, 3600s timeouts, Let's Encrypt SSL
2. **Created `/etc/nginx/conf.d/ai-hub-files.conf`** — proxies `ai-hub-files.tftsr.com` → `172.0.0.29:29001` with the same settings
3. **Resolved write issue**: initial writes via `sudo tee` with piped password produced empty files (heredoc stdin conflict); corrected by writing to `/tmp` then `sudo cp`
4. **Validated and reloaded** NGINX: `nginx -t` passes (pre-existing `ssl_stapling` warnings are environment-wide, unrelated to these configs)
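The write issue in step 3 occurs whenever a heredoc is piped into `sudo tee` while sudo also reads its password from stdin — both compete for the same stream, and the heredoc content is consumed by the password prompt. The fix is to stage the file as an unprivileged user, then copy it into place in a separate step. A minimal sketch (the `sudo` step is commented out and a local target directory is used so the snippet runs without privileges):

```shell
#!/bin/sh
# Stage the config somewhere the current user can write...
STAGED=/tmp/ai-hub.conf
cat > "$STAGED" <<'EOF'
server {
    server_name ai-hub.tftsr.com;
    client_max_body_size 100G;
}
EOF

# ...then copy it into place as a separate command, so sudo's
# password prompt never shares stdin with the heredoc.
# Real deployment: sudo cp "$STAGED" /etc/nginx/conf.d/ai-hub.conf
TARGET_DIR=./conf.d
mkdir -p "$TARGET_DIR"
cp "$STAGED" "$TARGET_DIR/ai-hub.conf"
```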
### Phase 3 — CoreDNS (`172.0.0.29`)
1. **Updated `/docker_mounts/coredns/tftsr.com.db`**:
- SOA serial: `1718910701` → `2026040501`
- Appended: `ai-hub.tftsr.com. 3600 IN A 172.0.0.30`
- Appended: `ai-hub-files.tftsr.com. 3600 IN A 172.0.0.30`
2. **Reloaded CoreDNS** via `docker kill --signal=SIGHUP coredns`
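The Phase 3 change amounts to bumping the SOA serial and appending two A records, then signaling CoreDNS. Sketched below against a scratch copy of the zone file (the real file is `/docker_mounts/coredns/tftsr.com.db`; the scratch SOA record is illustrative, only the serial values are from this report):

```shell
#!/bin/sh
# Scratch zone file standing in for /docker_mounts/coredns/tftsr.com.db.
ZONE=./tftsr.com.db
cat > "$ZONE" <<'EOF'
@ 3600 IN SOA ns.tftsr.com. admin.tftsr.com. 1718910701 7200 3600 1209600 3600
EOF

# Bump the SOA serial so resolvers/secondaries see the zone changed...
sed -i 's/1718910701/2026040501/' "$ZONE"

# ...and append the two new A records pointing at the NGINX proxy.
cat >> "$ZONE" <<'EOF'
ai-hub.tftsr.com.       3600 IN A 172.0.0.30
ai-hub-files.tftsr.com. 3600 IN A 172.0.0.30
EOF

# Real deployment then reloads CoreDNS in place:
# docker kill --signal=SIGHUP coredns
```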
---
## Testing Needed
### Functional
- [ ] **Browser test**: Navigate to `https://ai-hub.tftsr.com` and verify login page, registration, and model browsing work
- [ ] **Admin portal**: Navigate to `https://ai-hub.tftsr.com/admin` and verify admin dashboard is accessible with the ADMIN_TOKEN
- [ ] **Model upload**: Upload a test model file and verify it lands in MinIO under `hub-storage`
- [ ] **Git clone**: Clone a model repo via `git clone https://ai-hub.tftsr.com/<user>/<repo>.git` and verify Git LFS works
- [ ] **File download**: Verify `https://ai-hub-files.tftsr.com` properly serves file download redirects
### Infrastructure
- [ ] **Restart persistence**: `docker compose down && docker compose up -d` on 172.0.0.29 — verify all services restart cleanly and data persists
- [ ] **LakeFS credentials**: Check `/docker_mounts/kohakuhub/hub-meta/hub-api/credentials.env` for generated LakeFS credentials
- [ ] **Admin token recovery**: Run `docker logs hub-api | grep -i admin` to retrieve the admin token if needed
- [ ] **MinIO console**: Verify MinIO console accessible at `http://172.0.0.29:29000` (internal only)
- [ ] **Postgres connectivity**: `docker exec kohakuhub-postgres psql -U kohakuhub -c "\dt"` to confirm schema migration ran
### Notes
- Frontend builds used temporary `node:22-alpine` Docker containers since Node.js is not installed on `172.0.0.29`. If redeployment is needed, re-run the build steps or install Node.js on the host.
- The `ssl_stapling` warnings in NGINX are pre-existing across all vhosts on `172.0.0.30` and do not affect functionality.
- MinIO credentials (`minioadmin`/`minioadmin`) are defaults. Consider rotating via the MinIO console for production hardening.

src-tauri/Cargo.lock (generated)

@ -2416,6 +2416,15 @@ dependencies = [
"serde_core",
]
[[package]]
name = "infer"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cb33622da908807a06f9513c19b3c1ad50fab3e4137d82a78107d502075aa199"
dependencies = [
"cfb",
]
[[package]]
name = "infer"
version = "0.19.0"
@ -5611,7 +5620,7 @@ dependencies = [
"glob",
"html5ever 0.29.1",
"http 1.4.0",
"infer",
"infer 0.19.0",
"json-patch",
"kuchikiki",
"log",
@ -5670,47 +5679,6 @@ dependencies = [
"utf-8",
]
[[package]]
name = "tftsr"
version = "0.1.0"
dependencies = [
"aes-gcm",
"aho-corasick",
"anyhow",
"async-trait",
"base64 0.22.1",
"chrono",
"dirs 5.0.1",
"docx-rs",
"futures",
"hex",
"lazy_static",
"mockito",
"printpdf",
"rand 0.8.5",
"regex",
"reqwest 0.12.28",
"rusqlite",
"serde",
"serde_json",
"sha2",
"tauri",
"tauri-build",
"tauri-plugin-dialog",
"tauri-plugin-fs",
"tauri-plugin-http",
"tauri-plugin-shell",
"tauri-plugin-stronghold",
"thiserror 1.0.69",
"tokio",
"tokio-test",
"tracing",
"tracing-subscriber",
"urlencoding",
"uuid",
"warp",
]
[[package]]
name = "thiserror"
version = "1.0.69"
@ -6168,6 +6136,48 @@ dependencies = [
"windows-sys 0.60.2",
]
[[package]]
name = "trcaa"
version = "0.1.0"
dependencies = [
"aes-gcm",
"aho-corasick",
"anyhow",
"async-trait",
"base64 0.22.1",
"chrono",
"dirs 5.0.1",
"docx-rs",
"futures",
"hex",
"infer 0.15.0",
"lazy_static",
"mockito",
"printpdf",
"rand 0.8.5",
"regex",
"reqwest 0.12.28",
"rusqlite",
"serde",
"serde_json",
"sha2",
"tauri",
"tauri-build",
"tauri-plugin-dialog",
"tauri-plugin-fs",
"tauri-plugin-http",
"tauri-plugin-shell",
"tauri-plugin-stronghold",
"thiserror 1.0.69",
"tokio",
"tokio-test",
"tracing",
"tracing-subscriber",
"urlencoding",
"uuid",
"warp",
]
[[package]]
name = "try-lock"
version = "0.2.5"


@ -43,6 +43,7 @@ rand = "0.8"
lazy_static = "1.4"
warp = "0.3"
urlencoding = "2"
infer = "0.15"
[dev-dependencies]
tokio-test = "0.4"


@ -1,8 +1,8 @@
use tauri::State;
use crate::db::models::{
AiConversation, AiMessage, Issue, IssueDetail, IssueFilter, IssueSummary, IssueUpdate, LogFile,
ResolutionStep,
AiConversation, AiMessage, ImageAttachment, Issue, IssueDetail, IssueFilter, IssueSummary,
IssueUpdate, LogFile, ResolutionStep,
};
use crate::state::AppState;
@ -100,6 +100,32 @@ pub async fn get_issue(
.filter_map(|r| r.ok())
.collect();
// Load image attachments
let mut img_stmt = db
.prepare(
"SELECT id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste \
FROM image_attachments WHERE issue_id = ?1 ORDER BY uploaded_at ASC",
)
.map_err(|e| e.to_string())?;
let image_attachments: Vec<ImageAttachment> = img_stmt
.query_map([&issue_id], |row| {
Ok(ImageAttachment {
id: row.get(0)?,
issue_id: row.get(1)?,
file_name: row.get(2)?,
file_path: row.get(3)?,
file_size: row.get(4)?,
mime_type: row.get(5)?,
upload_hash: row.get(6)?,
uploaded_at: row.get(7)?,
pii_warning_acknowledged: row.get::<_, i32>(8)? != 0,
is_paste: row.get::<_, i32>(9)? != 0,
})
})
.map_err(|e| e.to_string())?
.filter_map(|r| r.ok())
.collect();
// Load resolution steps (5-whys)
let mut rs_stmt = db
.prepare(
@ -148,6 +174,7 @@ pub async fn get_issue(
Ok(IssueDetail {
issue,
log_files,
image_attachments,
resolution_steps,
conversations,
})
@ -265,6 +292,11 @@ pub async fn delete_issue(issue_id: String, state: State<'_, AppState>) -> Resul
.map_err(|e| e.to_string())?;
db.execute("DELETE FROM log_files WHERE issue_id = ?1", [&issue_id])
.map_err(|e| e.to_string())?;
db.execute(
"DELETE FROM image_attachments WHERE issue_id = ?1",
[&issue_id],
)
.map_err(|e| e.to_string())?;
db.execute(
"DELETE FROM resolution_steps WHERE issue_id = ?1",
[&issue_id],


@ -0,0 +1,280 @@
use base64::Engine;
use sha2::Digest;
use std::path::Path;
use tauri::State;
use crate::audit::log::write_audit_event;
use crate::db::models::{AuditEntry, ImageAttachment};
use crate::state::AppState;
const MAX_IMAGE_FILE_BYTES: u64 = 10 * 1024 * 1024;
const SUPPORTED_IMAGE_MIME_TYPES: [&str; 5] = [
"image/png",
"image/jpeg",
"image/gif",
"image/webp",
"image/svg+xml",
];
fn validate_image_file_path(file_path: &str) -> Result<std::path::PathBuf, String> {
let path = Path::new(file_path);
let canonical = std::fs::canonicalize(path).map_err(|_| "Unable to access selected file")?;
let metadata = std::fs::metadata(&canonical).map_err(|_| "Unable to read file metadata")?;
if !metadata.is_file() {
return Err("Selected path is not a file".to_string());
}
if metadata.len() > MAX_IMAGE_FILE_BYTES {
return Err(format!(
"Image file exceeds maximum supported size ({} MB)",
MAX_IMAGE_FILE_BYTES / 1024 / 1024
));
}
Ok(canonical)
}
fn is_supported_image_format(mime_type: &str) -> bool {
SUPPORTED_IMAGE_MIME_TYPES.contains(&mime_type)
}
#[tauri::command]
pub async fn upload_image_attachment(
issue_id: String,
file_path: String,
state: State<'_, AppState>,
) -> Result<ImageAttachment, String> {
let canonical_path = validate_image_file_path(&file_path)?;
let content =
std::fs::read(&canonical_path).map_err(|_| "Failed to read selected image file")?;
let content_hash = format!("{:x}", sha2::Sha256::digest(&content));
let file_name = canonical_path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unknown")
.to_string();
let file_size = content.len() as i64;
let mime_type: String = infer::get(&content)
.map(|m| m.mime_type().to_string())
.unwrap_or_else(|| "image/png".to_string());
if !is_supported_image_format(mime_type.as_str()) {
return Err(format!(
"Unsupported image format: {}. Supported formats: {}",
mime_type,
SUPPORTED_IMAGE_MIME_TYPES.join(", ")
));
}
let canonical_file_path = canonical_path.to_string_lossy().to_string();
let attachment = ImageAttachment::new(
issue_id.clone(),
file_name,
canonical_file_path,
file_size,
mime_type,
content_hash.clone(),
true,
false,
);
let db = state.db.lock().map_err(|e| e.to_string())?;
db.execute(
"INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste) \
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
rusqlite::params![
attachment.id,
attachment.issue_id,
attachment.file_name,
attachment.file_path,
attachment.file_size,
attachment.mime_type,
attachment.upload_hash,
attachment.uploaded_at,
attachment.pii_warning_acknowledged as i32,
attachment.is_paste as i32,
],
)
.map_err(|_| "Failed to store uploaded image metadata".to_string())?;
let entry = AuditEntry::new(
"upload_image_attachment".to_string(),
"image_attachment".to_string(),
attachment.id.clone(),
serde_json::json!({
"issue_id": issue_id,
"file_name": attachment.file_name,
"is_paste": false,
})
.to_string(),
);
if let Err(err) = write_audit_event(
&db,
&entry.action,
&entry.entity_type,
&entry.entity_id,
&entry.details,
) {
tracing::warn!(error = %err, "failed to write upload_image_attachment audit entry");
}
Ok(attachment)
}
#[tauri::command]
pub async fn upload_paste_image(
issue_id: String,
base64_image: String,
mime_type: String,
state: State<'_, AppState>,
) -> Result<ImageAttachment, String> {
if !base64_image.starts_with("data:image/") {
return Err("Invalid image data - must be a data URL".to_string());
}
let data_part = base64_image
.split(',')
.nth(1)
.ok_or("Invalid image data format - missing base64 content")?;
let decoded = base64::engine::general_purpose::STANDARD
.decode(data_part)
.map_err(|_| "Failed to decode base64 image data")?;
let content_hash = format!("{:x}", sha2::Sha256::digest(&decoded));
let file_size = decoded.len() as i64;
let file_name = format!("pasted-image-{}.png", uuid::Uuid::now_v7());
if !is_supported_image_format(mime_type.as_str()) {
return Err(format!(
"Unsupported image format: {}. Supported formats: {}",
mime_type,
SUPPORTED_IMAGE_MIME_TYPES.join(", ")
));
}
let attachment = ImageAttachment::new(
issue_id.clone(),
file_name.clone(),
String::new(),
file_size,
mime_type,
content_hash,
true,
true,
);
let db = state.db.lock().map_err(|e| e.to_string())?;
db.execute(
"INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste) \
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
rusqlite::params![
attachment.id,
attachment.issue_id,
attachment.file_name,
attachment.file_path,
attachment.file_size,
attachment.mime_type,
attachment.upload_hash,
attachment.uploaded_at,
attachment.pii_warning_acknowledged as i32,
attachment.is_paste as i32,
],
)
.map_err(|_| "Failed to store pasted image metadata".to_string())?;
let entry = AuditEntry::new(
"upload_paste_image".to_string(),
"image_attachment".to_string(),
attachment.id.clone(),
serde_json::json!({
"issue_id": issue_id,
"file_name": attachment.file_name,
"is_paste": true,
})
.to_string(),
);
if let Err(err) = write_audit_event(
&db,
&entry.action,
&entry.entity_type,
&entry.entity_id,
&entry.details,
) {
tracing::warn!(error = %err, "failed to write upload_paste_image audit entry");
}
Ok(attachment)
}
#[tauri::command]
pub async fn list_image_attachments(
issue_id: String,
state: State<'_, AppState>,
) -> Result<Vec<ImageAttachment>, String> {
let db = state.db.lock().map_err(|e| e.to_string())?;
let mut stmt = db
.prepare(
"SELECT id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste \
FROM image_attachments WHERE issue_id = ?1 ORDER BY uploaded_at ASC",
)
.map_err(|e| e.to_string())?;
let attachments = stmt
.query_map([&issue_id], |row| {
Ok(ImageAttachment {
id: row.get(0)?,
issue_id: row.get(1)?,
file_name: row.get(2)?,
file_path: row.get(3)?,
file_size: row.get(4)?,
mime_type: row.get(5)?,
upload_hash: row.get(6)?,
uploaded_at: row.get(7)?,
pii_warning_acknowledged: row.get::<_, i32>(8)? != 0,
is_paste: row.get::<_, i32>(9)? != 0,
})
})
.map_err(|e| e.to_string())?
.filter_map(|r| r.ok())
.collect();
Ok(attachments)
}
#[tauri::command]
pub async fn delete_image_attachment(
attachment_id: String,
state: State<'_, AppState>,
) -> Result<(), String> {
let db = state.db.lock().map_err(|e| e.to_string())?;
let affected = db
.execute(
"DELETE FROM image_attachments WHERE id = ?1",
[&attachment_id],
)
.map_err(|e| e.to_string())?;
if affected == 0 {
return Err("Image attachment not found".to_string());
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_is_supported_image_format() {
assert!(is_supported_image_format("image/png"));
assert!(is_supported_image_format("image/jpeg"));
assert!(is_supported_image_format("image/gif"));
assert!(is_supported_image_format("image/webp"));
assert!(is_supported_image_format("image/svg+xml"));
assert!(!is_supported_image_format("image/bmp"));
assert!(!is_supported_image_format("text/plain"));
}
}


@ -2,5 +2,6 @@ pub mod ai;
pub mod analysis;
pub mod db;
pub mod docs;
pub mod image;
pub mod integrations;
pub mod system;


@ -141,52 +141,3 @@ pub async fn get_audit_log(
Ok(rows)
}
// Security note: the bundled binary's integrity is guaranteed by the CI release pipeline
// which verifies SHA256 checksums against Ollama's published sha256sums.txt before bundling.
// Runtime re-verification is not performed here; the app bundle itself is the trust boundary.
#[tauri::command]
pub async fn install_ollama_from_bundle(
app: tauri::AppHandle,
) -> Result<String, String> {
use std::fs;
use std::path::PathBuf;
use tauri::Manager;
let resource_path = app
.path()
.resource_dir()
.map_err(|e: tauri::Error| e.to_string())?
.join("ollama")
.join(if cfg!(windows) { "ollama.exe" } else { "ollama" });
if !resource_path.exists() {
return Err("Bundled Ollama not found in resources".to_string());
}
#[cfg(unix)]
let install_path = PathBuf::from("/usr/local/bin/ollama");
#[cfg(windows)]
let install_path = {
let local_app_data = std::env::var("LOCALAPPDATA").map_err(|e| e.to_string())?;
PathBuf::from(local_app_data)
.join("Programs")
.join("Ollama")
.join("ollama.exe")
};
if let Some(parent) = install_path.parent() {
fs::create_dir_all(parent).map_err(|e| e.to_string())?;
}
fs::copy(&resource_path, &install_path).map_err(|e| e.to_string())?;
#[cfg(unix)]
{
use std::os::unix::fs::PermissionsExt;
fs::set_permissions(&install_path, fs::Permissions::from_mode(0o755))
.map_err(|e| e.to_string())?;
}
Ok(format!("Ollama installed to {}", install_path.display()))
}


@ -155,6 +155,21 @@ pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {
"ALTER TABLE audit_log ADD COLUMN prev_hash TEXT NOT NULL DEFAULT '';
ALTER TABLE audit_log ADD COLUMN entry_hash TEXT NOT NULL DEFAULT '';",
),
(
"013_image_attachments",
"CREATE TABLE IF NOT EXISTS image_attachments (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
file_name TEXT NOT NULL,
file_path TEXT NOT NULL DEFAULT '',
file_size INTEGER NOT NULL DEFAULT 0,
mime_type TEXT NOT NULL DEFAULT 'image/png',
upload_hash TEXT NOT NULL DEFAULT '',
uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
pii_warning_acknowledged INTEGER NOT NULL DEFAULT 1,
is_paste INTEGER NOT NULL DEFAULT 0
);",
),
];
for (name, sql) in migrations {
@ -192,21 +207,21 @@ mod tests {
}
#[test]
fn test_create_credentials_table() {
fn test_create_image_attachments_table() {
let conn = setup_test_db();
// Verify table exists
let count: i64 = conn
.query_row(
"SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='credentials'",
"SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='image_attachments'",
[],
|r| r.get(0),
)
.unwrap();
assert_eq!(count, 1);
// Verify columns
let mut stmt = conn.prepare("PRAGMA table_info(credentials)").unwrap();
let mut stmt = conn
.prepare("PRAGMA table_info(image_attachments)")
.unwrap();
let columns: Vec<String> = stmt
.query_map([], |row| row.get::<_, String>(1))
.unwrap()
@ -214,11 +229,15 @@ mod tests {
.unwrap();
assert!(columns.contains(&"id".to_string()));
assert!(columns.contains(&"service".to_string()));
assert!(columns.contains(&"token_hash".to_string()));
assert!(columns.contains(&"encrypted_token".to_string()));
assert!(columns.contains(&"created_at".to_string()));
assert!(columns.contains(&"expires_at".to_string()));
assert!(columns.contains(&"issue_id".to_string()));
assert!(columns.contains(&"file_name".to_string()));
assert!(columns.contains(&"file_path".to_string()));
assert!(columns.contains(&"file_size".to_string()));
assert!(columns.contains(&"mime_type".to_string()));
assert!(columns.contains(&"upload_hash".to_string()));
assert!(columns.contains(&"uploaded_at".to_string()));
assert!(columns.contains(&"pii_warning_acknowledged".to_string()));
assert!(columns.contains(&"is_paste".to_string()));
}
#[test]
@ -389,4 +408,64 @@ mod tests {
assert_eq!(count, 1);
}
#[test]
fn test_store_and_retrieve_image_attachment() {
let conn = setup_test_db();
// Create an issue first (required for foreign key)
let now = chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string();
conn.execute(
"INSERT INTO issues (id, title, description, severity, status, category, source, created_at, updated_at, resolved_at, assigned_to, tags)
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12)",
rusqlite::params![
"test-issue-1",
"Test Issue",
"Test description",
"medium",
"open",
"test",
"manual",
now.clone(),
now.clone(),
None::<String>,
"",
"[]",
],
)
.unwrap();
// Now insert the image attachment
conn.execute(
"INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste)
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
rusqlite::params![
"test-img-1",
"test-issue-1",
"screenshot.png",
"/path/to/screenshot.png",
102400,
"image/png",
"abc123hash",
now,
1,
0,
],
)
.unwrap();
let (id, issue_id, file_name, mime_type, is_paste): (String, String, String, String, i32) = conn
.query_row(
"SELECT id, issue_id, file_name, mime_type, is_paste FROM image_attachments WHERE id = ?1",
["test-img-1"],
|r| Ok((r.get(0)?, r.get(1)?, r.get(2)?, r.get(3)?, r.get(4)?)),
)
.unwrap();
assert_eq!(id, "test-img-1");
assert_eq!(issue_id, "test-issue-1");
assert_eq!(file_name, "screenshot.png");
assert_eq!(mime_type, "image/png");
assert_eq!(is_paste, 0);
}
}


@ -0,0 +1,482 @@
use rusqlite::Connection;
/// Run all database migrations in order, tracking which have been applied.
pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {
conn.execute_batch(
"CREATE TABLE IF NOT EXISTS _migrations (
id INTEGER PRIMARY KEY,
name TEXT NOT NULL UNIQUE,
applied_at TEXT NOT NULL DEFAULT (datetime('now'))
);",
)?;
let migrations: &[(&str, &str)] = &[
(
"001_create_issues",
"CREATE TABLE IF NOT EXISTS issues (
id TEXT PRIMARY KEY,
title TEXT NOT NULL,
description TEXT NOT NULL DEFAULT '',
severity TEXT NOT NULL DEFAULT 'medium',
status TEXT NOT NULL DEFAULT 'open',
category TEXT NOT NULL DEFAULT 'general',
source TEXT NOT NULL DEFAULT 'manual',
created_at TEXT NOT NULL DEFAULT (datetime('now')),
updated_at TEXT NOT NULL DEFAULT (datetime('now')),
resolved_at TEXT,
assigned_to TEXT NOT NULL DEFAULT '',
tags TEXT NOT NULL DEFAULT '[]'
);",
),
(
"002_create_log_files",
"CREATE TABLE IF NOT EXISTS log_files (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
file_name TEXT NOT NULL,
file_path TEXT NOT NULL DEFAULT '',
file_size INTEGER NOT NULL DEFAULT 0,
mime_type TEXT NOT NULL DEFAULT 'text/plain',
content_hash TEXT NOT NULL DEFAULT '',
uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
redacted INTEGER NOT NULL DEFAULT 0
);",
),
(
"003_create_pii_spans",
"CREATE TABLE IF NOT EXISTS pii_spans (
id TEXT PRIMARY KEY,
log_file_id TEXT NOT NULL REFERENCES log_files(id) ON DELETE CASCADE,
pii_type TEXT NOT NULL,
start_offset INTEGER NOT NULL,
end_offset INTEGER NOT NULL,
original_value TEXT NOT NULL,
replacement TEXT NOT NULL
);",
),
(
"004_create_ai_conversations",
"CREATE TABLE IF NOT EXISTS ai_conversations (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
provider TEXT NOT NULL,
model TEXT NOT NULL,
created_at TEXT NOT NULL DEFAULT (datetime('now')),
title TEXT NOT NULL DEFAULT 'Untitled'
);",
),
(
"005_create_ai_messages",
"CREATE TABLE IF NOT EXISTS ai_messages (
id TEXT PRIMARY KEY,
conversation_id TEXT NOT NULL REFERENCES ai_conversations(id) ON DELETE CASCADE,
role TEXT NOT NULL CHECK(role IN ('system','user','assistant')),
content TEXT NOT NULL,
token_count INTEGER NOT NULL DEFAULT 0,
created_at TEXT NOT NULL DEFAULT (datetime('now'))
);",
),
(
"006_create_resolution_steps",
"CREATE TABLE IF NOT EXISTS resolution_steps (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
step_order INTEGER NOT NULL DEFAULT 0,
why_question TEXT NOT NULL DEFAULT '',
answer TEXT NOT NULL DEFAULT '',
evidence TEXT NOT NULL DEFAULT '',
created_at TEXT NOT NULL DEFAULT (datetime('now'))
);",
),
(
"007_create_documents",
"CREATE TABLE IF NOT EXISTS documents (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
doc_type TEXT NOT NULL,
title TEXT NOT NULL,
content_md TEXT NOT NULL,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL
);",
),
(
"008_create_audit_log",
"CREATE TABLE IF NOT EXISTS audit_log (
id TEXT PRIMARY KEY,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
action TEXT NOT NULL,
entity_type TEXT NOT NULL DEFAULT '',
entity_id TEXT NOT NULL DEFAULT '',
user_id TEXT NOT NULL DEFAULT 'local',
details TEXT NOT NULL DEFAULT '{}'
);",
),
(
"009_create_settings",
"CREATE TABLE IF NOT EXISTS settings (
key TEXT PRIMARY KEY,
value TEXT NOT NULL DEFAULT '',
updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);",
),
(
"010_issues_fts",
"CREATE VIRTUAL TABLE IF NOT EXISTS issues_fts USING fts5(
id UNINDEXED, title, description,
content='issues', content_rowid='rowid'
);",
),
(
"011_create_integrations",
"CREATE TABLE IF NOT EXISTS credentials (
id TEXT PRIMARY KEY,
service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
token_hash TEXT NOT NULL,
encrypted_token TEXT NOT NULL,
created_at TEXT NOT NULL DEFAULT (datetime('now')),
expires_at TEXT,
UNIQUE(service)
);
CREATE TABLE IF NOT EXISTS integration_config (
id TEXT PRIMARY KEY,
service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
base_url TEXT NOT NULL,
username TEXT,
project_name TEXT,
space_key TEXT,
auto_create_enabled INTEGER NOT NULL DEFAULT 0,
updated_at TEXT NOT NULL DEFAULT (datetime('now')),
UNIQUE(service)
);",
),
(
"012_audit_hash_chain",
"ALTER TABLE audit_log ADD COLUMN prev_hash TEXT NOT NULL DEFAULT '';
ALTER TABLE audit_log ADD COLUMN entry_hash TEXT NOT NULL DEFAULT '';",
),
(
"013_image_attachments",
"CREATE TABLE IF NOT EXISTS image_attachments (
id TEXT PRIMARY KEY,
issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
file_name TEXT NOT NULL,
file_path TEXT NOT NULL DEFAULT '',
file_size INTEGER NOT NULL DEFAULT 0,
mime_type TEXT NOT NULL DEFAULT 'image/png',
upload_hash TEXT NOT NULL DEFAULT '',
uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
pii_warning_acknowledged INTEGER NOT NULL DEFAULT 1,
is_paste INTEGER NOT NULL DEFAULT 0
);",
),
];
for (name, sql) in migrations {
let already_applied: bool = conn
.prepare("SELECT COUNT(*) FROM _migrations WHERE name = ?1")?
.query_row([name], |row| row.get::<_, i64>(0))
.map(|count| count > 0)?;
if !already_applied {
// FTS5 virtual table creation can be skipped if FTS5 is not compiled in
if let Err(e) = conn.execute_batch(sql) {
if name.contains("fts") {
tracing::warn!("FTS5 not available, skipping: {e}");
} else {
return Err(e.into());
}
}
conn.execute("INSERT INTO _migrations (name) VALUES (?1)", [name])?;
tracing::info!("Applied migration: {name}");
}
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
use rusqlite::Connection;
fn setup_test_db() -> Connection {
let conn = Connection::open_in_memory().unwrap();
run_migrations(&conn).unwrap();
conn
}
#[test]
fn test_create_integration_config_table() {
let conn = setup_test_db();
let count: i64 = conn
.query_row(
"SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='integration_config'",
[],
|r| r.get(0),
)
.unwrap();
assert_eq!(count, 1);
let mut stmt = conn
.prepare("PRAGMA table_info(integration_config)")
.unwrap();
let columns: Vec<String> = stmt
.query_map([], |row| row.get::<_, String>(1))
.unwrap()
.collect::<Result<Vec<_>, _>>()
.unwrap();
assert!(columns.contains(&"id".to_string()));
assert!(columns.contains(&"service".to_string()));
assert!(columns.contains(&"base_url".to_string()));
assert!(columns.contains(&"username".to_string()));
assert!(columns.contains(&"project_name".to_string()));
assert!(columns.contains(&"space_key".to_string()));
assert!(columns.contains(&"auto_create_enabled".to_string()));
assert!(columns.contains(&"updated_at".to_string()));
}
#[test]
fn test_create_image_attachments_table() {
let conn = setup_test_db();
let count: i64 = conn
.query_row(
"SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='image_attachments'",
[],
|r| r.get(0),
)
.unwrap();
assert_eq!(count, 1);
let mut stmt = conn
.prepare("PRAGMA table_info(image_attachments)")
.unwrap();
let columns: Vec<String> = stmt
.query_map([], |row| row.get::<_, String>(1))
.unwrap()
.collect::<Result<Vec<_>, _>>()
.unwrap();
assert!(columns.contains(&"id".to_string()));
assert!(columns.contains(&"issue_id".to_string()));
assert!(columns.contains(&"file_name".to_string()));
assert!(columns.contains(&"file_path".to_string()));
assert!(columns.contains(&"file_size".to_string()));
assert!(columns.contains(&"mime_type".to_string()));
assert!(columns.contains(&"upload_hash".to_string()));
assert!(columns.contains(&"uploaded_at".to_string()));
assert!(columns.contains(&"pii_warning_acknowledged".to_string()));
assert!(columns.contains(&"is_paste".to_string()));
}
#[test]
fn test_store_and_retrieve_credential() {
let conn = setup_test_db();
// Insert credential
conn.execute(
"INSERT INTO credentials (id, service, token_hash, encrypted_token, created_at)
VALUES (?1, ?2, ?3, ?4, ?5)",
rusqlite::params![
"test-id",
"confluence",
"test_hash",
"encrypted_test",
chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string()
],
)
.unwrap();
// Retrieve
let (service, token_hash): (String, String) = conn
.query_row(
"SELECT service, token_hash FROM credentials WHERE service = ?1",
["confluence"],
|r| Ok((r.get(0)?, r.get(1)?)),
)
.unwrap();
assert_eq!(service, "confluence");
assert_eq!(token_hash, "test_hash");
}
#[test]
fn test_store_and_retrieve_integration_config() {
let conn = setup_test_db();
// Insert config
conn.execute(
"INSERT INTO integration_config (id, service, base_url, space_key, auto_create_enabled, updated_at)
VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
rusqlite::params![
"test-config-id",
"confluence",
"https://example.atlassian.net",
"DEV",
1,
chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string()
],
)
.unwrap();
// Retrieve
let (service, base_url, space_key, auto_create): (String, String, String, i32) = conn
.query_row(
"SELECT service, base_url, space_key, auto_create_enabled FROM integration_config WHERE service = ?1",
["confluence"],
|r| Ok((r.get(0)?, r.get(1)?, r.get(2)?, r.get(3)?)),
)
.unwrap();
assert_eq!(service, "confluence");
assert_eq!(base_url, "https://example.atlassian.net");
assert_eq!(space_key, "DEV");
assert_eq!(auto_create, 1);
}
#[test]
fn test_service_uniqueness_constraint() {
let conn = setup_test_db();
// Insert first credential
conn.execute(
"INSERT INTO credentials (id, service, token_hash, encrypted_token, created_at)
VALUES (?1, ?2, ?3, ?4, ?5)",
rusqlite::params![
"test-id-1",
"confluence",
"hash1",
"token1",
chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string()
],
)
.unwrap();
// Try to insert duplicate service - should fail
let result = conn.execute(
"INSERT INTO credentials (id, service, token_hash, encrypted_token, created_at)
VALUES (?1, ?2, ?3, ?4, ?5)",
rusqlite::params![
"test-id-2",
"confluence",
"hash2",
"token2",
chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string()
],
);
assert!(result.is_err());
}
#[test]
fn test_migration_tracking() {
let conn = setup_test_db();
// Verify migration 011 was applied
let applied: i64 = conn
.query_row(
"SELECT COUNT(*) FROM _migrations WHERE name = ?1",
["011_create_integrations"],
|r| r.get(0),
)
.unwrap();
assert_eq!(applied, 1);
}
#[test]
fn test_migrations_idempotent() {
let conn = Connection::open_in_memory().unwrap();
// Run migrations twice
run_migrations(&conn).unwrap();
run_migrations(&conn).unwrap();
// Verify migration was only recorded once
let count: i64 = conn
.query_row(
"SELECT COUNT(*) FROM _migrations WHERE name = ?1",
["011_create_integrations"],
|r| r.get(0),
)
.unwrap();
assert_eq!(count, 1);
}
#[test]
fn test_store_and_retrieve_image_attachment() {
let conn = setup_test_db();
let now = chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string();
conn.execute(
"INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste)
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
rusqlite::params![
"test-img-1",
"test-issue-1",
"screenshot.png",
"/path/to/screenshot.png",
102400,
"image/png",
"abc123hash",
now,
1,
0,
],
)
.unwrap();
let (id, issue_id, file_name, mime_type, is_paste): (String, String, String, String, i32) = conn
.query_row(
"SELECT id, issue_id, file_name, mime_type, is_paste FROM image_attachments WHERE id = ?1",
["test-img-1"],
|r| Ok((r.get(0)?, r.get(1)?, r.get(2)?, r.get(3)?, r.get(4)?)),
)
.unwrap();
assert_eq!(id, "test-img-1");
assert_eq!(issue_id, "test-issue-1");
assert_eq!(file_name, "screenshot.png");
assert_eq!(mime_type, "image/png");
assert_eq!(is_paste, 0);
}
}


@ -44,6 +44,7 @@ impl Issue {
pub struct IssueDetail {
pub issue: Issue,
pub log_files: Vec<LogFile>,
pub image_attachments: Vec<ImageAttachment>,
pub resolution_steps: Vec<ResolutionStep>,
pub conversations: Vec<AiConversation>,
}
@ -392,3 +393,46 @@ impl IntegrationConfig {
}
}
}
// ─── Image Attachment ────────────────────────────────────────────────────────────
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ImageAttachment {
pub id: String,
pub issue_id: String,
pub file_name: String,
pub file_path: String,
pub file_size: i64,
pub mime_type: String,
pub upload_hash: String,
pub uploaded_at: String,
pub pii_warning_acknowledged: bool,
pub is_paste: bool,
}
impl ImageAttachment {
#[allow(clippy::too_many_arguments)]
pub fn new(
issue_id: String,
file_name: String,
file_path: String,
file_size: i64,
mime_type: String,
upload_hash: String,
pii_warning_acknowledged: bool,
is_paste: bool,
) -> Self {
ImageAttachment {
id: Uuid::now_v7().to_string(),
issue_id,
file_name,
file_path,
file_size,
mime_type,
upload_hash,
uploaded_at: chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string(),
pii_warning_acknowledged,
is_paste,
}
}
}

View File

@@ -177,6 +177,7 @@ mod tests {
tags: "[]".to_string(),
},
log_files: vec![],
image_attachments: vec![],
resolution_steps: vec![ResolutionStep {
id: "rs-pm-1".to_string(),
issue_id: "pm-456".to_string(),

View File

@@ -172,6 +172,7 @@ mod tests {
uploaded_at: "2025-01-15 10:30:00".to_string(),
redacted: false,
}],
image_attachments: vec![],
resolution_steps: vec![
ResolutionStep {
id: "rs-1".to_string(),

View File

@@ -73,6 +73,10 @@ pub fn run() {
commands::analysis::upload_log_file,
commands::analysis::detect_pii,
commands::analysis::apply_redactions,
commands::image::upload_image_attachment,
commands::image::list_image_attachments,
commands::image::delete_image_attachment,
commands::image::upload_paste_image,
// AI
commands::ai::analyze_logs,
commands::ai::chat_message,
@@ -109,7 +113,6 @@ pub fn run() {
commands::system::get_settings,
commands::system::update_settings,
commands::system::get_audit_log,
commands::system::install_ollama_from_bundle,
])
.run(tauri::generate_context!())
.expect("Error running Troubleshooting and RCA Assistant application");

View File

@@ -34,7 +34,7 @@
"icons/icon.icns",
"icons/icon.ico"
],
"resources": ["resources/ollama/*"],
"resources": [],
"externalBin": [],
"copyright": "Troubleshooting and RCA Assistant Contributors",
"category": "Utility",

View File

@@ -0,0 +1,165 @@
import React, { useState, useRef, useEffect } from "react";
import { X, AlertTriangle, ExternalLink, Image as ImageIcon } from "lucide-react";
import type { ImageAttachment } from "@/lib/tauriCommands";
interface ImageGalleryProps {
images: ImageAttachment[];
onDelete?: (attachment: ImageAttachment) => void;
showWarning?: boolean;
}
export function ImageGallery({ images, onDelete, showWarning = true }: ImageGalleryProps) {
const [selectedImage, setSelectedImage] = useState<ImageAttachment | null>(null);
const [isModalOpen, setIsModalOpen] = useState(false);
const modalRef = useRef<HTMLDivElement>(null);
useEffect(() => {
const handleKeyDown = (e: KeyboardEvent) => {
if (e.key === "Escape" && isModalOpen) {
setIsModalOpen(false);
setSelectedImage(null);
}
};
window.addEventListener("keydown", handleKeyDown);
return () => window.removeEventListener("keydown", handleKeyDown);
}, [isModalOpen]);
if (images.length === 0) return null;
const base64ToDataUrl = (base64: string, mimeType: string): string => {
if (base64.startsWith("data:image/")) {
return base64;
}
return `data:${mimeType};base64,${base64}`;
};
const getPreviewUrl = (attachment: ImageAttachment): string => {
if (attachment.file_path && attachment.file_path.length > 0) {
return `file://${attachment.file_path}`;
}
return base64ToDataUrl(attachment.upload_hash, attachment.mime_type);
};
const isWebSource = (image: ImageAttachment): boolean => {
return image.file_path.length > 0 &&
(image.file_path.startsWith("http://") ||
image.file_path.startsWith("https://"));
};
return (
<div className="space-y-4">
{showWarning && (
<div className="bg-amber-100 border border-amber-300 text-amber-800 p-3 rounded-md flex items-center gap-2">
<AlertTriangle className="w-5 h-5 flex-shrink-0" />
<span className="text-sm">
PII cannot be automatically redacted from images. Use at your own risk.
</span>
</div>
)}
{images.some(img => isWebSource(img)) && (
<div className="bg-red-100 border border-red-300 text-red-800 p-3 rounded-md flex items-center gap-2">
<ExternalLink className="w-5 h-5 flex-shrink-0" />
<span className="text-sm">
Some images appear to be from web sources. Ensure you have permission to share.
</span>
</div>
)}
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 gap-4">
{images.map((image, idx) => (
<div key={image.id} className="group relative rounded-lg overflow-hidden bg-gray-100 border border-gray-200">
<button
onClick={() => {
setSelectedImage(image);
setIsModalOpen(true);
}}
className="w-full aspect-video object-cover"
>
<img
src={getPreviewUrl(image)}
alt={image.file_name}
className="w-full h-full object-cover transition-transform group-hover:scale-110"
loading="lazy"
/>
</button>
<div className="p-2">
<p className="text-xs text-gray-700 truncate" title={image.file_name}>
{image.file_name}
</p>
<p className="text-xs text-gray-500">
{image.is_paste ? "Paste" : "Upload"} · {(image.file_size / 1024).toFixed(1)} KB
</p>
</div>
{onDelete && (
<button
onClick={(e) => {
e.stopPropagation();
onDelete(image);
}}
className="absolute top-1 right-1 p-1 bg-white/80 hover:bg-white rounded-md text-gray-600 hover:text-red-600 transition-colors opacity-0 group-hover:opacity-100"
title="Delete image"
>
<X className="w-4 h-4" />
</button>
)}
</div>
))}
</div>
{isModalOpen && selectedImage && (
<div
className="fixed inset-0 bg-black/50 z-50 flex items-center justify-center p-4"
onClick={() => {
setIsModalOpen(false);
setSelectedImage(null);
}}
>
<div
ref={modalRef}
className="bg-white rounded-lg overflow-hidden max-w-4xl max-h-[90vh] flex flex-col"
onClick={(e) => e.stopPropagation()}
>
<div className="bg-gray-100 p-4 flex items-center justify-between border-b">
<div className="flex items-center gap-2">
<ImageIcon className="w-5 h-5 text-gray-600" />
<h3 className="font-medium">{selectedImage.file_name}</h3>
</div>
<button
onClick={() => {
setIsModalOpen(false);
setSelectedImage(null);
}}
className="p-2 hover:bg-gray-200 rounded-lg transition-colors"
>
<X className="w-5 h-5" />
</button>
</div>
<div className="flex-1 overflow-auto bg-gray-900 flex items-center justify-center p-8">
<img
src={getPreviewUrl(selectedImage)}
alt={selectedImage.file_name}
className="max-w-full max-h-[60vh] object-contain"
/>
</div>
<div className="bg-gray-50 p-4 border-t text-sm space-y-2">
<div className="flex gap-4">
<div>
<span className="text-gray-500">Type:</span> {selectedImage.mime_type}
</div>
<div>
<span className="text-gray-500">Size:</span> {(selectedImage.file_size / 1024).toFixed(2)} KB
</div>
<div>
<span className="text-gray-500">Source:</span> {selectedImage.is_paste ? "Paste" : "File"}
</div>
</div>
</div>
</div>
</div>
)}
</div>
);
}
export default ImageGallery;
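The gallery's preview helpers are pure functions and can be exercised outside React. A minimal standalone sketch mirroring the logic above (the `PreviewSource` interface is a trimmed, illustrative subset of `ImageAttachment`):

```typescript
// Illustrative mirror of ImageGallery's preview helpers, extracted as pure
// functions. Only the fields the helpers actually read are modeled here.
interface PreviewSource {
  file_path: string;
  upload_hash: string;
  mime_type: string;
}

// Normalize raw base64 into a data: URL; pass existing data URLs through.
function base64ToDataUrl(base64: string, mimeType: string): string {
  if (base64.startsWith("data:image/")) return base64;
  return `data:${mimeType};base64,${base64}`;
}

// Local files render via file://; otherwise fall back to inline base64 data.
function getPreviewUrl(att: PreviewSource): string {
  if (att.file_path.length > 0) return `file://${att.file_path}`;
  return base64ToDataUrl(att.upload_hash, att.mime_type);
}

// Flag images whose recorded path is actually a web URL.
function isWebSource(att: PreviewSource): boolean {
  return (
    att.file_path.startsWith("http://") || att.file_path.startsWith("https://")
  );
}
```

Because these are side-effect-free, they slot directly into the existing vitest suite without mounting the component.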

View File

@@ -100,6 +100,7 @@ export interface ResolutionStep {
export interface IssueDetail {
issue: Issue;
log_files: LogFile[];
image_attachments: ImageAttachment[];
resolution_steps: ResolutionStep[];
conversations: AiConversation[];
}
@@ -145,6 +146,19 @@ export interface LogFile {
redacted: boolean;
}
export interface ImageAttachment {
id: string;
issue_id: string;
file_name: string;
file_path: string;
file_size: number;
mime_type: string;
upload_hash: string;
uploaded_at: string;
pii_warning_acknowledged: boolean;
is_paste: boolean;
}
export interface PiiSpan {
id: string;
pii_type: string;
@@ -263,6 +277,18 @@ export const listProvidersCmd = () => invoke<ProviderInfo[]>("list_providers");
export const uploadLogFileCmd = (issueId: string, filePath: string) =>
invoke<LogFile>("upload_log_file", { issueId, filePath });
export const uploadImageAttachmentCmd = (issueId: string, filePath: string) =>
invoke<ImageAttachment>("upload_image_attachment", { issueId, filePath });
export const uploadPasteImageCmd = (issueId: string, base64Image: string, mimeType: string) =>
invoke<ImageAttachment>("upload_paste_image", { issueId, base64Image, mimeType });
export const listImageAttachmentsCmd = (issueId: string) =>
invoke<ImageAttachment[]>("list_image_attachments", { issueId });
export const deleteImageAttachmentCmd = (attachmentId: string) =>
invoke<void>("delete_image_attachment", { attachmentId });
export const detectPiiCmd = (logFileId: string) =>
invoke<PiiDetectionResult>("detect_pii", { logFileId });
@@ -436,6 +462,3 @@ export const getIntegrationConfigCmd = (service: string) =>
export const getAllIntegrationConfigsCmd = () =>
invoke<IntegrationConfig[]>("get_all_integration_configs");
export const installOllamaFromBundleCmd = () =>
invoke<string>("install_ollama_from_bundle");
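The new image commands compose into a simple attach-then-refresh flow. A hedged sketch with the Tauri bridge injected as a parameter so the flow can run outside the webview (the real wrappers above call `invoke` directly; `ImageAttachmentLite` is an illustrative subset of `ImageAttachment`):

```typescript
// Sketch only: InvokeFn stands in for Tauri's invoke so the flow is testable
// without a running webview. Command names match the wrappers defined above.
type InvokeFn = (cmd: string, args?: Record<string, unknown>) => Promise<unknown>;

interface ImageAttachmentLite {
  id: string;
  issue_id: string;
  file_name: string;
}

// Upload one image for an issue, then return the refreshed attachment list,
// mirroring how the frontend re-syncs local state after each mutation.
async function attachImage(
  invoke: InvokeFn,
  issueId: string,
  filePath: string
): Promise<ImageAttachmentLite[]> {
  await invoke("upload_image_attachment", { issueId, filePath });
  return (await invoke("list_image_attachments", { issueId })) as ImageAttachmentLite[];
}
```

Injecting `invoke` keeps the flow unit-testable with an in-memory stub, which is how the assertions below exercise it.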

View File

@@ -1,16 +1,22 @@
import React, { useState, useCallback } from "react";
import React, { useState, useCallback, useRef, useEffect } from "react";
import { useNavigate, useParams } from "react-router-dom";
import { Upload, File, Trash2, ShieldCheck } from "lucide-react";
import { Upload, File, Trash2, ShieldCheck, AlertTriangle, Image as ImageIcon } from "lucide-react";
import { Button, Card, CardHeader, CardTitle, CardContent, Badge } from "@/components/ui";
import { PiiDiffViewer } from "@/components/PiiDiffViewer";
import { useSessionStore } from "@/stores/sessionStore";
import {
uploadLogFileCmd,
detectPiiCmd,
uploadImageAttachmentCmd,
uploadPasteImageCmd,
listImageAttachmentsCmd,
deleteImageAttachmentCmd,
type LogFile,
type PiiSpan,
type PiiDetectionResult,
type ImageAttachment,
} from "@/lib/tauriCommands";
import ImageGallery from "@/components/ImageGallery";
export default function LogUpload() {
const { id } = useParams<{ id: string }>();
@@ -18,11 +24,14 @@ export default function LogUpload() {
const { piiSpans, approvedRedactions, setPiiSpans, setApprovedRedactions } = useSessionStore();
const [files, setFiles] = useState<{ file: File; uploaded?: LogFile }[]>([]);
const [images, setImages] = useState<ImageAttachment[]>([]);
const [piiResult, setPiiResult] = useState<PiiDetectionResult | null>(null);
const [isUploading, setIsUploading] = useState(false);
const [isDetecting, setIsDetecting] = useState(false);
const [error, setError] = useState<string | null>(null);
const fileInputRef = useRef<HTMLInputElement>(null);
const handleDrop = useCallback(
(e: React.DragEvent) => {
e.preventDefault();
@@ -96,9 +105,136 @@
}
};
const handleImageDrop = useCallback(
(e: React.DragEvent) => {
e.preventDefault();
const droppedFiles = Array.from(e.dataTransfer.files);
const imageFiles = droppedFiles.filter((f) => f.type.startsWith("image/"));
if (imageFiles.length > 0) {
handleImagesUpload(imageFiles);
}
},
[id]
);
const handleImageFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
if (e.target.files) {
const selected = Array.from(e.target.files).filter((f) => f.type.startsWith("image/"));
if (selected.length > 0) {
handleImagesUpload(selected);
}
}
};
const handlePaste = useCallback(
async (e: React.ClipboardEvent) => {
const items = e.clipboardData?.items;
const imageItems = items ? Array.from(items).filter((item: DataTransferItem) => item.type.startsWith("image/")) : [];
for (const item of imageItems) {
const file = item.getAsFile();
if (file) {
const reader = new FileReader();
reader.onload = async () => {
const base64Data = reader.result as string;
try {
const result = await uploadPasteImageCmd(id || "", base64Data, file.type);
setImages((prev) => [...prev, result]);
} catch (err) {
setError(String(err));
}
};
reader.readAsDataURL(file);
}
}
},
[id]
);
const handleImagesUpload = async (imageFiles: File[]) => {
if (!id || imageFiles.length === 0) return;
setIsUploading(true);
setError(null);
try {
const uploaded = await Promise.all(
imageFiles.map(async (file) => {
// Browser File objects from drag-and-drop expose only a name, not a
// filesystem path, so send the content itself as base64 rather than
// the bare file name.
const base64Data = await fileToBase64(file);
return uploadPasteImageCmd(id, base64Data, file.type);
})
);
setImages((prev) => [...prev, ...uploaded]);
} catch (err) {
setError(String(err));
} finally {
setIsUploading(false);
}
};
const handleDeleteImage = async (image: ImageAttachment) => {
try {
await deleteImageAttachmentCmd(image.id);
setImages((prev) => prev.filter((img) => img.id !== image.id));
} catch (err) {
setError(String(err));
}
};
const fileToBase64 = (file: File): Promise<string> => {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onload = () => resolve(reader.result as string);
reader.onerror = (err) => reject(err);
reader.readAsDataURL(file);
});
};
const allUploaded = files.length > 0 && files.every((f) => f.uploaded);
const piiReviewed = piiResult != null;
useEffect(() => {
const handleGlobalPaste = (e: ClipboardEvent) => {
if (document.activeElement?.tagName === "INPUT" ||
document.activeElement?.tagName === "TEXTAREA" ||
(document.activeElement as HTMLElement)?.isContentEditable) {
return;
}
const items = e.clipboardData?.items;
const imageItems = items ? Array.from(items).filter((item: DataTransferItem) => item.type.startsWith("image/")) : [];
for (const item of imageItems) {
const file = item.getAsFile();
if (file) {
e.preventDefault();
const reader = new FileReader();
reader.onload = async () => {
const base64Data = reader.result as string;
try {
const result = await uploadPasteImageCmd(id || "", base64Data, file.type);
setImages((prev) => [...prev, result]);
} catch (err) {
setError(String(err));
}
};
reader.readAsDataURL(file);
break;
}
}
};
window.addEventListener("paste", handleGlobalPaste);
return () => window.removeEventListener("paste", handleGlobalPaste);
}, [id]);
useEffect(() => {
if (id) {
listImageAttachmentsCmd(id).then(setImages).catch(setError);
}
}, [id]);
return (
<div className="p-6 space-y-6">
<div>
@@ -165,6 +301,87 @@
</Card>
)}
{/* Image Upload */}
{id && (
<>
<div>
<h2 className="text-2xl font-semibold flex items-center gap-2">
<ImageIcon className="w-6 h-6" />
Image Attachments
</h2>
<p className="text-muted-foreground mt-1">
Upload or paste screenshots and images.
</p>
</div>
{/* Image drop zone */}
<div
onDragOver={(e) => e.preventDefault()}
onDrop={handleImageDrop}
className="border-2 border-dashed border-primary/30 rounded-lg p-8 text-center hover:border-primary transition-colors cursor-pointer bg-primary/5"
onClick={() => fileInputRef.current?.click()}
>
<Upload className="w-8 h-8 mx-auto text-primary mb-2" />
<p className="text-sm text-muted-foreground">
Drag and drop images here, or click to browse
</p>
<p className="text-xs text-muted-foreground mt-2">
Supported: PNG, JPEG, GIF, WebP, SVG
</p>
<input
ref={fileInputRef}
type="file"
accept="image/*"
multiple
className="hidden"
onChange={handleImageFileSelect}
/>
</div>
{/* Paste button */}
<div className="flex items-center gap-2">
<Button
onClick={async () => {
// document.execCommand("paste") is deprecated and blocked in webviews;
// read image data via the async Clipboard API instead.
try {
for (const item of await navigator.clipboard.read()) {
const type = item.types.find((t) => t.startsWith("image/"));
if (!type) continue;
const base64Data = await fileToBase64(new File([await item.getType(type)], "pasted-image", { type }));
const result = await uploadPasteImageCmd(id || "", base64Data, type);
setImages((prev) => [...prev, result]);
}
} catch (err) {
setError(String(err));
}
}}
variant="secondary"
>
Paste from Clipboard
</Button>
<span className="text-xs text-muted-foreground">
Use Ctrl+V / Cmd+V or the button above to paste images
</span>
</div>
{/* PII warning for images */}
<div className="bg-amber-50 border border-amber-200 rounded-md p-3">
<AlertTriangle className="w-5 h-5 text-amber-600 inline mr-2" />
<span className="text-sm text-amber-800">
PII cannot be automatically redacted from images. Use at your own risk.
</span>
</div>
{/* Image Gallery */}
{images.length > 0 && (
<Card>
<CardHeader>
<CardTitle className="text-lg flex items-center gap-2">
<ImageIcon className="w-5 h-5" />
Attached Images ({images.length})
</CardTitle>
</CardHeader>
<CardContent>
<ImageGallery
images={images}
onDelete={handleDeleteImage}
showWarning={false}
/>
</CardContent>
</Card>
)}
</>
)}
{/* PII Detection */}
{allUploaded && (
<Card>

View File

@@ -24,7 +24,6 @@ import {
deleteOllamaModelCmd,
listOllamaModelsCmd,
getOllamaInstallGuideCmd,
installOllamaFromBundleCmd,
type OllamaStatus,
type HardwareInfo,
type ModelRecommendation,
@@ -44,7 +43,6 @@ export default function Ollama() {
const [customModel, setCustomModel] = useState("");
const [isPulling, setIsPulling] = useState(false);
const [pullProgress, setPullProgress] = useState(0);
const [isInstallingBundle, setIsInstallingBundle] = useState(false);
const [error, setError] = useState<string | null>(null);
const loadData = async () => {
@@ -107,19 +105,6 @@
}
};
const handleInstallFromBundle = async () => {
setIsInstallingBundle(true);
setError(null);
try {
await installOllamaFromBundleCmd();
await loadData();
} catch (err) {
setError(String(err));
} finally {
setIsInstallingBundle(false);
}
};
const handleDelete = async (modelName: string) => {
try {
await deleteOllamaModelCmd(modelName);
@@ -184,33 +169,16 @@
{status && !status.installed && installGuide && (
<Card className="border-yellow-500/50">
<CardHeader>
<CardTitle className="text-lg flex items-center gap-2">
<Download className="w-5 h-5 text-yellow-500" />
<CardTitle className="text-lg">
Ollama Not Detected: Installation Required
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<CardContent>
<ol className="space-y-2 list-decimal list-inside">
{installGuide.steps.map((step, i) => (
<li key={i} className="text-sm text-muted-foreground">{step}</li>
))}
</ol>
<div className="flex flex-wrap gap-2">
<Button
variant="outline"
onClick={() => window.open(installGuide.url, "_blank")}
>
<Download className="w-4 h-4 mr-2" />
Download Ollama for {installGuide.platform}
</Button>
<Button
onClick={handleInstallFromBundle}
disabled={isInstallingBundle}
>
<Download className="w-4 h-4 mr-2" />
{isInstallingBundle ? "Installing..." : "Install Ollama (Offline)"}
</Button>
</div>
</CardContent>
</Card>
)}

View File

@@ -0,0 +1,142 @@
import { describe, expect, it } from "vitest";
import { readFileSync } from "node:fs";
import path from "node:path";
const root = process.cwd();
const readFile = (rel: string) => readFileSync(path.resolve(root, rel), "utf-8");
// ─── Dockerfiles ─────────────────────────────────────────────────────────────
describe("Dockerfile.linux-amd64", () => {
const df = readFile(".docker/Dockerfile.linux-amd64");
it("is based on the pinned Rust 1.88 slim image", () => {
expect(df).toContain("FROM rust:1.88-slim");
});
it("installs webkit2gtk 4.1 dev package", () => {
expect(df).toContain("libwebkit2gtk-4.1-dev");
});
it("installs Node.js 22 via NodeSource", () => {
expect(df).toContain("nodesource.com/setup_22.x");
expect(df).toContain("nodejs");
});
it("pre-adds the x86_64 Linux Rust target", () => {
expect(df).toContain("rustup target add x86_64-unknown-linux-gnu");
});
it("cleans apt lists to keep image lean", () => {
expect(df).toContain("rm -rf /var/lib/apt/lists/*");
});
});
describe("Dockerfile.windows-cross", () => {
const df = readFile(".docker/Dockerfile.windows-cross");
it("is based on the pinned Rust 1.88 slim image", () => {
expect(df).toContain("FROM rust:1.88-slim");
});
it("installs mingw-w64 cross-compiler", () => {
expect(df).toContain("mingw-w64");
});
it("installs nsis for Windows installer bundling", () => {
expect(df).toContain("nsis");
});
it("installs Node.js 22 via NodeSource", () => {
expect(df).toContain("nodesource.com/setup_22.x");
});
it("pre-adds the Windows GNU Rust target", () => {
expect(df).toContain("rustup target add x86_64-pc-windows-gnu");
});
it("cleans apt lists to keep image lean", () => {
expect(df).toContain("rm -rf /var/lib/apt/lists/*");
});
});
describe("Dockerfile.linux-arm64", () => {
const df = readFile(".docker/Dockerfile.linux-arm64");
it("is based on Ubuntu 22.04 (Jammy)", () => {
expect(df).toContain("FROM ubuntu:22.04");
});
it("installs aarch64 cross-compiler", () => {
expect(df).toContain("gcc-aarch64-linux-gnu");
expect(df).toContain("g++-aarch64-linux-gnu");
});
it("sets up arm64 multiarch via ports.ubuntu.com", () => {
expect(df).toContain("dpkg --add-architecture arm64");
expect(df).toContain("ports.ubuntu.com/ubuntu-ports");
expect(df).toContain("jammy");
});
it("installs arm64 webkit2gtk dev package", () => {
expect(df).toContain("libwebkit2gtk-4.1-dev:arm64");
});
it("installs Rust 1.88 with arm64 cross-compilation target", () => {
expect(df).toContain("--default-toolchain 1.88.0");
expect(df).toContain("rustup target add aarch64-unknown-linux-gnu");
});
it("adds cargo to PATH via ENV", () => {
expect(df).toContain('ENV PATH="/root/.cargo/bin:${PATH}"');
});
it("installs Node.js 22 via NodeSource", () => {
expect(df).toContain("nodesource.com/setup_22.x");
});
});
// ─── build-images.yml workflow ───────────────────────────────────────────────
describe("build-images.yml workflow", () => {
const wf = readFile(".gitea/workflows/build-images.yml");
it("triggers on changes to .docker/ files on master", () => {
expect(wf).toContain("- master");
expect(wf).toContain("- '.docker/**'");
});
it("supports manual workflow_dispatch trigger", () => {
expect(wf).toContain("workflow_dispatch:");
});
it("mounts the host Docker socket for image builds", () => {
expect(wf).toContain("-v /var/run/docker.sock:/var/run/docker.sock");
});
it("authenticates to the local Gitea registry before pushing", () => {
expect(wf).toContain("docker login");
expect(wf).toContain("--password-stdin");
expect(wf).toContain("172.0.0.29:3000");
});
it("builds and pushes all three platform images", () => {
expect(wf).toContain("trcaa-linux-amd64:rust1.88-node22");
expect(wf).toContain("trcaa-windows-cross:rust1.88-node22");
expect(wf).toContain("trcaa-linux-arm64:rust1.88-node22");
});
it("uses docker:24-cli image for build jobs", () => {
expect(wf).toContain("docker:24-cli");
});
it("runs all three build jobs on linux-amd64 runner", () => {
const matches = wf.match(/runs-on: linux-amd64/g) ?? [];
expect(matches.length).toBeGreaterThanOrEqual(3);
});
it("uses RELEASE_TOKEN secret for registry auth", () => {
expect(wf).toContain("secrets.RELEASE_TOKEN");
});
});

View File

@@ -21,6 +21,7 @@ const mockIssueDetail = {
tags: "[]",
},
log_files: [],
image_attachments: [],
resolution_steps: [
{
id: "step-1",