Compare commits

115 Commits

| Author | SHA1 | Date |
|---|---|---|
| | c0d482ace7 | |
| | 5a12718566 | |
| | 4a0c7957ec | |
| | 12a76b4dd8 | |
| | 0e6fd09455 | |
| | b7f348bf34 | |
| | 7234704636 | |
| | 06b0c10b17 | |
| | ab231b6564 | |
| | 8b828fe4c3 | |
| | 27193c91e6 | |
| | cb542d7f22 | |
| | d066e71eeb | |
| | 257b2fb9c5 | |
| | d715ba0b25 | |
| | 8b0cbc3ce8 | |
| | 13c4969e31 | |
| | 79a623dbb2 | |
| | 107fee8853 | |
| | 6d105a70ad | |
| | ca56b583c5 | |
| | 8c35e91aef | |
| | 1055841b6f | |
| | f38ca7e2fc | |
| | a9956a16a4 | |
| | bc50a78db7 | |
| | e6d1965342 | |
| | 708e1e9c18 | |
| | 5b45c6c418 | |
| | 096068ed2b | |
| | 9248811076 | |
| | 007d0ee9d5 | |
| | 9e1a9b1d34 | |
| | cdb1dd1dad | |
| | 6dbe40ef03 | |
| | 75fc3ca67c | |
| | fdae6d6e6d | |
| | d78181e8c0 | |
| | b4ff52108a | |
| | 29a68c07e9 | |
| | 40a2c25428 | |
| | 62e3570a15 | |
| | 41e5753de6 | |
| | 25201eaac1 | |
| | 618eb6b43d | |
| | 5084dca5e3 | |
| | 6cbdcaed21 | |
| | 8298506435 | |
| | 412c5e70f0 | |
| | 05f87a7bff | |
| | 8e1d43da43 | |
| | 2d7aac8413 | |
| | 84c69fbea8 | |
| | 9bc570774a | |
| | f7011c8837 | |
| | f74238a65a | |
| | 2da529fb75 | |
| | 2f6d5c1865 | |
| | 280a9f042e | |
| | 41bc5f38ff | |
| | 6d2b69ffb0 | |
| | eae1c6e8b7 | |
| | 27a46a7542 | |
| | 21de93174c | |
| | a365cba30e | |
| | 2ce38b9477 | |
| | 461959fbca | |
| | a86ae81161 | |
| | decd1fe5cf | |
| | 16930dca70 | |
| | bb0f3eceab | |
| | 4fa01ae7ed | |
| | 181b9ef734 | |
| | 144a4551f2 | |
| | 47b2e824e0 | |
| | 82aae00858 | |
| | 1a4c6df6c9 | |
| | 2d0f95e9db | |
| | 61cb5db63e | |
| | 44584d6302 | |
| | 1db1b20762 | |
| | 8f73a7d017 | |
| | 5e61d4f550 | |
| | d759486b51 | |
| | 63a055d4fe | |
| | 98a0f908d7 | |
| | f47dcf69a3 | |
| | 0b85258e7d | |
| | 8cee1c5655 | |
| | de59684432 | |
| | 849d3176fd | |
| | 182a508f4e | |
| | 68d815e3e1 | |
| | 46c48fb4a3 | |
| | ed2af2a1cc | |
| | 6ebe3612cd | |
| | 420411882e | |
| | 859d7a0da8 | |
| | b6e68be959 | |
| | b3765aa65d | |
| | fbc6656374 | |
| | 298bce8536 | |
| | 6593cb760c | |
| | c49b8ebfc0 | |
| | 042335f380 | |
| | 92fc67a92c | |
| | c7a797d720 | |
| | e79846e849 | |
| | 5b8a7d7173 | |
| | 093bc6ea15 | |
| | e83dc19dcc | |
| | d83e8598e0 | |
| | f6f48b934b | |
| | 9f6cab2436 | |
| | 19cc78a05f | |
**.docker/Dockerfile.linux-amd64**

```diff
@@ -1,11 +1,14 @@
 # Pre-baked builder for Linux amd64 Tauri releases.
 # All system dependencies are installed once here; CI jobs skip apt-get entirely.
 # Rebuild when: Rust toolchain version changes, webkit2gtk/gtk major version changes,
-# or Node.js major version changes. Tag format: rust<VER>-node<VER>
+# Node.js major version changes, OpenSSL major version changes (used via OPENSSL_STATIC=1),
+# or Tauri CLI version changes that affect bundler system deps.
+# Tag format: rust<VER>-node<VER>
 FROM rust:1.88-slim
 
 RUN apt-get update -qq \
     && apt-get install -y -qq --no-install-recommends \
     ca-certificates \
     libwebkit2gtk-4.1-dev \
     libssl-dev \
     libgtk-3-dev \
@@ -21,4 +24,5 @@ RUN apt-get update -qq \
     && apt-get install -y --no-install-recommends nodejs \
     && rm -rf /var/lib/apt/lists/*
 
-RUN rustup target add x86_64-unknown-linux-gnu
+RUN rustup target add x86_64-unknown-linux-gnu \
+    && rustup component add rustfmt clippy
```
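The `rust<VER>-node<VER>` tag convention above can be composed mechanically, which keeps the image tag and the Dockerfile pins from drifting apart. A minimal sketch; the version values are the ones pinned in this diff (`rust:1.88-slim`, NodeSource `setup_22.x`), and the variable names are illustrative:

```shell
# Compose the builder-image tag from the pinned toolchain versions,
# matching the "Tag format: rust<VER>-node<VER>" convention above.
# Update these two values whenever the Dockerfiles bump their pins.
RUST_VER="1.88"
NODE_VER="22"
TAG="rust${RUST_VER}-node${NODE_VER}"
echo "$TAG"
```

Anywhere the workflows reference the image, they would then use `trcaa-linux-amd64:$TAG` instead of a hand-typed tag.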
**Dockerfile for the linux-arm64 cross-builder**

```diff
@@ -1,7 +1,9 @@
 # Pre-baked cross-compiler for Linux arm64 Tauri releases (runs on Linux amd64).
 # Bakes in: amd64 cross-toolchain, arm64 multiarch dev libs, Node.js, and Rust.
 # This image takes ~15 min to build but is only rebuilt when deps change.
-# Rebuild when: Rust toolchain version, webkit2gtk/gtk major version, or Node.js changes.
+# Rebuild when: Rust toolchain version, webkit2gtk/gtk major version, Node.js major version,
+# OpenSSL major version (used via OPENSSL_STATIC=1), or Tauri CLI changes that affect
+# bundler system deps.
 # Tag format: rust<VER>-node<VER>
 FROM ubuntu:22.04
 
@@ -10,7 +12,7 @@ ARG DEBIAN_FRONTEND=noninteractive
 # Step 1: amd64 host tools and cross-compiler
 RUN apt-get update -qq \
     && apt-get install -y -qq --no-install-recommends \
-    curl git gcc g++ make patchelf pkg-config perl jq \
+    ca-certificates curl git gcc g++ make patchelf pkg-config perl jq \
     gcc-aarch64-linux-gnu g++-aarch64-linux-gnu \
     && rm -rf /var/lib/apt/lists/*
 
@@ -40,6 +42,7 @@ RUN curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
 # Step 4: Rust 1.88 with arm64 cross-compilation target
 RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
     --default-toolchain 1.88.0 --profile minimal --no-modify-path \
-    && /root/.cargo/bin/rustup target add aarch64-unknown-linux-gnu
+    && /root/.cargo/bin/rustup target add aarch64-unknown-linux-gnu \
+    && /root/.cargo/bin/rustup component add rustfmt clippy
 
 ENV PATH="/root/.cargo/bin:${PATH}"
```
**.docker/Dockerfile.windows-cross**

```diff
@@ -1,11 +1,14 @@
 # Pre-baked cross-compiler for Windows amd64 Tauri releases (runs on Linux amd64).
 # All MinGW and Node.js dependencies are installed once here; CI jobs skip apt-get entirely.
-# Rebuild when: Rust toolchain version changes or Node.js major version changes.
+# Rebuild when: Rust toolchain version changes, Node.js major version changes,
+# OpenSSL major version changes (used via OPENSSL_STATIC=1), or Tauri CLI changes
+# that affect bundler system deps.
 # Tag format: rust<VER>-node<VER>
 FROM rust:1.88-slim
 
 RUN apt-get update -qq \
     && apt-get install -y -qq --no-install-recommends \
     ca-certificates \
     mingw-w64 \
     curl \
     nsis \
```
**.eslintrc.json** (new file, 26 lines)

```json
{
  "extends": ["eslint:recommended", "plugin:@typescript-eslint/recommended", "plugin:react/recommended", "plugin:react-hooks/recommended"],
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaFeatures": {
      "jsx": true
    },
    "ecmaVersion": "latest",
    "sourceType": "module",
    "project": ["./tsconfig.json"]
  },
  "plugins": ["@typescript-eslint", "react", "react-hooks"],
  "settings": {
    "react": {
      "version": "detect"
    }
  },
  "rules": {
    "no-unused-vars": "off",
    "@typescript-eslint/no-unused-vars": ["error", { "argsIgnorePattern": "^_" }],
    "no-console": ["warn", { "allow": ["warn", "error"] }],
    "react/react-in-jsx-scope": "off",
    "react/prop-types": "off"
  },
  "ignorePatterns": ["dist/", "node_modules/", "src-tauri/", "target/", "coverage/"]
}
```
**Release workflow**

```diff
@@ -65,6 +65,138 @@ jobs:
 
           echo "Tag $NEXT pushed successfully"
 
+  changelog:
+    needs: autotag
+    runs-on: linux-amd64
+    container:
+      image: alpine:latest
+    steps:
+      - name: Install dependencies
+        run: |
+          set -eu
+          apk add --no-cache git curl jq
+
+      - name: Checkout (full history + all tags)
+        env:
+          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
+        run: |
+          set -eu
+          git init
+          git remote add origin \
+            "http://oauth2:${RELEASE_TOKEN}@172.0.0.29:3000/${GITHUB_REPOSITORY}.git"
+          git fetch --tags --depth=2147483647 origin
+          git checkout FETCH_HEAD
+          git config user.name "gitea-actions[bot]"
+          git config user.email "gitea-actions@local"
+
+      - name: Install git-cliff
+        run: |
+          set -eu
+          CLIFF_VER="2.7.0"
+          curl -fsSL \
+            "https://github.com/orhun/git-cliff/releases/download/v${CLIFF_VER}/git-cliff-${CLIFF_VER}-x86_64-unknown-linux-musl.tar.gz" \
+            | tar -xz --strip-components=1 -C /usr/local/bin \
+            "git-cliff-${CLIFF_VER}/git-cliff"
+
+      - name: Generate changelog
+        run: |
+          set -eu
+          git-cliff --config cliff.toml --output CHANGELOG.md
+          git-cliff --config cliff.toml --latest --strip all > /tmp/release_body.md
+          echo "=== Release body preview ==="
+          cat /tmp/release_body.md
+
+      - name: Update Gitea release body
+        env:
+          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
+        run: |
+          set -eu
+          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
+          TAG=$(git describe --tags --abbrev=0)
+          # Create release if it doesn't exist yet (build jobs may still be running)
+          curl -sf -X POST "$API/releases" \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            -H "Content-Type: application/json" \
+            -d "{\"tag_name\":\"$TAG\",\"name\":\"TFTSR $TAG\",\"body\":\"Release $TAG\",\"draft\":false}" || true
+          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
+            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
+          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
+            echo "ERROR: Failed to get release ID for $TAG"
+            exit 1
+          fi
+          curl -sf -X PATCH "$API/releases/$RELEASE_ID" \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            -H "Content-Type: application/json" \
+            --data-binary "{\"body\":$(jq -Rs . < /tmp/release_body.md)}"
+          echo "✓ Release body updated"
+
+      - name: Commit CHANGELOG.md to master
+        env:
+          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
+        run: |
+          set -euo pipefail
+          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
+          TAG=$(git describe --tags --abbrev=0)
+          # Validate tag format to prevent shell injection in commit message / JSON
+          if ! echo "$TAG" | grep -qE '^v[0-9]+\.[0-9]+\.[0-9]+$'; then
+            echo "ERROR: Unexpected tag format: $TAG"
+            exit 1
+          fi
+          # Fetch current blob SHA from master; empty if file doesn't exist yet
+          CURRENT_SHA=$(curl -sf \
+            -H "Accept: application/json" \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            "$API/contents/CHANGELOG.md?ref=master" 2>/dev/null \
+            | jq -r '.sha // empty' 2>/dev/null || true)
+          # Base64-encode content (no line wrapping)
+          CONTENT=$(base64 -w 0 CHANGELOG.md)
+          # Build JSON payload; omit "sha" when file doesn't exist yet (new repo)
+          PAYLOAD=$(jq -n \
+            --arg msg "chore: update CHANGELOG.md for ${TAG} [skip ci]" \
+            --arg body "$CONTENT" \
+            --arg sha "$CURRENT_SHA" \
+            'if $sha == ""
+             then {message: $msg, content: $body, branch: "master"}
+             else {message: $msg, content: $body, sha: $sha, branch: "master"}
+             end')
+          # PUT atomically updates (or creates) the file on master; no fast-forward needed
+          RESP_FILE=$(mktemp)
+          HTTP_CODE=$(curl -s -o "$RESP_FILE" -w "%{http_code}" -X PUT \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            -H "Content-Type: application/json" \
+            -d "$PAYLOAD" \
+            "$API/contents/CHANGELOG.md")
+          if [ "$HTTP_CODE" -lt 200 ] || [ "$HTTP_CODE" -ge 300 ]; then
+            echo "ERROR: Failed to update CHANGELOG.md (HTTP $HTTP_CODE)"
+            cat "$RESP_FILE" >&2
+            exit 1
+          fi
+          echo "✓ CHANGELOG.md committed to master"
+
+      - name: Upload CHANGELOG.md as release asset
+        env:
+          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
+        run: |
+          set -eu
+          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
+          TAG=$(git describe --tags --abbrev=0)
+          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
+            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
+          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
+            echo "ERROR: Failed to get release ID for $TAG"
+            exit 1
+          fi
+          EXISTING=$(curl -sf "$API/releases/$RELEASE_ID" \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            | jq -r '.assets[]? | select(.name=="CHANGELOG.md") | .id')
+          [ -n "$EXISTING" ] && curl -sf -X DELETE \
+            "$API/releases/$RELEASE_ID/assets/$EXISTING" \
+            -H "Authorization: token $RELEASE_TOKEN"
+          curl -sf -X POST "$API/releases/$RELEASE_ID/assets" \
+            -H "Authorization: token $RELEASE_TOKEN" \
+            -F "attachment=@CHANGELOG.md;filename=CHANGELOG.md"
+          echo "✓ CHANGELOG.md uploaded"
+
   wiki-sync:
     runs-on: linux-amd64
     container:
```
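Two jq idioms carry most of the weight in the changelog job above and can be exercised standalone: `jq -Rs .` to embed an arbitrary multi-line file as a single JSON string, and `jq -n` with a conditional to omit `sha` when the file does not exist yet. A minimal sketch; the sample strings and the `build_payload` helper name are made up for illustration:

```shell
# 1) jq -Rs . turns a multi-line file into one JSON string, which is how
#    the release body is embedded into the PATCH payload above.
printf 'line "one"\nline two\n' > /tmp/body.md
ENCODED=$(jq -Rs . < /tmp/body.md)
echo "$ENCODED"

# 2) The create-or-update payload omits "sha" when the file is new,
#    mirroring the CHANGELOG.md commit step.
build_payload() {  # $1 = current blob sha ("" if the file is new)
  jq -n --arg msg "chore: update" --arg body "QkFTRTY0" --arg sha "$1" \
    'if $sha == ""
     then {message: $msg, content: $body, branch: "master"}
     else {message: $msg, content: $body, sha: $sha, branch: "master"}
     end'
}
build_payload "" | jq -e 'has("sha") | not' > /dev/null && echo "new file: sha omitted"
build_payload "abc123" | jq -e '.sha == "abc123"' > /dev/null && echo "existing file: sha included"
```

Sending a payload with an empty or stale `sha` is the usual cause of contents-API failures, so the conditional is what makes the step work on a fresh repository.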
```diff
@@ -132,27 +264,36 @@ jobs:
     needs: autotag
     runs-on: linux-amd64
     container:
-      image: rust:1.88-slim
+      image: 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
           git checkout FETCH_HEAD
-      - name: Install dependencies
-        run: |
-          apt-get update -qq && apt-get install -y -qq \
-            libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev \
-            libayatana-appindicator3-dev librsvg2-dev patchelf \
-            pkg-config curl perl jq
-          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
-          apt-get install -y nodejs
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-linux-amd64-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-linux-amd64-
+      - name: Cache npm
+        uses: actions/cache@v4
+        with:
+          path: ~/.npm
+          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
+          restore-keys: |
+            ${{ runner.os }}-npm-
       - name: Build
         env:
           APPIMAGE_EXTRACT_AND_RUN: "1"
         run: |
           npm ci --legacy-peer-deps
           rustup target add x86_64-unknown-linux-gnu
           CI=true npx tauri build --target x86_64-unknown-linux-gnu
       - name: Upload artifacts
         env:
@@ -181,7 +322,7 @@
           fi
           echo "Release ID: $RELEASE_ID"
           ARTIFACTS=$(find src-tauri/target/x86_64-unknown-linux-gnu/release/bundle -type f \
-            \( -name "*.deb" -o -name "*.rpm" -o -name "*.AppImage" \))
+            \( -name "*.deb" -o -name "*.rpm" \))
           if [ -z "$ARTIFACTS" ]; then
             echo "ERROR: No Linux amd64 artifacts were found to upload."
             exit 1
```
```diff
@@ -218,20 +359,31 @@ jobs:
     needs: autotag
     runs-on: linux-amd64
     container:
-      image: rust:1.88-slim
+      image: 172.0.0.29:3000/sarman/trcaa-windows-cross:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
           git checkout FETCH_HEAD
-      - name: Install dependencies
-        run: |
-          apt-get update -qq && apt-get install -y -qq mingw-w64 curl nsis perl make jq
-          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
-          apt-get install -y nodejs
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-windows-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-windows-
+      - name: Cache npm
+        uses: actions/cache@v4
+        with:
+          path: ~/.npm
+          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
+          restore-keys: |
+            ${{ runner.os }}-npm-
       - name: Build
         env:
           CC_x86_64_pc_windows_gnu: x86_64-w64-mingw32-gcc
@@ -242,7 +394,6 @@
           OPENSSL_STATIC: "1"
         run: |
           npm ci --legacy-peer-deps
-          rustup target add x86_64-pc-windows-gnu
           CI=true npx tauri build --target x86_64-pc-windows-gnu
       - name: Upload artifacts
         env:
```
```diff
@@ -392,53 +543,31 @@ jobs:
     needs: autotag
     runs-on: linux-amd64
     container:
-      image: ubuntu:22.04
+      image: 172.0.0.29:3000/sarman/trcaa-linux-arm64:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
           git checkout FETCH_HEAD
-      - name: Install dependencies
-        env:
-          DEBIAN_FRONTEND: noninteractive
-        run: |
-          # Step 1: Host tools + cross-compiler (all amd64, no multiarch yet)
-          apt-get update -qq
-          apt-get install -y -qq curl git gcc g++ make patchelf pkg-config perl jq \
-            gcc-aarch64-linux-gnu g++-aarch64-linux-gnu
-
-          # Step 2: Multiarch — Ubuntu uses ports.ubuntu.com for arm64,
-          # keeping it on a separate mirror from amd64 (archive.ubuntu.com).
-          # This avoids the binary-all index duplication and -dev package
-          # conflicts that plagued the Debian single-mirror approach.
-          dpkg --add-architecture arm64
-          sed -i 's|^deb http://archive.ubuntu.com|deb [arch=amd64] http://archive.ubuntu.com|g' /etc/apt/sources.list
-          sed -i 's|^deb http://security.ubuntu.com|deb [arch=amd64] http://security.ubuntu.com|g' /etc/apt/sources.list
-          printf '%s\n' \
-            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy main restricted universe multiverse' \
-            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-updates main restricted universe multiverse' \
-            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-security main restricted universe multiverse' \
-            > /etc/apt/sources.list.d/arm64-ports.list
-          apt-get update -qq
-
-          # Step 3: ARM64 dev libs — libayatana omitted (no tray icon in this app)
-          apt-get install -y -qq \
-            libwebkit2gtk-4.1-dev:arm64 \
-            libssl-dev:arm64 \
-            libgtk-3-dev:arm64 \
-            librsvg2-dev:arm64
-
-          # Step 4: Node.js
-          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
-          apt-get install -y nodejs
-
-          # Step 5: Rust (not pre-installed in ubuntu:22.04)
-          # source "$HOME/.cargo/env" in the Build step handles PATH — no GITHUB_PATH needed
-          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
-            --default-toolchain 1.88.0 --profile minimal --no-modify-path
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-arm64-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-arm64-
+      - name: Cache npm
+        uses: actions/cache@v4
+        with:
+          path: ~/.npm
+          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
+          restore-keys: |
+            ${{ runner.os }}-npm-
       - name: Build
         env:
           CC_aarch64_unknown_linux_gnu: aarch64-linux-gnu-gcc
@@ -452,9 +581,7 @@
           OPENSSL_STATIC: "1"
           APPIMAGE_EXTRACT_AND_RUN: "1"
         run: |
-          . "$HOME/.cargo/env"
           npm ci --legacy-peer-deps
-          rustup target add aarch64-unknown-linux-gnu
           CI=true npx tauri build --target aarch64-unknown-linux-gnu --bundles deb,rpm
       - name: Upload artifacts
         env:
```
**CI image build workflow** (three jobs switch from `docker:24-cli` to `alpine:latest`, installing `docker-cli` at runtime)

```diff
@@ -37,11 +37,11 @@ jobs:
   linux-amd64:
     runs-on: linux-amd64
     container:
-      image: docker:24-cli
+      image: alpine:latest
     steps:
       - name: Checkout
         run: |
-          apk add --no-cache git
+          apk add --no-cache git docker-cli
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
@@ -60,11 +60,11 @@ jobs:
   windows-cross:
     runs-on: linux-amd64
     container:
-      image: docker:24-cli
+      image: alpine:latest
     steps:
       - name: Checkout
         run: |
-          apk add --no-cache git
+          apk add --no-cache git docker-cli
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
@@ -83,11 +83,11 @@ jobs:
   linux-arm64:
     runs-on: linux-amd64
     container:
-      image: docker:24-cli
+      image: alpine:latest
    steps:
       - name: Checkout
         run: |
-          apk add --no-cache git
+          apk add --no-cache git docker-cli
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           git fetch --depth=1 origin "$GITHUB_SHA"
```
**.gitea/workflows/pr-review.yml** (new file, 134 lines)

```yaml
name: PR Review Automation

on:
  pull_request:
    types: [opened, synchronize, reopened, edited]

concurrency:
  group: pr-review-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  review:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
    container:
      image: ubuntu:22.04
      options: --dns 8.8.8.8 --dns 1.1.1.1
    steps:
      - name: Install dependencies
        shell: bash
        run: |
          set -euo pipefail
          apt-get update -qq && apt-get install -y -qq git curl jq

      - name: Checkout code
        shell: bash
        env:
          REPOSITORY: ${{ github.repository }}
        run: |
          set -euo pipefail
          git init
          git remote add origin "https://gogs.tftsr.com/${REPOSITORY}.git"
          git fetch --depth=1 origin ${{ github.head_ref }}
          git checkout FETCH_HEAD

      - name: Get PR diff
        id: diff
        shell: bash
        run: |
          set -euo pipefail
          git fetch origin ${{ github.base_ref }}
          git diff origin/${{ github.base_ref }}..HEAD > /tmp/pr_diff.txt
          echo "diff_size=$(wc -l < /tmp/pr_diff.txt | tr -d ' ')" >> $GITHUB_OUTPUT

      - name: Analyze with LLM
        id: analyze
        if: steps.diff.outputs.diff_size != '0'
        shell: bash
        env:
          LITELLM_URL: http://172.0.0.29:11434/v1
          LITELLM_API_KEY: ${{ secrets.OLLAMA_API_KEY }}
          PR_TITLE: ${{ github.event.pull_request.title }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        run: |
          set -euo pipefail
          if grep -q "^Binary files" /tmp/pr_diff.txt; then
            echo "WARNING: Binary file changes detected — they will be excluded from analysis"
          fi
          DIFF_CONTENT=$(head -n 500 /tmp/pr_diff.txt \
            | grep -v -E '^[+-].*(password[[:space:]]*[=:"'"'"']|token[[:space:]]*[=:"'"'"']|secret[[:space:]]*[=:"'"'"']|api_key[[:space:]]*[=:"'"'"']|private_key[[:space:]]*[=:"'"'"']|Authorization:[[:space:]]|AKIA[A-Z0-9]{16}|xox[baprs]-[0-9]{10,13}-[0-9]{10,13}-[a-zA-Z0-9]{24}|gh[opsu]_[A-Za-z0-9_]{36,}|https?://[^@[:space:]]+:[^@[:space:]]+@)' \
            | grep -v -E '^[+-].*[A-Za-z0-9+/]{40,}={0,2}([^A-Za-z0-9+/=]|$)')
          PROMPT="Analyze the following code changes for correctness, security issues, and best practices. PR Title: ${PR_TITLE}\n\nDiff:\n${DIFF_CONTENT}\n\nProvide a review with: 1) Summary, 2) Bugs/errors, 3) Security issues, 4) Best practices. Give specific comments with suggested fixes."
          BODY=$(jq -cn \
            --arg model "qwen2.5-72b" \
            --arg content "$PROMPT" \
            '{model: $model, messages: [{role: "user", content: $content}], stream: false}')
          echo "[$(date -u +%Y-%m-%dT%H:%M:%SZ)] PR #${PR_NUMBER} - Calling liteLLM API (${#BODY} bytes)..."
          HTTP_CODE=$(curl -s --max-time 300 --connect-timeout 30 \
            --retry 3 --retry-delay 10 --retry-connrefused --retry-max-time 300 \
            -o /tmp/llm_response.json -w "%{http_code}" \
            -X POST "$LITELLM_URL/chat/completions" \
            -H "Authorization: Bearer $LITELLM_API_KEY" \
            -H "Content-Type: application/json" \
            -d "$BODY")
          echo "HTTP status: $HTTP_CODE"
          echo "Response file size: $(wc -c < /tmp/llm_response.json) bytes"
          if [ "$HTTP_CODE" != "200" ]; then
            echo "ERROR: liteLLM returned HTTP $HTTP_CODE"
            cat /tmp/llm_response.json
            exit 1
          fi
          if ! jq empty /tmp/llm_response.json 2>/dev/null; then
            echo "ERROR: Invalid JSON response from liteLLM"
            cat /tmp/llm_response.json
            exit 1
          fi
          REVIEW=$(jq -r '.choices[0].message.content // empty' /tmp/llm_response.json)
          if [ -z "$REVIEW" ]; then
            echo "ERROR: No content in liteLLM response"
            exit 1
          fi
          echo "Review length: ${#REVIEW} chars"
          echo "$REVIEW" > /tmp/pr_review.txt

      - name: Post review comment
        if: always() && steps.diff.outputs.diff_size != '0'
        shell: bash
        env:
          TF_TOKEN: ${{ secrets.TFT_GITEA_TOKEN }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
          REPOSITORY: ${{ github.repository }}
        run: |
          set -euo pipefail
          if [ -z "${TF_TOKEN:-}" ]; then
            echo "ERROR: TFT_GITEA_TOKEN secret is not set"
            exit 1
          fi
          if [ -f "/tmp/pr_review.txt" ] && [ -s "/tmp/pr_review.txt" ]; then
            REVIEW_BODY=$(head -c 65536 /tmp/pr_review.txt)
            BODY=$(jq -n \
              --arg body "Automated PR Review (qwen2.5-72b via liteLLM):\n\n${REVIEW_BODY}\n\n---\n*automated code review*" \
              '{body: $body, event: "COMMENT"}')
          else
            BODY=$(jq -n \
              '{body: "Automated PR Review could not be completed - LLM analysis failed or produced no output.", event: "COMMENT"}')
          fi
          HTTP_CODE=$(curl -s --max-time 30 --connect-timeout 10 \
            -o /tmp/review_post_response.json -w "%{http_code}" \
            -X POST "https://gogs.tftsr.com/api/v1/repos/${REPOSITORY}/pulls/${PR_NUMBER}/reviews" \
            -H "Authorization: Bearer $TF_TOKEN" \
            -H "Content-Type: application/json" \
            -d "$BODY")
          echo "[$(date -u +%Y-%m-%dT%H:%M:%SZ)] Post review HTTP status: $HTTP_CODE"
          if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
            echo "ERROR: Failed to post review (HTTP $HTTP_CODE)"
            cat /tmp/review_post_response.json
            exit 1
          fi

      - name: Cleanup
        if: always()
        shell: bash
        run: rm -f /tmp/pr_diff.txt /tmp/llm_response.json /tmp/pr_review.txt /tmp/review_post_response.json
```
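The secret-filtering pipeline in the `Analyze with LLM` step can be sanity-checked in isolation before trusting it with real diffs. A minimal sketch using a simplified subset of the workflow's regex (the helper name and the sample diff lines are fabricated for the test):

```shell
# Simplified version of the workflow's secret filter: drop added/removed
# diff lines that look like credential assignments, then drop lines with
# long base64-like blobs, before the diff is sent to the LLM. The real
# workflow regex additionally covers AWS keys, Slack/GitHub token shapes,
# Authorization headers, and URLs with embedded credentials.
filter_secrets() {
  grep -v -E '^[+-].*(password[[:space:]]*[=:]|token[[:space:]]*[=:]|api_key[[:space:]]*[=:])' \
    | grep -v -E '^[+-].*[A-Za-z0-9+/]{40,}={0,2}([^A-Za-z0-9+/=]|$)'
}

printf '%s\n' \
  '+password = "hunter2"' \
  '+const retries = 3' \
  '-token: abc' \
  | filter_secrets
```

Note the anchor `^[+-]` means only added/removed lines are redacted; unchanged context lines pass through, which matches the workflow's behavior.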
**Test workflow**

```diff
@@ -1,18 +1,20 @@
 name: Test
 
 on:
   push:
     branches:
       - master
   pull_request:
 
 jobs:
   rust-fmt-check:
     runs-on: ubuntu-latest
     container:
-      image: rust:1.88-slim
+      image: 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           set -eux
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           if [ -n "${GITHUB_SHA:-}" ] && git fetch --depth=1 origin "$GITHUB_SHA"; then
@@ -28,18 +30,31 @@ jobs:
             echo "Fetched fallback ref: master"
           fi
           git checkout FETCH_HEAD
-      - run: rustup component add rustfmt
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-linux-amd64-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-linux-amd64-
+      - name: Install dependencies
+        run: npm install --legacy-peer-deps
+      - name: Update version from Git
+        run: node scripts/update-version.mjs
       - run: cargo generate-lockfile --manifest-path src-tauri/Cargo.toml
       - run: cargo fmt --manifest-path src-tauri/Cargo.toml --check
 
   rust-clippy:
     runs-on: ubuntu-latest
     container:
-      image: rust:1.88-slim
+      image: 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           set -eux
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           if [ -n "${GITHUB_SHA:-}" ] && git fetch --depth=1 origin "$GITHUB_SHA"; then
@@ -55,19 +70,26 @@ jobs:
             echo "Fetched fallback ref: master"
           fi
           git checkout FETCH_HEAD
-      - run: apt-get update -qq && apt-get install -y -qq libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev patchelf pkg-config perl
-      - run: rustup component add clippy
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-linux-amd64-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-linux-amd64-
       - run: cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings
 
   rust-tests:
     runs-on: ubuntu-latest
     container:
-      image: rust:1.88-slim
+      image: 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
     steps:
       - name: Checkout
         run: |
           set -eux
           apt-get update -qq && apt-get install -y -qq git
           git init
           git remote add origin http://172.0.0.29:3000/sarman/tftsr-devops_investigation.git
           if [ -n "${GITHUB_SHA:-}" ] && git fetch --depth=1 origin "$GITHUB_SHA"; then
@@ -83,8 +105,17 @@ jobs:
             echo "Fetched fallback ref: master"
           fi
           git checkout FETCH_HEAD
-      - run: apt-get update -qq && apt-get install -y -qq libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev patchelf pkg-config perl
-      - run: cargo test --manifest-path src-tauri/Cargo.toml
+      - name: Cache cargo registry
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry/index
+            ~/.cargo/registry/cache
+            ~/.cargo/git/db
+          key: ${{ runner.os }}-cargo-linux-amd64-${{ hashFiles('**/Cargo.lock') }}
+          restore-keys: |
+            ${{ runner.os }}-cargo-linux-amd64-
+      - run: cargo test --manifest-path src-tauri/Cargo.toml -- --test-threads=1
 
   frontend-typecheck:
     runs-on: ubuntu-latest
@@ -110,6 +141,13 @@ jobs:
             echo "Fetched fallback ref: master"
           fi
           git checkout FETCH_HEAD
+      - name: Cache npm
+        uses: actions/cache@v4
+        with:
+          path: ~/.npm
+          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
+          restore-keys: |
+            ${{ runner.os }}-npm-
       - run: npm ci --legacy-peer-deps
       - run: npx tsc --noEmit
 
@@ -137,5 +175,12 @@ jobs:
             echo "Fetched fallback ref: master"
           fi
           git checkout FETCH_HEAD
+      - name: Cache npm
+        uses: actions/cache@v4
+        with:
+          path: ~/.npm
+          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
+          restore-keys: |
+            ${{ runner.os }}-npm-
       - run: npm ci --legacy-peer-deps
       - run: npm run test:run
```
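The `hashFiles('**/Cargo.lock')` cache keys used throughout these jobs rotate whenever the lockfile content changes. A rough shell analogue, handy for reasoning about why a key rotated; this only approximates `hashFiles` (which computes SHA-256 over the matched files), and the path and key prefix are illustrative:

```shell
# Approximate an actions/cache key locally: hash a lockfile and build
# "<os>-cargo-linux-amd64-<hash>" so you can see when the key rotates.
# Any edit to the lockfile (even whitespace) produces a different key,
# which is exactly when a fresh cargo registry cache is wanted.
printf 'fake lockfile contents\n' > /tmp/Cargo.lock
HASH=$(sha256sum /tmp/Cargo.lock | cut -d' ' -f1)
KEY="Linux-cargo-linux-amd64-${HASH}"
echo "$KEY"
```

The `restore-keys` prefix (`Linux-cargo-linux-amd64-` here) then lets a job fall back to the newest cache with a matching prefix when the exact key misses.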
95
.github/workflows/build-images.yml
vendored
Normal file
95
.github/workflows/build-images.yml
vendored
Normal file
@ -0,0 +1,95 @@
name: Build CI Docker Images

# Rebuilds the pre-baked builder images and pushes them to the local Gitea
# container registry (172.0.0.29:3000).
#
# WHEN TO RUN:
# - Automatically: whenever a Dockerfile under .docker/ changes on master.
# - Manually: via workflow_dispatch (e.g. first-time setup, forced rebuild).
#
# ONE-TIME SERVER PREREQUISITE (run once on 172.0.0.29 before first use):
#   echo '{"insecure-registries":["172.0.0.29:3000"]}' \
#     | sudo tee /etc/docker/daemon.json
#   sudo systemctl restart docker
#
# Images produced:
#   172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
#   172.0.0.29:3000/sarman/trcaa-windows-cross:rust1.88-node22
#   172.0.0.29:3000/sarman/trcaa-linux-arm64:rust1.88-node22

on:
  push:
    branches:
      - master
    paths:
      - '.docker/**'
  workflow_dispatch:

concurrency:
  group: build-ci-images
  cancel-in-progress: false

env:
  REGISTRY: 172.0.0.29:3000
  REGISTRY_USER: sarman

jobs:
  linux-amd64:
    runs-on: linux-amd64
    container:
      image: docker:24-cli
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Build and push linux-amd64 builder
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
          docker build \
            -t $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22 \
            -f .docker/Dockerfile.linux-amd64 .
          docker push $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22
          echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-linux-amd64:rust1.88-node22"

  windows-cross:
    runs-on: linux-amd64
    container:
      image: docker:24-cli
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Build and push windows-cross builder
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
          docker build \
            -t $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22 \
            -f .docker/Dockerfile.windows-cross .
          docker push $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22
          echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-windows-cross:rust1.88-node22"

  linux-arm64:
    runs-on: linux-amd64
    container:
      image: docker:24-cli
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Build and push linux-arm64 builder
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          echo "$RELEASE_TOKEN" | docker login $REGISTRY -u $REGISTRY_USER --password-stdin
          docker build \
            -t $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22 \
            -f .docker/Dockerfile.linux-arm64 .
          docker push $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22
          echo "✓ Pushed $REGISTRY/$REGISTRY_USER/trcaa-linux-arm64:rust1.88-node22"

504  .github/workflows/release.yml  vendored  Normal file
@ -0,0 +1,504 @@
name: Auto Tag

# Runs on every merge to master — reads the latest semver tag, increments
# the patch version, pushes a new tag, then runs release builds in this workflow.
# workflow_dispatch allows manual triggering when Gitea drops a push event.

on:
  push:
    branches:
      - master
  workflow_dispatch:

concurrency:
  group: auto-tag-master
  cancel-in-progress: false

jobs:
  autotag:
    runs-on: linux-amd64
    container:
      image: alpine:latest
    steps:
      - name: Bump patch version and create tag
        id: bump
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          set -eu
          apk add --no-cache curl jq git

          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"

          # Get the latest clean semver tag (vX.Y.Z only, ignore rc/test suffixes)
          LATEST=$(curl -s "$API/tags?limit=50" \
            -H "Authorization: token $RELEASE_TOKEN" | \
            jq -r '.[].name' | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | \
            sort -V | tail -1)

          if [ -z "$LATEST" ]; then
            NEXT="v0.1.0"
          else
            MAJOR=$(echo "$LATEST" | cut -d. -f1 | tr -d 'v')
            MINOR=$(echo "$LATEST" | cut -d. -f2)
            PATCH=$(echo "$LATEST" | cut -d. -f3)
            NEXT="v${MAJOR}.${MINOR}.$((PATCH + 1))"
          fi

          echo "Latest tag: ${LATEST:-none} → Next: $NEXT"

          # Create and push the tag via git.
          git init
          git remote add origin "http://oauth2:${RELEASE_TOKEN}@172.0.0.29:3000/${GITHUB_REPOSITORY}.git"
          git fetch --depth=1 origin "$GITHUB_SHA"
          git checkout FETCH_HEAD
          git config user.name "gitea-actions[bot]"
          git config user.email "gitea-actions@local"

          if git ls-remote --exit-code --tags origin "refs/tags/$NEXT" >/dev/null 2>&1; then
            echo "Tag $NEXT already exists; skipping."
            exit 0
          fi

          git tag -a "$NEXT" -m "Release $NEXT"
          git push origin "refs/tags/$NEXT"

          echo "Tag $NEXT pushed successfully"
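
The tag-selection and bump logic in this step is plain text processing, so it can be sanity-checked offline. A minimal sketch of the same grep/`sort -V`/`cut` pipeline (the tag names below are illustrative):

```shell
# Compute the next patch version from a newline-separated tag list,
# mirroring the autotag step: keep only clean vX.Y.Z tags, version-sort,
# take the highest, and bump the patch component.
next_patch() {
  latest=$(printf '%s\n' "$1" | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | sort -V | tail -1)
  if [ -z "$latest" ]; then
    echo "v0.1.0"
    return
  fi
  major=$(echo "$latest" | cut -d. -f1 | tr -d 'v')
  minor=$(echo "$latest" | cut -d. -f2)
  patch=$(echo "$latest" | cut -d. -f3)
  echo "v${major}.${minor}.$((patch + 1))"
}

tags='v0.1.9
v0.1.10
v0.2.0-rc1'
next_patch "$tags"   # v0.1.10 is the highest clean tag, so this prints v0.1.11
```

Note that the rc suffix is filtered out by the regex, and `sort -V` is what ranks `v0.1.10` above `v0.1.9`.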
  wiki-sync:
    runs-on: linux-amd64
    container:
      image: alpine:latest
    steps:
      - name: Install dependencies
        run: apk add --no-cache git

      - name: Checkout main repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Configure git
        run: |
          git config --global user.email "actions@gitea.local"
          git config --global user.name "Gitea Actions"
          git config --global credential.helper ''

      - name: Clone and sync wiki
        env:
          WIKI_TOKEN: ${{ secrets.Wiki }}
        run: |
          cd /tmp
          if [ -n "$WIKI_TOKEN" ]; then
            WIKI_URL="http://${WIKI_TOKEN}@172.0.0.29:3000/sarman/tftsr-devops_investigation.wiki.git"
          else
            WIKI_URL="http://172.0.0.29:3000/sarman/tftsr-devops_investigation.wiki.git"
          fi

          if ! git clone "$WIKI_URL" wiki 2>/dev/null; then
            echo "Wiki doesn't exist yet, creating initial structure..."
            mkdir -p wiki
            cd wiki
            git init
            git checkout -b master
            echo "# Wiki" > Home.md
            git add Home.md
            git commit -m "Initial wiki commit"
            git remote add origin "$WIKI_URL"
          fi

          cd /tmp/wiki
          if [ -d "$GITHUB_WORKSPACE/docs/wiki" ]; then
            cp -v "$GITHUB_WORKSPACE"/docs/wiki/*.md . 2>/dev/null || echo "No wiki files to copy"
          fi

          git add -A
          if ! git diff --staged --quiet; then
            git commit -m "docs: sync from docs/wiki/ at commit ${GITHUB_SHA:0:8}"
            echo "Pushing to wiki..."
            if git push origin master; then
              echo "✓ Wiki successfully synced"
            else
              echo "⚠ Wiki push failed - check token permissions"
              exit 1
            fi
          else
            echo "No wiki changes to commit"
          fi
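
One portability note on the commit message: `${GITHUB_SHA:0:8}` is a Bash/ash substring extension rather than strict POSIX sh. If this job's shell ever changes to a strictly POSIX one, `cut` is a portable equivalent:

```shell
# Shorten a 40-char commit SHA to 8 chars using only POSIX tools.
GITHUB_SHA="a1b2c3d4e5f6a7b8c9d0a1b2c3d4e5f6a7b8c9d0"   # example value
SHORT=$(echo "$GITHUB_SHA" | cut -c1-8)
echo "$SHORT"   # a1b2c3d4
```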
  build-linux-amd64:
    needs: autotag
    runs-on: linux-amd64
    container:
      image: rust:1.88-slim
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Install dependencies
        run: |
          apt-get update -qq && apt-get install -y -qq \
            libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev \
            libayatana-appindicator3-dev librsvg2-dev patchelf \
            pkg-config curl perl jq
          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
          apt-get install -y nodejs
      - name: Build
        run: |
          npm ci --legacy-peer-deps
          rustup target add x86_64-unknown-linux-gnu
          CI=true npx tauri build --target x86_64-unknown-linux-gnu
      - name: Upload artifacts
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          set -eu
          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
          TAG=$(curl -s "$API/tags?limit=50" \
            -H "Authorization: token $RELEASE_TOKEN" | \
            jq -r '.[].name' | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | \
            sort -V | tail -1 || true)
          if [ -z "$TAG" ]; then
            echo "ERROR: Could not resolve release tag from repository tags."
            exit 1
          fi
          echo "Creating release for $TAG..."
          curl -sf -X POST "$API/releases" \
            -H "Authorization: token $RELEASE_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{\"tag_name\":\"$TAG\",\"name\":\"TFTSR $TAG\",\"body\":\"Release $TAG\",\"draft\":false}" || true
          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
            echo "ERROR: Failed to get release ID for $TAG"
            exit 1
          fi
          echo "Release ID: $RELEASE_ID"
          ARTIFACTS=$(find src-tauri/target/x86_64-unknown-linux-gnu/release/bundle -type f \
            \( -name "*.deb" -o -name "*.rpm" -o -name "*.AppImage" \))
          if [ -z "$ARTIFACTS" ]; then
            echo "ERROR: No Linux amd64 artifacts were found to upload."
            exit 1
          fi
          printf '%s\n' "$ARTIFACTS" | while IFS= read -r f; do
            NAME=$(basename "$f")
            UPLOAD_NAME="linux-amd64-$NAME"
            echo "Uploading $UPLOAD_NAME..."
            EXISTING_IDS=$(curl -sf "$API/releases/$RELEASE_ID" \
              -H "Authorization: token $RELEASE_TOKEN" \
              | jq -r --arg name "$UPLOAD_NAME" '.assets[]? | select(.name == $name) | .id')
            if [ -n "$EXISTING_IDS" ]; then
              printf '%s\n' "$EXISTING_IDS" | while IFS= read -r id; do
                [ -n "$id" ] || continue
                echo "Deleting existing asset id=$id name=$UPLOAD_NAME before upload..."
                curl -sf -X DELETE "$API/releases/$RELEASE_ID/assets/$id" \
                  -H "Authorization: token $RELEASE_TOKEN"
              done
            fi
            RESP_FILE=$(mktemp)
            HTTP_CODE=$(curl -sS -o "$RESP_FILE" -w "%{http_code}" -X POST "$API/releases/$RELEASE_ID/assets" \
              -H "Authorization: token $RELEASE_TOKEN" \
              -F "attachment=@$f;filename=$UPLOAD_NAME")
            if [ "$HTTP_CODE" -ge 200 ] && [ "$HTTP_CODE" -lt 300 ]; then
              echo "✓ Uploaded $UPLOAD_NAME"
            else
              echo "✗ Upload failed for $UPLOAD_NAME (HTTP $HTTP_CODE)"
              # Dump the first 2 KB of the error body; head is always present,
              # unlike python, which is not installed in rust:1.88-slim.
              head -c 2000 "$RESP_FILE"; echo
              exit 1
            fi
          done

  build-windows-amd64:
    needs: autotag
    runs-on: linux-amd64
    container:
      image: rust:1.88-slim
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Install dependencies
        run: |
          apt-get update -qq && apt-get install -y -qq mingw-w64 curl nsis perl make jq
          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
          apt-get install -y nodejs
      - name: Build
        env:
          CC_x86_64_pc_windows_gnu: x86_64-w64-mingw32-gcc
          CXX_x86_64_pc_windows_gnu: x86_64-w64-mingw32-g++
          AR_x86_64_pc_windows_gnu: x86_64-w64-mingw32-ar
          CARGO_TARGET_X86_64_PC_WINDOWS_GNU_LINKER: x86_64-w64-mingw32-gcc
          OPENSSL_NO_VENDOR: "0"
          OPENSSL_STATIC: "1"
        run: |
          npm ci --legacy-peer-deps
          rustup target add x86_64-pc-windows-gnu
          CI=true npx tauri build --target x86_64-pc-windows-gnu
      - name: Upload artifacts
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          set -eu
          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
          TAG=$(curl -s "$API/tags?limit=50" \
            -H "Authorization: token $RELEASE_TOKEN" | \
            jq -r '.[].name' | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | \
            sort -V | tail -1 || true)
          if [ -z "$TAG" ]; then
            echo "ERROR: Could not resolve release tag from repository tags."
            exit 1
          fi
          echo "Creating release for $TAG..."
          curl -sf -X POST "$API/releases" \
            -H "Authorization: token $RELEASE_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{\"tag_name\":\"$TAG\",\"name\":\"TFTSR $TAG\",\"body\":\"Release $TAG\",\"draft\":false}" || true
          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
            echo "ERROR: Failed to get release ID for $TAG"
            exit 1
          fi
          echo "Release ID: $RELEASE_ID"
          ARTIFACTS=$(find src-tauri/target/x86_64-pc-windows-gnu/release/bundle -type f \
            \( -name "*.exe" -o -name "*.msi" \) 2>/dev/null)
          if [ -z "$ARTIFACTS" ]; then
            echo "ERROR: No Windows amd64 artifacts were found to upload."
            exit 1
          fi
          printf '%s\n' "$ARTIFACTS" | while IFS= read -r f; do
            NAME=$(basename "$f")
            echo "Uploading $NAME..."
            EXISTING_IDS=$(curl -sf "$API/releases/$RELEASE_ID" \
              -H "Authorization: token $RELEASE_TOKEN" \
              | jq -r --arg name "$NAME" '.assets[]? | select(.name == $name) | .id')
            if [ -n "$EXISTING_IDS" ]; then
              printf '%s\n' "$EXISTING_IDS" | while IFS= read -r id; do
                [ -n "$id" ] || continue
                echo "Deleting existing asset id=$id name=$NAME before upload..."
                curl -sf -X DELETE "$API/releases/$RELEASE_ID/assets/$id" \
                  -H "Authorization: token $RELEASE_TOKEN"
              done
            fi
            RESP_FILE=$(mktemp)
            HTTP_CODE=$(curl -sS -o "$RESP_FILE" -w "%{http_code}" -X POST "$API/releases/$RELEASE_ID/assets" \
              -H "Authorization: token $RELEASE_TOKEN" \
              -F "attachment=@$f;filename=$NAME")
            if [ "$HTTP_CODE" -ge 200 ] && [ "$HTTP_CODE" -lt 300 ]; then
              echo "✓ Uploaded $NAME"
            else
              echo "✗ Upload failed for $NAME (HTTP $HTTP_CODE)"
              # Dump the first 2 KB of the error body; head is always present,
              # unlike python, which is not installed in rust:1.88-slim.
              head -c 2000 "$RESP_FILE"; echo
              exit 1
            fi
          done

  build-macos-arm64:
    needs: autotag
    runs-on: macos-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Build
        env:
          MACOSX_DEPLOYMENT_TARGET: "11.0"
        run: |
          npm ci --legacy-peer-deps
          rustup target add aarch64-apple-darwin
          CI=true npx tauri build --target aarch64-apple-darwin --bundles app
          APP=$(find src-tauri/target/aarch64-apple-darwin/release/bundle/macos -maxdepth 1 -type d -name "*.app" | head -n 1)
          if [ -z "$APP" ]; then
            echo "ERROR: Could not find macOS app bundle"
            exit 1
          fi
          APP_NAME=$(basename "$APP" .app)
          codesign --deep --force --sign - "$APP"
          mkdir -p src-tauri/target/aarch64-apple-darwin/release/bundle/dmg
          DMG=src-tauri/target/aarch64-apple-darwin/release/bundle/dmg/${APP_NAME}.dmg
          hdiutil create -volname "$APP_NAME" -srcfolder "$APP" -ov -format UDZO "$DMG"
      - name: Upload artifacts
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          set -eu
          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
          TAG=$(curl -s "$API/tags?limit=50" \
            -H "Authorization: token $RELEASE_TOKEN" | \
            jq -r '.[].name' | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | \
            sort -V | tail -1 || true)
          if [ -z "$TAG" ]; then
            echo "ERROR: Could not resolve release tag from repository tags."
            exit 1
          fi
          echo "Creating release for $TAG..."
          curl -sf -X POST "$API/releases" \
            -H "Authorization: token $RELEASE_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{\"tag_name\":\"$TAG\",\"name\":\"TFTSR $TAG\",\"body\":\"Release $TAG\",\"draft\":false}" || true
          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
            echo "ERROR: Failed to get release ID for $TAG"
            exit 1
          fi
          echo "Release ID: $RELEASE_ID"
          ARTIFACTS=$(find src-tauri/target/aarch64-apple-darwin/release/bundle -type f -name "*.dmg")
          if [ -z "$ARTIFACTS" ]; then
            echo "ERROR: No macOS arm64 DMG artifacts were found to upload."
            exit 1
          fi
          printf '%s\n' "$ARTIFACTS" | while IFS= read -r f; do
            NAME=$(basename "$f")
            echo "Uploading $NAME..."
            EXISTING_IDS=$(curl -sf "$API/releases/$RELEASE_ID" \
              -H "Authorization: token $RELEASE_TOKEN" \
              | jq -r --arg name "$NAME" '.assets[]? | select(.name == $name) | .id')
            if [ -n "$EXISTING_IDS" ]; then
              printf '%s\n' "$EXISTING_IDS" | while IFS= read -r id; do
                [ -n "$id" ] || continue
                echo "Deleting existing asset id=$id name=$NAME before upload..."
                curl -sf -X DELETE "$API/releases/$RELEASE_ID/assets/$id" \
                  -H "Authorization: token $RELEASE_TOKEN"
              done
            fi
            RESP_FILE=$(mktemp)
            HTTP_CODE=$(curl -sS -o "$RESP_FILE" -w "%{http_code}" -X POST "$API/releases/$RELEASE_ID/assets" \
              -H "Authorization: token $RELEASE_TOKEN" \
              -F "attachment=@$f;filename=$NAME")
            if [ "$HTTP_CODE" -ge 200 ] && [ "$HTTP_CODE" -lt 300 ]; then
              echo "✓ Uploaded $NAME"
            else
              echo "✗ Upload failed for $NAME (HTTP $HTTP_CODE)"
              # Dump the first 2 KB of the error body; modern macOS runners
              # ship python3 but no bare `python`, so use head instead.
              head -c 2000 "$RESP_FILE"; echo
              exit 1
            fi
          done

  build-linux-arm64:
    needs: autotag
    runs-on: linux-amd64
    container:
      image: ubuntu:22.04
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Install dependencies
        env:
          DEBIAN_FRONTEND: noninteractive
        run: |
          # Step 1: Host tools + cross-compiler (all amd64, no multiarch yet)
          apt-get update -qq
          apt-get install -y -qq curl git gcc g++ make patchelf pkg-config perl jq \
            gcc-aarch64-linux-gnu g++-aarch64-linux-gnu

          # Step 2: Multiarch — Ubuntu uses ports.ubuntu.com for arm64,
          # keeping it on a separate mirror from amd64 (archive.ubuntu.com).
          # This avoids the binary-all index duplication and -dev package
          # conflicts that plagued the Debian single-mirror approach.
          dpkg --add-architecture arm64
          sed -i 's|^deb http://archive.ubuntu.com|deb [arch=amd64] http://archive.ubuntu.com|g' /etc/apt/sources.list
          sed -i 's|^deb http://security.ubuntu.com|deb [arch=amd64] http://security.ubuntu.com|g' /etc/apt/sources.list
          printf '%s\n' \
            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy main restricted universe multiverse' \
            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-updates main restricted universe multiverse' \
            'deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-security main restricted universe multiverse' \
            > /etc/apt/sources.list.d/arm64-ports.list
          apt-get update -qq

          # Step 3: ARM64 dev libs — libayatana omitted (no tray icon in this app)
          apt-get install -y -qq \
            libwebkit2gtk-4.1-dev:arm64 \
            libssl-dev:arm64 \
            libgtk-3-dev:arm64 \
            librsvg2-dev:arm64

          # Step 4: Node.js
          curl -fsSL https://deb.nodesource.com/setup_22.x | bash -
          apt-get install -y nodejs

          # Step 5: Rust (not pre-installed in ubuntu:22.04)
          # source "$HOME/.cargo/env" in the Build step handles PATH — no GITHUB_PATH needed
          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
            --default-toolchain 1.88.0 --profile minimal --no-modify-path
      - name: Build
        env:
          CC_aarch64_unknown_linux_gnu: aarch64-linux-gnu-gcc
          CXX_aarch64_unknown_linux_gnu: aarch64-linux-gnu-g++
          AR_aarch64_unknown_linux_gnu: aarch64-linux-gnu-ar
          CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER: aarch64-linux-gnu-gcc
          PKG_CONFIG_SYSROOT_DIR: /usr/aarch64-linux-gnu
          PKG_CONFIG_PATH: /usr/lib/aarch64-linux-gnu/pkgconfig
          PKG_CONFIG_ALLOW_CROSS: "1"
          OPENSSL_NO_VENDOR: "0"
          OPENSSL_STATIC: "1"
          APPIMAGE_EXTRACT_AND_RUN: "1"
        run: |
          . "$HOME/.cargo/env"
          npm ci --legacy-peer-deps
          rustup target add aarch64-unknown-linux-gnu
          CI=true npx tauri build --target aarch64-unknown-linux-gnu --bundles deb,rpm
      - name: Upload artifacts
        env:
          RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
        run: |
          set -eu
          API="http://172.0.0.29:3000/api/v1/repos/$GITHUB_REPOSITORY"
          TAG=$(curl -s "$API/tags?limit=50" \
            -H "Authorization: token $RELEASE_TOKEN" | \
            jq -r '.[].name' | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | \
            sort -V | tail -1 || true)
          if [ -z "$TAG" ]; then
            echo "ERROR: Could not resolve release tag from repository tags."
            exit 1
          fi
          echo "Creating release for $TAG..."
          curl -sf -X POST "$API/releases" \
            -H "Authorization: token $RELEASE_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{\"tag_name\":\"$TAG\",\"name\":\"TFTSR $TAG\",\"body\":\"Release $TAG\",\"draft\":false}" || true
          RELEASE_ID=$(curl -sf "$API/releases/tags/$TAG" \
            -H "Authorization: token $RELEASE_TOKEN" | jq -r '.id')
          if [ -z "$RELEASE_ID" ] || [ "$RELEASE_ID" = "null" ]; then
            echo "ERROR: Failed to get release ID for $TAG"
            exit 1
          fi
          echo "Release ID: $RELEASE_ID"
          ARTIFACTS=$(find src-tauri/target/aarch64-unknown-linux-gnu/release/bundle -type f \
            \( -name "*.deb" -o -name "*.rpm" -o -name "*.AppImage" \))
          if [ -z "$ARTIFACTS" ]; then
            echo "ERROR: No Linux arm64 artifacts were found to upload."
            exit 1
          fi
          printf '%s\n' "$ARTIFACTS" | while IFS= read -r f; do
            NAME=$(basename "$f")
            UPLOAD_NAME="linux-arm64-$NAME"
            echo "Uploading $UPLOAD_NAME..."
            EXISTING_IDS=$(curl -sf "$API/releases/$RELEASE_ID" \
              -H "Authorization: token $RELEASE_TOKEN" \
              | jq -r --arg name "$UPLOAD_NAME" '.assets[]? | select(.name == $name) | .id')
            if [ -n "$EXISTING_IDS" ]; then
              printf '%s\n' "$EXISTING_IDS" | while IFS= read -r id; do
                [ -n "$id" ] || continue
                echo "Deleting existing asset id=$id name=$UPLOAD_NAME before upload..."
                curl -sf -X DELETE "$API/releases/$RELEASE_ID/assets/$id" \
                  -H "Authorization: token $RELEASE_TOKEN"
              done
            fi
            RESP_FILE=$(mktemp)
            HTTP_CODE=$(curl -sS -o "$RESP_FILE" -w "%{http_code}" -X POST "$API/releases/$RELEASE_ID/assets" \
              -H "Authorization: token $RELEASE_TOKEN" \
              -F "attachment=@$f;filename=$UPLOAD_NAME")
            if [ "$HTTP_CODE" -ge 200 ] && [ "$HTTP_CODE" -lt 300 ]; then
              echo "✓ Uploaded $UPLOAD_NAME"
            else
              echo "✗ Upload failed for $UPLOAD_NAME (HTTP $HTTP_CODE)"
              # Dump the first 2 KB of the error body; head is always present,
              # unlike python, which is not installed in ubuntu:22.04 by default.
              head -c 2000 "$RESP_FILE"; echo
              exit 1
            fi
          done
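
All four upload jobs resolve the release tag with `sort -V`. The `-V` matters: plain lexical sort compares character by character and mis-ranks multi-digit version components:

```shell
# Version sort ranks v0.1.9 below v0.1.10; lexical sort gets the
# opposite (wrong for versions) answer because '1' < '9'.
printf 'v0.1.10\nv0.1.9\n' | sort -V | tail -1   # v0.1.10
printf 'v0.1.10\nv0.1.9\n' | sort | tail -1      # v0.1.9
```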
66  .github/workflows/test.yml  vendored  Normal file
@ -0,0 +1,66 @@
name: Test

on:
  pull_request:

jobs:
  rust-fmt-check:
    runs-on: ubuntu-latest
    container:
      image: rust:1.88-slim
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - run: rustup component add rustfmt
      - run: cargo fmt --manifest-path src-tauri/Cargo.toml --check

  rust-clippy:
    runs-on: ubuntu-latest
    container:
      image: rust:1.88-slim
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - run: apt-get update -qq && apt-get install -y -qq libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev patchelf pkg-config perl
      - run: rustup component add clippy
      - run: cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings

  rust-tests:
    runs-on: ubuntu-latest
    container:
      image: rust:1.88-slim
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - run: apt-get update -qq && apt-get install -y -qq libwebkit2gtk-4.1-dev libssl-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev patchelf pkg-config perl
      - run: cargo test --manifest-path src-tauri/Cargo.toml

  frontend-typecheck:
    runs-on: ubuntu-latest
    container:
      image: node:22-alpine
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - run: npm ci --legacy-peer-deps
      - run: npx tsc --noEmit

  frontend-tests:
    runs-on: ubuntu-latest
    container:
      image: node:22-alpine
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - run: npm ci --legacy-peer-deps
      - run: npm run test:run

157  AGENTS.md  Normal file
@ -0,0 +1,157 @@
# AGENTS.md — Quick Start for OpenCode

## Commands

| Task | Command |
|------|---------|
| Run full dev (Tauri + Vite) | `cargo tauri dev` |
| Frontend only (port 1420) | `npm run dev` |
| Frontend production build | `npm run build` |
| Rust fmt check | `cargo fmt --manifest-path src-tauri/Cargo.toml --check` |
| Rust fmt fix | `cargo fmt --manifest-path src-tauri/Cargo.toml` |
| Rust clippy | `cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings` |
| Rust tests | `cargo test --manifest-path src-tauri/Cargo.toml -- --test-threads=1` |
| Rust single test module | `cargo test --manifest-path src-tauri/Cargo.toml -- --test-threads=1 pii::detector` |
| Rust single test | `cargo test --manifest-path src-tauri/Cargo.toml -- --test-threads=1 test_detect_ipv4` |
| Frontend test (single run) | `npm run test:run` |
| Frontend test (watch) | `npm run test` |
| Frontend coverage | `npm run test:coverage` |
| TypeScript type check | `npx tsc --noEmit` |
| Frontend lint | `npx eslint . --quiet` |

**Lint Policy**: **ALWAYS run `cargo fmt` and `cargo clippy` after any Rust code change**. Fix all issues before proceeding.

**Note**: The build runs `npm run build` before the Rust build (via `beforeBuildCommand` in `tauri.conf.json`). This ensures TS is type-checked before packaging.

**Requirement**: The Rust toolchain must be in PATH: `source ~/.cargo/env`

---

## Project Structure

| Path | Responsibility |
|------|----------------|
| `src-tauri/src/lib.rs` | Entry point: app builder, plugin registration, IPC handler registration |
| `src-tauri/src/state.rs` | `AppState` (DB, settings, integration_webviews) |
| `src-tauri/src/commands/` | Tauri IPC handlers (db, ai, analysis, docs, integrations, system) |
| `src-tauri/src/ai/provider.rs` | `Provider` trait + `create_provider()` factory |
| `src-tauri/src/pii/` | Detection engine (12 patterns) + redaction |
| `src-tauri/src/db/models.rs` | DB types: `Issue`, `IssueDetail` (nested), `LogFile`, `ResolutionStep`, `AiConversation` |
| `src-tauri/src/audit/log.rs` | `write_audit_event()` before every external send |
| `src/lib/tauriCommands.ts` | **Source of truth** for all Tauri IPC calls |
| `src/lib/domainPrompts.ts` | 15 domain system prompts (Linux, Windows, Network, K8s, DBs, etc.) |
| `src/stores/` | Zustand: `sessionStore` (ephemeral), `settingsStore` (persisted), `historyStore` |

---
## Key Patterns

### Rust Mutex Usage

Lock `Mutex` inside a block and release **before** `.await`. Holding a `MutexGuard` across await points fails to compile (not `Send`):

```rust
let state: State<'_, AppState> = app.state();
let db = state.db.clone();
// Lock and release before await
{ let conn = state.db.lock().unwrap(); /* use conn */ }
// Now safe to .await
db.query(...).await?;
```

### IssueDetail Nesting

`get_issue()` returns a **nested** struct — use `detail.issue.title`, not `detail.title`:

```rust
pub struct IssueDetail {
    pub issue: Issue,
    pub log_files: Vec<LogFile>,
    pub resolution_steps: Vec<ResolutionStep>,
    pub conversations: Vec<AiConversation>,
}
```

TypeScript mirrors this shape exactly in `tauriCommands.ts`.

### PII Before AI Send

`apply_redactions` **must** be called before sending logs to AI. Record the SHA-256 hash via `audit::log::write_audit_event()`. PII spans are non-overlapping (the longest span wins on overlap); the redactor iterates in reverse order to preserve offsets.
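
The reverse-order rule can be illustrated outside Rust. A toy sketch (not the project's redactor) showing why editing from the end of the string keeps earlier offsets valid:

```shell
# Toy span redaction: replacing spans from the end toward the start
# keeps earlier offsets correct, because each edit only shifts text
# *after* the edit point.
redact() {            # redact STRING START LEN  (1-based offsets)
  printf '%s' "$1" | awk -v s="$2" -v l="$3" \
    '{ print substr($0, 1, s - 1) "[REDACTED]" substr($0, s + l) }'
}

LINE="user alice from 10.0.0.5 failed login"
# Spans: "alice" at offset 6 (len 5), "10.0.0.5" at offset 17 (len 8).
# Redact the *later* span first so offset 6 is still valid afterwards.
LINE=$(redact "$LINE" 17 8)
LINE=$(redact "$LINE" 6 5)
echo "$LINE"   # user [REDACTED] from [REDACTED] failed login
```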
### State Persistence

- `sessionStore`: ephemeral triage session (issue, messages, PII spans, why-level 0–5, loading) — **not persisted**
- `settingsStore`: persisted to `localStorage` as `"tftsr-settings"`

---

## CI/CD (Gitea Actions)

| Workflow | Trigger | Jobs |
|----------|---------|------|
| `.gitea/workflows/test.yml` | Every push/PR | `rustfmt` → `clippy` → `cargo test` (64 tests) → `tsc --noEmit` → `vitest run` (13 tests) |
| `.gitea/workflows/auto-tag.yml` | Push to master | Auto-tag, build linux/amd64 + windows/amd64 + linux/arm64 + macOS, upload assets to Gitea release |

**Artifacts**: `src-tauri/target/{target}/release/bundle/`

**Environments**:
- Test CI images at `172.0.0.29:3000` (pull `trcaa-*:rust1.88-node22`)
- Gitea instance: `http://172.0.0.29:3000`
- Wiki: synced from `docs/wiki/*.md` → `https://gogs.tftsr.com/sarman/tftsr-devops_investigation/wiki`

---
## Environment Variables

| Variable | Default | Purpose |
|----------|---------|---------|
| `TFTSR_DATA_DIR` | Platform data dir | Override database location |
| `TFTSR_DB_KEY` | Auto-generated | SQLCipher encryption key override |
| `TFTSR_ENCRYPTION_KEY` | Auto-generated | Credential encryption key override |
| `RUST_LOG` | `info` | Tracing level (`debug`, `info`, `warn`, `error`) |

**Database path**:
- Linux: `~/.local/share/trcaa/trcaa.db`
- macOS: `~/Library/Application Support/trcaa/trcaa.db`
- Windows: `%APPDATA%\trcaa\trcaa.db`

---
## Architecture Highlights
|
||||
|
||||
### Rust Backend
|
||||
- **Entry point**: `src-tauri/src/lib.rs::run()` → init tracing → init DB → register plugins → `generate_handler![]`
|
||||
- **Database**: `rusqlite` + `bundled-sqlcipher-vendored-openssl` (AES-256). `cfg!(debug_assertions)` → plain SQLite; release → SQLCipher
|
||||
- **AI providers**: `Provider` trait with factory dispatch on `config.name`. Adding a provider: implement `Provider` trait + add match arm
|
||||
- **Integration clients**: Confluence, ServiceNow, Azure DevOps stubs (v0.2). OAuth2 via WebView + callback server (warp, port 8765)
|
||||
|
||||
### Frontend (React + Vite)
|
||||
- **Dev server**: port **1420** (hardcoded)
|
||||
- **IPC**: All `invoke()` calls in `src/lib/tauriCommands.ts` — typed wrappers for every backend command
|
||||
- **Domain prompts**: 15 expert prompts injected as first message in every triage conversation (Linux, Windows, Network, K8s, DBs, Virtualization, Hardware, Observability, Telephony, Security, Public Safety, Application, Automation, HPE, Dell, Identity)

### Security

- **Database encryption**: AES-256 (SQLCipher in release builds)
- **Credential encryption**: AES-256-GCM, keys stored in `TFTSR_ENCRYPTION_KEY` or an auto-generated `.enckey` file (mode 0600)
- **Audit trail**: hash-chained entries (`prev_hash` + `entry_hash`) for tamper evidence
- **PII protection**: 12-pattern detector → user approval gate → hash-chained audit entry
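
The `prev_hash`/`entry_hash` chaining can be illustrated with a small sketch. The real audit trail uses SHA-256; std's `DefaultHasher` stands in here only so the example needs no external crates, and it is not cryptographically secure.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Each entry's hash covers the previous entry's hash plus its own
// payload, so editing any earlier entry changes every later hash.
fn chain_hash(prev_hash: u64, payload: &str) -> u64 {
    let mut h = DefaultHasher::new();
    prev_hash.hash(&mut h);
    payload.hash(&mut h);
    h.finish()
}

fn main() {
    let entries = ["issue created", "PII redacted", "sent to AI provider"];
    let mut prev = 0u64; // genesis value
    let mut hashes = Vec::new();
    for e in &entries {
        prev = chain_hash(prev, e);
        hashes.push(prev);
    }
    // Tampering with the first entry invalidates the second entry's hash:
    let tampered = chain_hash(chain_hash(0, "issue CHANGED"), "PII redacted");
    assert_ne!(tampered, hashes[1]);
    println!("chain verified over {} entries", hashes.len());
}
```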

---

## Testing

| Layer | Command | Notes |
|-------|---------|-------|
| Rust | `cargo test --manifest-path src-tauri/Cargo.toml` | 64 tests, runs in `rust:1.88-slim` container |
| TypeScript | `npm run test:run` | Vitest, 13 tests |
| Type check | `npx tsc --noEmit` | `skipLibCheck: true` |
| E2E | `TAURI_BINARY_PATH=./src-tauri/target/release/tftsr npm run test:e2e` | WebdriverIO, requires compiled binary |

**Frontend coverage**: `npm run test:coverage` → coverage report for `tests/unit/`

---

## Critical Gotchas

1. **Mutex across await**: never hold a `lock().unwrap()` guard across an `.await` — release the guard first
2. **IssueDetail nesting**: `detail.issue.title`, never `detail.title`
3. **PII before AI**: always redact and record the hash before any external send
4. **Port 1420**: the Vite dev server is hard-coded to 1420, not 3000
5. **Build order**: Rust fmt → clippy → test → TS check → JS test
6. **CI images**: use the `172.0.0.29:3000` registry for pre-baked builder images
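
Gotcha #1 above can be shown in std-only form. In the real async code the slow call is an `.await`; here a `thread::sleep` stands in for it, and the discipline is the same: copy what you need out of the guard and drop the lock before the slow operation.

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;

fn main() {
    let state = Arc::new(Mutex::new(vec!["open issue".to_string()]));

    let snapshot = {
        let guard = state.lock().unwrap();
        guard.clone() // take a copy of the data...
    }; // ...and release the lock here, BEFORE the slow call

    // Stand-in for an AI provider request (an `.await` in the real app).
    thread::sleep(Duration::from_millis(10));

    // Other threads can lock `state` during the sleep above.
    println!("processed {} issues without holding the lock", snapshot.len());
}
```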

467 CHANGELOG.md Normal file

@@ -0,0 +1,467 @@

# Changelog

All notable changes to TFTSR are documented here.
Commit types shown: feat, fix, perf, docs, refactor.
CI, chore, and build changes are excluded.

## [Unreleased]

### Bug Fixes

- Harden timeline event input validation and atomic writes

### Documentation

- Update wiki for timeline events and incident response methodology

### Features

- Add timeline_events table, model, and CRUD commands
- Populate RCA and postmortem docs with real timeline data
- Wire incident response methodology into AI and record triage events

## [0.2.65] — 2026-04-15

### Bug Fixes

- Add --locked to cargo commands and improve version update script
- Remove invalid --locked flag from cargo commands and fix format string
- **integrations**: Security and correctness improvements
- Correct WIQL syntax and escape_wiql implementation

### Features

- Implement dynamic versioning from Git tags
- **integrations**: Implement query expansion for semantic search

### Security

- Fix query expansion issues from PR review
- Address all issues from automated PR review

## [0.2.63] — 2026-04-13

### Bug Fixes

- Add Windows nsis target and update CHANGELOG to v0.2.61

## [0.2.61] — 2026-04-13

### Bug Fixes

- Remove AppImage from upload artifact patterns

## [0.2.59] — 2026-04-13

### Bug Fixes

- Remove AppImage bundling to fix linux-amd64 build

## [0.2.57] — 2026-04-13

### Bug Fixes

- Add fuse dependency for AppImage support

### Refactoring

- Remove custom linuxdeploy install; CI uses the tauri-downloaded version
- Revert to original Dockerfile without manual linuxdeploy installation

## [0.2.56] — 2026-04-13

### Bug Fixes

- Add missing ai_providers columns and fix linux-amd64 build
- Address AI review findings
- Address critical AI review issues

## [0.2.55] — 2026-04-13

### Bug Fixes

- **ci**: Use Gitea file API to push CHANGELOG.md — eliminates non-fast-forward rejection
- **ci**: Harden CHANGELOG.md API push step per review

## [0.2.54] — 2026-04-13

### Bug Fixes

- **ci**: Correct git-cliff archive path in tar extraction

## [0.2.53] — 2026-04-13

### Features

- **ci**: Add automated changelog generation via git-cliff

## [0.2.52] — 2026-04-13

### Bug Fixes

- **ci**: Add APPIMAGE_EXTRACT_AND_RUN to build-linux-amd64

## [0.2.51] — 2026-04-13

### Bug Fixes

- **ci**: Address AI review — rustup idempotency and cargo --locked
- **ci**: Replace docker:24-cli with alpine + docker-cli in build-images
- **docker**: Add ca-certificates to arm64 base image step 1
- **ci**: Resolve test.yml failures — Cargo.lock, updated test assertions
- **ci**: Address second AI review — || true, ca-certs, cache@v4, key suffixes

### Documentation

- **docker**: Expand rebuild trigger comments to include OpenSSL and Tauri CLI

### Performance

- **ci**: Use pre-baked images and add cargo/npm caching

## [0.2.50] — 2026-04-12

### Bug Fixes

- Rename GITEA_TOKEN to TF_TOKEN to comply with naming restrictions
- Remove actions/checkout to avoid Node.js dependency
- Use ubuntu container with git installed
- Use actions/checkout with token auth and self-hosted runner
- Use IP addresses for internal services
- Simplify workflow syntax
- Add debugging output for Ollama response
- Correct Ollama URL, API endpoint, and JSON construction in pr-review workflow
- Add diagnostics to identify empty Ollama response root cause
- Use bash shell and remove bash-only substring expansion in pr-review
- Restore migration 014, bump version to 0.2.50, harden pr-review workflow
- Harden pr-review workflow and sync versions to 0.2.50
- Configure container DNS to resolve ollama-ui.tftsr.com
- Harden pr-review workflow — URLs, DNS, correctness and reliability
- Resolve AI review false positives and address high/medium issues
- Replace github.server_url with hardcoded gogs.tftsr.com for container access
- Revert to two-dot diff — three-dot requires merge base unavailable in shallow clone
- Harden pr-review workflow — secret redaction, log safety, auth header

### Features

- Add automated PR review workflow with Ollama AI

## [0.2.49] — 2026-04-10

### Bug Fixes

- Add missing ai_providers migration (014)

## [0.2.48] — 2026-04-10

### Bug Fixes

- Lint fixes and formatting cleanup

### Features

- Support GenAI datastore file uploads and fix paste image upload

## [0.2.47] — 2026-04-09

### Bug Fixes

- Use 'provider' argument name to match Rust command signature

## [0.2.46] — 2026-04-09

### Bug Fixes

- Add @types/testing-library__react for TypeScript compilation

### Update

- node_modules from npm install

## [0.2.45] — 2026-04-09

### Bug Fixes

- Force single test thread for Rust tests to eliminate race conditions

## [0.2.43] — 2026-04-09

### Bug Fixes

- Fix encryption test race condition with parallel tests
- OpenWebUI provider connection and missing command registrations

### Features

- Add image attachment support with PII detection

## [0.2.42] — 2026-04-07

### Documentation

- Add AGENTS.md and SECURITY_AUDIT.md

## [0.2.41] — 2026-04-07

### Bug Fixes

- **db,auth**: Auto-generate encryption keys for release builds
- **lint**: Use inline format args in auth.rs
- **lint**: Resolve all clippy warnings for CI compliance
- **fmt**: Apply rustfmt formatting to webview_fetch.rs
- **types**: Replace normalizeApiFormat() calls with direct value

### Documentation

- **architecture**: Add C4 diagrams, ADRs, and architecture overview

### Features

- **ai**: Add tool-calling and integration search as AI data source

## [0.2.40] — 2026-04-06

### Bug Fixes

- **ci**: Remove explicit docker.sock mount — act_runner mounts it automatically

## [0.2.36] — 2026-04-06

### Features

- **ci**: Add persistent pre-baked Docker builder images

## [0.2.35] — 2026-04-06

### Bug Fixes

- **ci**: Skip Ollama download on macOS build — runner has no access to GitHub binary assets
- **ci**: Remove all Ollama bundle download steps — use UI download button instead

### Refactoring

- **ollama**: Remove download/install buttons — show plain install instructions only

## [0.2.34] — 2026-04-06

### Bug Fixes

- **security**: Add path canonicalization and actionable permission error in install_ollama_from_bundle

### Features

- **ui**: Fix model dropdown, auth prefill, PII persistence, theme toggle, and Ollama bundle

## [0.2.33] — 2026-04-05

### Features

- **rebrand**: Rename binary to trcaa and auto-generate DB key

## [0.2.32] — 2026-04-05

### Bug Fixes

- **ci**: Restrict arm64 bundles to deb,rpm — skip AppImage

## [0.2.31] — 2026-04-05

### Bug Fixes

- **ci**: Set APPIMAGE_EXTRACT_AND_RUN=1 for arm64 AppImage bundling

## [0.2.30] — 2026-04-05

### Bug Fixes

- **ci**: Add make to arm64 host tools for OpenSSL vendored build

## [0.2.28] — 2026-04-05

### Bug Fixes

- **ci**: Use POSIX dot instead of source in arm64 build step

## [0.2.27] — 2026-04-05

### Bug Fixes

- **ci**: Remove GITHUB_PATH append that was breaking arm64 install step

## [0.2.26] — 2026-04-05

### Bug Fixes

- **ci**: Switch build-linux-arm64 to Ubuntu 22.04 with ports mirror

### Documentation

- Update CI pipeline wiki and add ticket summary for arm64 fix

## [0.2.25] — 2026-04-05

### Bug Fixes

- **ci**: Rebuild apt sources with per-arch entries before arm64 cross-compile install
- **ci**: Add workflow_dispatch and concurrency guard to auto-tag
- **ci**: Replace heredoc with printf in arm64 install step

## [0.2.24] — 2026-04-05

### Bug Fixes

- **ci**: Fix arm64 cross-compile, drop cargo install tauri-cli, move wiki-sync

## [0.2.23] — 2026-04-05

### Bug Fixes

- **ci**: Unblock release jobs and namespace linux artifacts by arch
- **security**: Harden secret handling and audit integrity
- **pii**: Remove lookahead from hostname regex, fix fmt in analysis test
- **security**: Enforce PII redaction before AI log transmission
- **ci**: Unblock release jobs and namespace linux artifacts by arch

## [0.2.22] — 2026-04-05

### Bug Fixes

- **ci**: Run linux arm release natively and enforce arm artifacts

## [0.2.21] — 2026-04-05

### Bug Fixes

- **ci**: Force explicit linux arm64 target for release artifacts

## [0.2.20] — 2026-04-05

### Refactoring

- **ci**: Remove standalone release workflow

## [0.2.19] — 2026-04-05

### Bug Fixes

- **ci**: Guarantee release jobs run after auto-tag
- **ci**: Use stable auto-tag job outputs for release fanout
- **ci**: Run post-tag release builds without job-output gating
- **ci**: Repair auto-tag workflow yaml so jobs trigger

## [0.2.18] — 2026-04-05

### Bug Fixes

- **ci**: Trigger release workflow from auto-tag pushes

## [0.2.17] — 2026-04-05

### Bug Fixes

- **ci**: Harden release asset uploads for reruns

## [0.2.16] — 2026-04-05

### Bug Fixes

- **ci**: Make release artifacts reliable across platforms

## [0.2.14] — 2026-04-04

### Bug Fixes

- Resolve macOS bundle path after app rename

## [0.2.13] — 2026-04-04

### Bug Fixes

- Resolve clippy uninlined_format_args in integrations and related modules
- Resolve clippy format-args failures and OpenSSL vendoring issue

### Features

- Add custom_rest provider mode and rebrand application name

## [0.2.12] — 2026-04-04

### Bug Fixes

- ARM64 build uses native target instead of cross-compile

## [0.2.11] — 2026-04-04

### Bug Fixes

- Persist integration settings and implement persistent browser windows

## [0.2.10] — 2026-04-03

### Features

- Complete webview cookie extraction implementation

## [0.2.9] — 2026-04-03

### Features

- Add multi-mode authentication for integrations (v0.2.10)

## [0.2.8] — 2026-04-03

### Features

- Add temperature and max_tokens support for Custom REST providers (v0.2.9)

## [0.2.7] — 2026-04-03

### Bug Fixes

- Use Wiki secret for authenticated wiki sync (v0.2.8)

### Documentation

- Update wiki for v0.2.6 - integrations and Custom REST provider

### Features

- Add automatic wiki sync to CI workflow (v0.2.7)

## [0.2.6] — 2026-04-03

### Bug Fixes

- Add user_id support and OAuth shell permission (v0.2.6)

## [0.2.5] — 2026-04-03

### Documentation

- Add Custom REST provider documentation

### Features

- Implement Confluence, ServiceNow, and Azure DevOps REST API clients
- Add Custom REST provider support

## [0.2.4] — 2026-04-03

### Features

- Implement OAuth2 token exchange and AES-256-GCM encryption
- Add OAuth2 Tauri commands for integration authentication
- Implement OAuth2 callback server with automatic token exchange
- Add OAuth2 frontend UI and complete integration flow

## [0.2.3] — 2026-04-03

### Bug Fixes

- Improve Cancel button contrast in AI disclaimer modal

### Features

- Add database schema for integration credentials and config

## [0.2.1] — 2026-04-03

### Bug Fixes

- Implement native DOCX export without pandoc dependency

### Features

- Add AI disclaimer modal before creating new issues

## [0.1.0] — 2026-04-03

### Bug Fixes

- Resolve all clippy lints (uninlined format args, range::contains, push_str single chars)
- Inline format args for Rust 1.88 clippy compatibility
- Retain GPU-VRAM-eligible models in recommender even when RAM is low
- Use alpine/git with explicit checkout for tag-based release builds
- Set CI=true for cargo tauri build — Woodpecker sets CI=woodpecker which Tauri CLI rejects
- Arm64 cross-compilation — add multiarch pkg-config sysroot setup
- Remove arm64 from release pipeline — webkit2gtk multiarch conflict on x86_64 host
- Write artifacts to workspace (shared between steps), not /artifacts/
- Upload step needs gogs_default network to reach Gogs API (host firewall blocks default bridge)
- Use bundled-sqlcipher-vendored-openssl for portable Windows cross-compilation
- Add make to windows build step (required by vendored OpenSSL)
- Replace empty icon placeholder files with real app icons
- Suppress MinGW auto-export to resolve Windows DLL ordinal overflow
- Use when: platform: for arm64 step routing (Woodpecker 0.15.4 compat)
- Remove unused tauri-plugin-cli causing startup crash
- Use $GITHUB_REF_NAME env var instead of ${{ github.ref_name }} expression
- Remove unused tauri-plugin-updater + SQLCipher 16KB page size
- Prevent WebKit/GTK system theme from overriding input text colors on Linux
- Set SQLCipher cipher_page_size BEFORE first database access
- Button text visibility, toggle contrast, create_issue IPC, ad-hoc codesign
- Dropdown text invisible on macOS + correct codesign order for DMG
- Add explicit text-foreground to SelectTrigger, SelectValue, and SelectItem
- Ollama detection, install guide UI, and AI Providers auto-fill
- Provider test FK error, model pull white screen, RECOMMENDED badge
- Provider routing uses provider_type, Active badge, fmt
- Navigate to /logs after issue creation, fix dashboard category display
- Dashboard shows — while loading, exposes errors, adds refresh button
- ListIssuesCmd was sending {query} but Rust expects {filter} — caused dashboard to always show 0 open issues
- Arm64 linux cross-compilation — add multiarch and pkg-config env vars
- Close from chat works before issue loads; save user reason as resolution step; dynamic version
- DomainPrompts closing brace too early; arm64 use native platform image
- UI contrast issues and ARM64 build failure
- Remove Woodpecker CI and fix Gitea Actions ARM64 build
- UI visibility issues, export errors, filtering, and audit log enhancement
- ARM64 build native compilation instead of cross-compilation
- Improve release artifact upload error handling
- Install jq in Linux/Windows build containers
- Improve download button visibility and add DOCX export

### Documentation

- Update PLAN.md with accurate implementation status
- Add CLAUDE.md with development guidance
- Add wiki source files and CI auto-sync pipeline
- Update PLAN.md - Phase 11 complete, redact token references
- Update README and wiki for v0.1.0-alpha release
- Remove broken arm64 CI step, document Woodpecker 0.15.4 limitation
- Update README and wiki for Gitea Actions migration
- Update README, wiki, and UI version to v0.1.1
- Add LiteLLM + AWS Bedrock integration guide

### Features

- Initial implementation of TFTSR IT Triage & RCA application
- Add Windows amd64 cross-compile to release pipeline; add arm64 QEMU agent
- Add native linux/arm64 release build step
- Add macOS arm64 act_runner and release build job
- Auto-increment patch tag on every merge to master
- Inline file/screenshot attachment in triage chat
- Close issues, restore history, auto-save resolution steps
- Expand domains to 13 — add Telephony, Security/Vault, Public Safety, Application, Automation/CI-CD
- Add HPE, Dell, Identity domains + expand k8s/security/observability/VESTA NXT

### Security

- Rotate exposed token, redact from PLAN.md, add secret patterns to .gitignore

@@ -1,175 +0,0 @@

# Integration Authentication Guide

## Overview

The TRCAA application supports three integration authentication methods, with automatic fallback between them:

1. **API Tokens** (Manual) - Recommended ✅
2. **OAuth 2.0** - Fully automated (when configured)
3. **Browser Cookies** - Partially working ⚠️

## Authentication Priority

When you ask an AI question, the system attempts authentication in this order:

```
1. Extract cookies from persistent browser window
   ↓ (if fails)
2. Use stored API token from database
   ↓ (if fails)
3. Skip that integration and log guidance
```
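
The fallback order above can be sketched in Rust. All names here (`Integration`, `Auth`, `pick_auth`) are illustrative assumptions, not the app's actual types, but the ordering matches the list: cookies first, stored token second, then skip with a logged warning.

```rust
struct Integration {
    name: String,
    browser_cookies: Option<String>,
    stored_token: Option<String>,
}

enum Auth {
    Cookies(String),
    Token(String),
}

fn pick_auth(i: &Integration) -> Option<Auth> {
    // 1. Prefer cookies extracted from the persistent browser window.
    if let Some(c) = &i.browser_cookies {
        return Some(Auth::Cookies(c.clone()));
    }
    // 2. Fall back to a stored API token from the database.
    if let Some(t) = &i.stored_token {
        return Some(Auth::Token(t.clone()));
    }
    // 3. Nothing available: skip this integration and log guidance.
    eprintln!("WARN Unable to search {} - no authentication available", i.name);
    None
}

fn main() {
    let confluence = Integration {
        name: "confluence".into(),
        browser_cookies: None,
        stored_token: Some("pat-123".into()),
    };
    match pick_auth(&confluence) {
        Some(Auth::Token(_)) => println!("using stored API token"),
        Some(Auth::Cookies(_)) => println!("using browser cookies"),
        None => println!("integration skipped"),
    }
}
```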

## HttpOnly Cookie Limitation

**Problem**: Confluence, ServiceNow, and Azure DevOps use **HttpOnly cookies** for security. These cookies:

- ✅ Exist in the persistent browser window
- ✅ Are sent automatically by the browser
- ❌ **Cannot be extracted by JavaScript** (a security feature)
- ❌ **Cannot be used in separate HTTP requests**

**Impact**: Cookie extraction via the persistent browser window **fails** for HttpOnly cookies, even though you're logged in.

## Recommended Solution: Use API Tokens

### Confluence Personal Access Token

1. Log into Confluence
2. Go to **Profile → Settings → Personal Access Tokens**
3. Click **Create token**
4. Copy the generated token
5. In the TRCAA app:
   - Go to **Settings → Integrations**
   - Find your Confluence integration
   - Click **"Save Manual Token"**
   - Paste the token
   - Token Type: `Bearer`

### ServiceNow API Key

1. Log into ServiceNow
2. Go to **System Security → Application Registry**
3. Click **New → OAuth API endpoint for external clients**
4. Configure and generate the API key
5. In the TRCAA app:
   - Go to **Settings → Integrations**
   - Find your ServiceNow integration
   - Click **"Save Manual Token"**
   - Paste the API key

### Azure DevOps Personal Access Token (PAT)

1. Log into Azure DevOps
2. Click **User Settings (top right) → Personal Access Tokens**
3. Click **New Token**
4. Scopes: select **Read** for:
   - Code (for wiki)
   - Work Items (for work item search)
5. Click **Create** and copy the token
6. In the TRCAA app:
   - Go to **Settings → Integrations**
   - Find your Azure DevOps integration
   - Click **"Save Manual Token"**
   - Paste the token
   - Token Type: `Bearer`

## Verification

After adding API tokens, test the integration:

1. Open or create an issue
2. Go to the Triage page
3. Ask a question like: "How do I upgrade Vesta NXT to 1.0.12"
4. Check the logs for:
   ```
   INFO Using stored cookies for confluence (count: 1)
   INFO Found X integration sources for AI context
   ```

If successful, the AI response should include:

- Content from internal documentation
- Source citations with URLs
- Links to Confluence/ServiceNow/Azure DevOps pages

## Troubleshooting

### No search results found

**Symptom**: AI gives generic answers instead of internal documentation

**Check logs for**:
```
WARN Unable to search confluence - no authentication available
```

**Solution**: Add an API token (see above)

### Cookie extraction timeout

**Symptom**: Logs show:
```
WARN Failed to extract cookies from confluence: Timeout extracting cookies
```

**Why**: HttpOnly cookies cannot be extracted via JavaScript

**Solution**: Use API tokens instead

### Integration not configured

**Symptom**: No integration searches at all

**Check**: Settings → Integrations — ensure the integration is added with:

- Base URL configured
- Either the browser window open OR an API token saved

## Future Enhancements

### Native Cookie Extraction (Planned)

We plan to implement platform-specific native cookie extraction that can access HttpOnly cookies directly from the webview's cookie store:

- **macOS**: Use WKWebView's HTTPCookieStore (requires `cocoa`/`objc` crates)
- **Windows**: Use WebView2's cookie manager (requires `windows` crate)
- **Linux**: Use WebKitGTK's cookie manager (requires `webkit2gtk` binding)

This will make the persistent browser approach fully automatic, even with HttpOnly cookies.

### Webview-Based Search (Experimental)

Another approach is to make search requests FROM within the authenticated webview using JavaScript `fetch`, which automatically includes HttpOnly cookies. This requires reliable IPC communication between JavaScript and Rust.

## Security Notes

### Token Storage

API tokens are:

- ✅ **Encrypted** using AES-256-GCM before storage
- ✅ **Hashed** (SHA-256) for audit logging
- ✅ Stored in the encrypted SQLite database
- ✅ Never exposed to frontend JavaScript

### Cookie Storage (when working)

Extracted cookies are:

- ✅ Encrypted before database storage
- ✅ Only retrieved when making API requests
- ✅ Transmitted only over HTTPS

### Audit Trail

All integration authentication attempts are logged:

- Cookie extraction attempts
- Token usage
- Search requests
- Authentication failures

Check **Settings → Security → Audit Log** to review activity.

## Summary

**For reliable integration search NOW**: Use API tokens (Option 1)

**For automatic integration search LATER**: Native cookie extraction will be implemented in a future update

**Current workaround**: API tokens provide full functionality without browser dependency

27 README.md

@@ -4,8 +4,7 @@ A structured, AI-backed desktop tool for IT incident triage, 5-Whys root cause a

Built with **Tauri 2** (Rust + WebView), **React 18**, **TypeScript**, and **SQLCipher AES-256** encrypted storage.

[](./LICENSE)

**CI status:**  — all checks green (rustfmt · clippy · 64 Rust tests · tsc · vitest)

---

@@ -19,6 +18,7 @@ Built with **Tauri 2** (Rust + WebView), **React 18**, **TypeScript**, and **SQL

- **Ollama Management** — Hardware detection, model recommendations, pull/delete models in-app
- **Audit Trail** — Every external data send logged with SHA-256 hash
- **Domain System Prompts** — Pre-built expert context for 8 IT domains (Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability)
- **Image Attachments** — Upload and manage image files with PII detection and mandatory user approval
- **Integrations** *(v0.2, coming soon)* — Confluence, ServiceNow, Azure DevOps

---

@@ -131,7 +131,6 @@ Launch the app and go to **Settings → AI Providers** to add a provider:

| Ollama (local) | `http://localhost:11434` | No key needed — fully offline |
| Azure OpenAI | `https://<resource>.openai.azure.com/openai/deployments/<deployment>` | Requires API key |
| **AWS Bedrock (via LiteLLM)** | `http://localhost:8000/v1` | See [LiteLLM + AWS Bedrock](#litellm--aws-bedrock-setup) below |
| **Custom REST Gateway** | Your gateway URL | See [Custom REST format](docs/wiki/AI-Providers.md) |

For offline use, install [Ollama](https://ollama.com) and pull a model:

```bash

@@ -289,9 +288,9 @@ All data is stored locally in a SQLCipher-encrypted database at:

| OS | Path |
|---|---|
| Linux | `~/.local/share/trcaa/trcaa.db` |
| macOS | `~/Library/Application Support/trcaa/trcaa.db` |
| Windows | `%APPDATA%\trcaa\trcaa.db` |
| Linux | `~/.local/share/tftsr/tftsr.db` |
| macOS | `~/Library/Application Support/tftsr/tftsr.db` |
| Windows | `%APPDATA%\tftsr\tftsr.db` |

Override with the `TFTSR_DATA_DIR` environment variable.

@@ -302,8 +301,8 @@ Override with the `TFTSR_DATA_DIR` environment variable.

| Variable | Default | Purpose |
|---|---|---|
| `TFTSR_DATA_DIR` | Platform data dir | Override database location |
| `TFTSR_DB_KEY` | _(auto-generated)_ | Database encryption key override — auto-generated at first launch if unset |
| `TFTSR_ENCRYPTION_KEY` | _(auto-generated)_ | Credential encryption key override — auto-generated at first launch if unset |
| `TFTSR_DB_KEY` | _(none)_ | Database encryption key (required in release builds) |
| `TFTSR_ENCRYPTION_KEY` | _(none)_ | Credential encryption key (required in release builds) |
| `RUST_LOG` | `info` | Tracing log level (`debug`, `info`, `warn`, `error`) |

---

@@ -327,16 +326,6 @@ Override with the `TFTSR_DATA_DIR` environment variable.

---

## Support

If this tool has been useful to you, consider buying me a coffee!

[](https://buymeacoffee.com/tftsr)

---

## License

MIT © 2025 [Shaun Arman](https://github.com/sarman)

See [LICENSE](./LICENSE) for the full text. You are free to use, modify, and distribute this software — personal, commercial, or enterprise — as long as the original copyright notice is retained.
Private — internal tooling. All rights reserved.

335 SECURITY_AUDIT.md Normal file

@@ -0,0 +1,335 @@

# Security Audit Report

**Application**: Troubleshooting and RCA Assistant (TRCAA)
**Audit Date**: 2026-04-06
**Scope**: All git-tracked source files (159 files)
**Context**: Pre-open-source release under MIT license

---

## Executive Summary

The codebase is generally well-structured, with several positive security practices already in place: parameterized SQL queries, AES-256-GCM credential encryption, PKCE for OAuth flows, PII detection and redaction before AI transmission, hash-chained audit logs, and a restrictive CSP. However, the audit identified **3 CRITICAL**, **5 HIGH**, **5 MEDIUM**, and **5 LOW** findings that must be addressed before public release.

---

## CRITICAL Findings

### C1. Corporate-Internal Documents Shipped in Repository

**Files**:

- `GenAI API User Guide.md` (entire file)
- `HANDOFF-MSI-GENAI.md` (entire file)

**Issue**: These files contain proprietary Motorola Solutions / MSI internal documentation. `GenAI API User Guide.md` is authored by named MSI employees (Dipjyoti Bisharad, Jahnavi Alike, Sunil Vurandur, Anjali Kamath, Vibin Jacob, Girish Manivel) and documents internal API contracts at `genai-service.stage.commandcentral.com` and `genai-service.commandcentral.com`. `HANDOFF-MSI-GENAI.md` explicitly references "MSI GenAI API" integration details, including internal endpoint URLs, header formats, and payload contracts.

Publishing these files under the MIT license likely violates corporate IP agreements and exposes internal infrastructure details.

**Recommended Fix**: Remove both files from the repository entirely and scrub them from git history using `git filter-repo` before making the repo public.

---

### C2. Internal Infrastructure URLs Hardcoded in CSP and Source

**File**: `src-tauri/tauri.conf.json`, line 13
**Also**: `src-tauri/src/ai/openai.rs`, line 219

**Issue**: The CSP `connect-src` directive includes corporate-internal endpoints:

```
https://genai-service.stage.commandcentral.com
https://genai-service.commandcentral.com
```

Additionally, `openai.rs` line 219 sends `X-msi-genai-client: troubleshooting-rca-assistant` as a hardcoded header in the custom REST path, tying the application to an internal MSI service.

These expose internal service infrastructure to anyone reading the source and indicate the app was designed to interact with corporate systems.

**Recommended Fix**:

- Remove the two `commandcentral.com` entries from the CSP.
- Remove the `X-msi-genai-client` header, or make it configurable rather than hardcoded.
- Audit the CSP to ensure only generic/public endpoints remain (OpenAI, Anthropic, Mistral, Google, Ollama, Atlassian, and Microsoft are fine).

---
### C3. Private Gogs Server IP Exposed in All CI Workflows
|
||||
|
||||
**Files**:
|
||||
- `.gitea/workflows/test.yml` (lines 17, 44, 72, 99, 126)
|
||||
- `.gitea/workflows/auto-tag.yml` (lines 31, 52, 79, 95, 97, 141, 162, 227, 252, 313, 338, 401, 464)
|
||||
- `.gitea/workflows/build-images.yml` (lines 4, 10, 11, 16-18, 33, 46, 69, 92)
|
||||
|
||||
**Issue**: All CI workflow files reference `172.0.0.29:3000` (a private Gogs instance) and the `sarman` username. Note that this address only resembles private address space: RFC1918 reserves `172.16.0.0/12`, so `172.0.0.29` is actually publicly routable. Either way, it reveals internal infrastructure topology and the developer's username across dozens of lines. The `build-images.yml` also exposes `REGISTRY_USER: sarman` and container registry details.

**Recommended Fix**: Before open-sourcing, replace all workflow files with GitHub Actions equivalents, or at minimum replace the hardcoded private IP and username with parameterized variables. If the project is moving to GitHub, the `.gitea/` directory can be removed entirely.

---

## HIGH Findings

### H1. Hardcoded Development Encryption Key in Auth Module

**File**: `src-tauri/src/integrations/auth.rs`, line 179

```rust
return Ok("dev-key-change-me-in-production-32b".to_string());
```

**Issue**: In debug builds, the credential encryption key is a well-known hardcoded string. Anyone reading the source can decrypt any credentials stored by a debug build. Since this is about to be open source, attackers know the exact key to use against any debug-mode installation.

**Also at**: `src-tauri/src/db/connection.rs`, line 39: `"dev-key-change-in-prod"`

While this is gated behind `cfg!(debug_assertions)`, open-sourcing the code means the development key is permanently public knowledge. If any user runs a debug build, or if the release profile check is ever misconfigured, all stored credentials are trivially decryptable.

**Recommended Fix**:
- Remove the hardcoded dev key entirely.
- In debug mode, auto-generate and persist a random key the same way the release path does (lines 44-57 of `connection.rs` already implement this pattern).
- Add a `SECURITY.md` documenting that credentials are encrypted at rest and describing the key management approach.

---

### H2. Encryption Key Derivation Uses Raw SHA-256 Instead of a KDF

**File**: `src-tauri/src/integrations/auth.rs`, lines 185-191

```rust
fn derive_aes_key() -> Result<[u8; 32], String> {
    let key_material = get_encryption_key_material()?;
    let digest = Sha256::digest(key_material.as_bytes());
    ...
}
```

**Issue**: The AES-256-GCM key is derived from the raw material by a single SHA-256 hash, with no salt and no iteration count. If the key material has low entropy (as the dev key does), the derived key is trivially brute-forceable. In contrast, the database encryption properly uses PBKDF2-HMAC-SHA512 with 256,000 iterations (line 69 of `connection.rs`).

**Recommended Fix**: Use a proper KDF (PBKDF2, Argon2, or HKDF) with a persisted random salt and sufficient iteration count for deriving the AES key. The `db/connection.rs` module already demonstrates the correct approach.

---

### H3. Release Build Fails Closed if TFTSR_ENCRYPTION_KEY is Unset

**File**: `src-tauri/src/integrations/auth.rs`, line 182

```rust
Err("TFTSR_ENCRYPTION_KEY must be set in release builds".to_string())
```

**Issue**: In release mode, if the `TFTSR_ENCRYPTION_KEY` environment variable is not set, any attempt to store or retrieve credentials fails with an error. Unlike the database key management (which auto-generates and persists a key), credential encryption requires manual environment variable configuration. For a desktop app distributed to end users this is unworkable UX: users will never set this variable, so credential storage is broken out of the box in release builds.

**Recommended Fix**: Mirror the database key management pattern: auto-generate a random key on first use, persist it to a file in the app data directory with 0600 permissions (as already done for `.dbkey`), and read it back on subsequent launches.
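A minimal Unix-only sketch of that pattern, assuming a hex-encoded key file alongside the existing `.dbkey` (the path handling and the use of `/dev/urandom` are illustrative choices, not the project's actual code):

```rust
use std::fs::{self, OpenOptions};
use std::io::{Read, Write};
use std::os::unix::fs::OpenOptionsExt;
use std::path::Path;

/// Load the credential-encryption key, generating and persisting one on
/// first use, mirroring the `.dbkey` pattern described above.
fn load_or_create_key(path: &Path) -> std::io::Result<String> {
    if path.exists() {
        return fs::read_to_string(path);
    }
    // Draw 32 random bytes from the OS entropy source.
    let mut buf = [0u8; 32];
    fs::File::open("/dev/urandom")?.read_exact(&mut buf)?;
    let key: String = buf.iter().map(|b| format!("{b:02x}")).collect();
    // Create the file with 0600 so only the current user can read it.
    let mut f = OpenOptions::new()
        .write(true)
        .create_new(true)
        .mode(0o600)
        .open(path)?;
    f.write_all(key.as_bytes())?;
    Ok(key)
}
```

Subsequent launches read the same file back, so the key is per-install and never appears in the source or the environment.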

---

### H4. API Keys Transmitted to Frontend via IPC and Stored in Memory

**File**: `src/stores/settingsStore.ts`, lines 56-63
**Also**: `src-tauri/src/state.rs`, line 12 (`api_key` field in `ProviderConfig`)

**Issue**: The `ProviderConfig` struct includes an `api_key: String` field that is serialized over Tauri's IPC bridge from Rust to TypeScript and back. The settings store correctly strips API keys before persisting to `localStorage` (line 60: `api_key: ""`), which is good. However, the full API key lives in the Zustand store in browser memory for the duration of the session. If the webview's JavaScript context is compromised (e.g., via a future XSS or a malicious Tauri plugin), the API key is accessible.

**Recommended Fix**: Store API keys exclusively in the Rust backend (encrypted in the database). The frontend should only send a provider identifier; the backend should look up the key internally before making API calls. This eliminates API keys from the IPC surface entirely.
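The shape of that change can be sketched with a hypothetical `KeyStore` standing in for the encrypted credential table; the command handler accepts only a provider id and resolves the secret on the Rust side:

```rust
use std::collections::HashMap;

/// Stand-in for the encrypted credential table: provider id -> API key.
/// Illustrative sketch, not the project's actual storage layer.
struct KeyStore {
    keys: HashMap<String, String>,
}

impl KeyStore {
    /// Resolve a provider id to its API key without ever serializing
    /// the key back to the webview.
    fn key_for(&self, provider_id: &str) -> Result<&str, String> {
        self.keys
            .get(provider_id)
            .map(String::as_str)
            .ok_or_else(|| format!("no credential stored for '{provider_id}'"))
    }
}
```

A Tauri command would call `store.key_for(&provider_id)` internally before making the provider API call, and its serialized return value would never contain the key.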

---

### H5. Filesystem Capabilities Are Overly Broad

**File**: `src-tauri/capabilities/default.json`, lines 16-24

```json
"fs:allow-read",
"fs:allow-write",
"fs:allow-mkdir",
```

**Issue**: The capabilities include `fs:allow-read` and `fs:allow-write` without scope constraints (in addition to the properly scoped `fs:scope-app-recursive` and `fs:scope-temp-recursive`). The unscoped `fs:allow-read`/`fs:allow-write` permissions may override the scope restrictions, potentially allowing the frontend JavaScript to read or write arbitrary files on the filesystem, depending on Tauri 2.x ACL resolution order.

**Recommended Fix**: Remove the unscoped `fs:allow-read`, `fs:allow-write`, and `fs:allow-mkdir` permissions. Keep only the scoped variants (`fs:allow-app-read-recursive`, `fs:allow-app-write-recursive`, `fs:allow-temp-read-recursive`, `fs:allow-temp-write-recursive`) plus the `fs:scope-*` directives. File dialog operations (`dialog:allow-open`, `dialog:allow-save`) already handle user-initiated file access.

---

## MEDIUM Findings

### M1. Export Document Accepts Arbitrary Output Directory Without Validation

**File**: `src-tauri/src/commands/docs.rs`, lines 154-162

```rust
let base_dir = if output_dir.is_empty() || output_dir == "." {
    dirs::download_dir().unwrap_or_else(|| { ... })
} else {
    PathBuf::from(&output_dir)
};
```

**Issue**: The `export_document` command accepts an `output_dir` string from the frontend and writes files to it without canonicalization or path validation. While the frontend likely provides a dialog-selected path, a compromised frontend could write files to arbitrary directories (e.g., `../../etc/cron.d/` on Linux). There is no check that `output_dir` is within an expected scope.

**Recommended Fix**: Canonicalize the path and validate it against an allowlist of directories (Downloads, app data, or user-selected via dialog). Reject paths containing `..` or pointing to system directories.
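A std-only sketch of such a check; the allowlist contents are an assumption here, while in the real command the roots would come from `dirs::download_dir()`, the app data directory, and dialog-selected paths:

```rust
use std::path::{Component, Path, PathBuf};

/// Reject an export directory unless it is absolute, free of `..`
/// components, and inside one of the allowed roots. Hypothetical
/// helper illustrating the recommended validation.
fn validate_output_dir(dir: &Path, allowed_roots: &[PathBuf]) -> Result<(), String> {
    if dir.components().any(|c| matches!(c, Component::ParentDir)) {
        return Err("output_dir must not contain '..'".to_string());
    }
    if !dir.is_absolute() {
        return Err("output_dir must be absolute".to_string());
    }
    if !allowed_roots.iter().any(|root| dir.starts_with(root)) {
        return Err("output_dir outside allowed directories".to_string());
    }
    Ok(())
}
```

For paths that already exist, running `std::fs::canonicalize` first and validating the result also defeats symlink tricks that a pure component check cannot see.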

---

### M2. OAuth Callback Server Listens on Fixed Port Without Full CSRF Protection

**File**: `src-tauri/src/integrations/callback_server.rs`, lines 14-33

**Issue**: The OAuth callback server binds to `127.0.0.1:8765`. While binding to localhost is correct, the server accepts any HTTP GET to `/callback?code=...&state=...` without verifying the origin of the request. A malicious local process, or a webpage able to reach `localhost`, could forge a callback request. The `state` parameter provides some CSRF protection, but it is stored in a global `HashMap` without a TTL, so stale state values persist indefinitely.

**Recommended Fix**:
- Add a TTL (e.g., 10 minutes) to OAuth state entries to prevent stale state accumulation.
- Consider using a random high port instead of the fixed 8765 to reduce predictability.
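The TTL part of the fix can be sketched as a small wrapper around the existing `HashMap`, where consuming a state value both enforces single use and evicts expired entries (names are illustrative):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// OAuth state store with a TTL, so stale entries cannot be
/// replayed indefinitely. Sketch of the recommended fix.
struct StateStore {
    entries: HashMap<String, Instant>,
    ttl: Duration,
}

impl StateStore {
    fn new(ttl: Duration) -> Self {
        Self { entries: HashMap::new(), ttl }
    }

    fn insert(&mut self, state: String) {
        self.entries.insert(state, Instant::now());
    }

    /// Consume a state value: valid only if present and younger than
    /// the TTL. Each value can be used at most once.
    fn consume(&mut self, state: &str) -> bool {
        // Opportunistically evict anything past its TTL.
        let ttl = self.ttl;
        self.entries.retain(|_, t| t.elapsed() < ttl);
        self.entries.remove(state).is_some()
    }
}
```

Single use plus expiry means a forged callback must both guess a live state value and arrive within the window, instead of matching any value ever issued.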

---

### M3. Audit Log Hash Chain Is Appendable but Not Verifiable

**File**: `src-tauri/src/audit/log.rs`, lines 4-16

**Issue**: The audit log implements a hash chain (each entry includes the hash of the previous entry), which is good for tamper detection. However, there is no command or function to verify the integrity of the chain, and an attacker with database access could modify entries and recompute all subsequent hashes. Without an external anchor (e.g., a periodic hash checkpoint written to an external store), the chain only proves ordering, not immutability.

**Recommended Fix**: Add a `verify_audit_chain()` function and consider periodically exporting chain checkpoints to a file outside the database. Document the threat model in `SECURITY.md`.
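A sketch of what `verify_audit_chain()` could look like, with the hash function injected so the example stays self-contained. The real implementation would iterate the actual table rows and reuse the log's SHA-256 over the same fields it hashes on insert; the `(payload, prev_hash, entry_hash)` tuple layout here is an assumption:

```rust
/// Recompute each entry's hash from its payload plus the previous
/// entry's hash and compare against the stored value. Returns the
/// index of the first inconsistent entry, if any.
fn verify_audit_chain(
    entries: &[(String, String, String)],
    hash: impl Fn(&str) -> String,
) -> Result<(), usize> {
    let mut prev = String::new();
    for (i, (payload, prev_hash, entry_hash)) in entries.iter().enumerate() {
        if *prev_hash != prev {
            return Err(i); // broken link to the previous entry
        }
        if *entry_hash != hash(&format!("{prev_hash}{payload}")) {
            return Err(i); // entry contents were altered
        }
        prev = entry_hash.clone();
    }
    Ok(())
}
```

Returning the index of the first bad entry lets the UI point at exactly where tampering (or corruption) begins.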

---

### M4. Non-Unix Key File Permissions Not Enforced

**File**: `src-tauri/src/db/connection.rs`, lines 25-28

```rust
#[cfg(not(unix))]
fn write_key_file(path: &Path, key: &str) -> anyhow::Result<()> {
    std::fs::write(path, key)?;
    Ok(())
}
```

**Issue**: On non-Unix platforms (i.e., Windows), the database key file is written with default permissions, potentially making it readable by other local users. The Unix path correctly uses mode `0o600`.

**Recommended Fix**: On Windows, use platform-specific ACL APIs to restrict the key file to the current user, or at minimum document this limitation.

---

### M5. `unsafe-inline` in Style CSP Directive

**File**: `src-tauri/tauri.conf.json`, line 13

```
style-src 'self' 'unsafe-inline'
```

**Issue**: The CSP allows `unsafe-inline` for styles. While this is common in React/Tailwind applications and the attack surface is lower than `unsafe-inline` for scripts, it still permits style-based data exfiltration attacks (e.g., CSS injection to leak attribute values).

**Recommended Fix**: If feasible, use nonce-based or hash-based style CSP. If not feasible due to Tailwind's runtime style injection, document this as an accepted risk.

---

## LOW Findings

### L1. `http:default` Capability Grants Broad Network Access

**File**: `src-tauri/capabilities/default.json`, line 28

**Issue**: The `http:default` permission allows the frontend to make arbitrary HTTP requests. Combined with the broad CSP `connect-src`, this gives the webview significant network access. For a desktop app this is often necessary, but it should be documented and reviewed.

**Recommended Fix**: Consider restricting `http` permissions to specific URL patterns matching only the known AI provider APIs and integration endpoints.

---

### L2. IntelliJ IDEA Config Files Tracked in Git

**Files**:
- `.idea/.gitignore`
- `.idea/copilot.data.migration.ask2agent.xml`
- `.idea/misc.xml`
- `.idea/modules.xml`
- `.idea/tftsr-devops_investigation.iml`
- `.idea/vcs.xml`

**Issue**: IDE configuration files are tracked. These may leak editor preferences and do not belong in an open-source repository.

**Recommended Fix**: Add `.idea/` to `.gitignore` and remove the files from tracking with `git rm -r --cached .idea/`.

---

### L3. Placeholder OAuth Client IDs in Source

**File**: `src-tauri/src/commands/integrations.rs`, lines 181, 187

```rust
"confluence-client-id-placeholder"
"ado-client-id-placeholder"
```

**Issue**: These placeholder strings are used as fallbacks when environment variables are not set. While they are obviously not real credentials, they could confuse users or be mistaken for actual client IDs in bug reports.

**Recommended Fix**: Make the OAuth flow fail explicitly with a clear error message when the client ID environment variable is not set, rather than falling back to a placeholder.
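A small sketch of the explicit-failure pattern; the function takes the already-read environment value so it stays testable, and the variable name `CONFLUENCE_CLIENT_ID` is an assumption about the project's configuration:

```rust
/// Turn a missing OAuth client id into a clear, user-facing error
/// instead of silently falling back to a placeholder string.
fn client_id_from(env_value: Option<String>) -> Result<String, String> {
    match env_value {
        Some(id) if !id.trim().is_empty() => Ok(id),
        _ => Err(
            "CONFLUENCE_CLIENT_ID is not set; configure an OAuth client id \
             before connecting this integration"
                .to_string(),
        ),
    }
}
```

Call sites would pass `std::env::var("CONFLUENCE_CLIENT_ID").ok()` and surface the `Err` directly in the UI, so a misconfiguration fails loudly instead of producing a confusing placeholder in logs.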

---

### L4. Username `sarman` Embedded in CI Workflows and Makefile

**Files**: `.gitea/workflows/*.yml`, `Makefile` line 2

**Issue**: The developer's username appears throughout the CI configuration. While not a security vulnerability per se, it is a privacy concern for an open-source release.

**Recommended Fix**: Parameterize the username in CI workflows and update the Makefile to use a generic repository reference.

---

### L5. `shell:allow-open` Capability Enabled

**File**: `src-tauri/capabilities/default.json`, line 27

**Issue**: The `shell:allow-open` permission allows the frontend to open URLs in the system browser. This is used for OAuth flows and external links. While convenient, a compromised frontend could open arbitrary URLs.

**Recommended Fix**: This is acceptable for the app's functionality but should be documented. Consider restricting it to specific URL patterns if Tauri 2.x supports it.

---

## Positive Security Observations

The following practices are already well-implemented:

1. **Parameterized SQL queries**: All database operations use `rusqlite::params![]` with positional parameters; there is no string interpolation in SQL. The dynamic query builder in `list_issues` and `get_audit_log` correctly uses indexed parameter placeholders.

2. **SQLCipher encryption at rest**: Release builds encrypt the database with AES-256-CBC via SQLCipher, using PBKDF2-HMAC-SHA512 (256k iterations).

3. **PII detection and mandatory redaction**: Log files must pass PII detection and redaction before being sent to AI providers (`redacted_path_for()` enforces this check).

4. **PKCE for OAuth**: The OAuth implementation uses PKCE (S256) with cryptographically random verifiers.

5. **Hash-chained audit log**: Every security-relevant action is logged with a SHA-256 hash chain.

6. **Path traversal prevention**: `upload_log_file` uses `std::fs::canonicalize()` and validates that the result is a regular file within size limits.

7. **No `dangerouslySetInnerHTML` or `eval()`**: The frontend renders AI responses as plain text via `{msg.content}` in JSX, preventing XSS from AI model output.

8. **API key scrubbing from localStorage**: The settings store explicitly strips `api_key` before persisting (line 60 of `settingsStore.ts`).

9. **No shell command injection**: All `std::process::Command` calls use hardcoded binary names with literal arguments. No user input is passed to shell commands.

10. **No secrets in git history**: `.gitignore` properly excludes `.env`, `.secrets`, `secrets.yml`, and related files. No private keys or certificates are tracked.

11. **Mutex guards not held across await points**: The codebase correctly drops each `MutexGuard` before `.await` by scoping locks inside `{ }` blocks.

---

## Recommendations Summary (Priority Order)

| Priority | Action | Effort |
|----------|--------|--------|
| **P0** | Remove `GenAI API User Guide.md` and `HANDOFF-MSI-GENAI.md` from repo and git history | Small |
| **P0** | Remove `commandcentral.com` URLs from CSP and hardcoded MSI headers from `openai.rs` | Small |
| **P0** | Replace or parameterize private IP (`172.0.0.29`) and username in all `.gitea/` workflows | Medium |
| **P1** | Replace hardcoded dev encryption keys with auto-generated per-install keys | Small |
| **P1** | Use proper KDF (PBKDF2/HKDF) for AES key derivation in `auth.rs` | Small |
| **P1** | Auto-generate encryption key for credential storage (mirror `connection.rs` pattern) | Small |
| **P1** | Remove unscoped `fs:allow-read`/`fs:allow-write` from capabilities | Small |
| **P2** | Move API key storage to backend-only (remove from IPC surface) | Medium |
| **P2** | Add path validation to `export_document` output directory | Small |
| **P2** | Add TTL to OAuth state entries | Small |
| **P2** | Add audit chain verification function | Small |
| **P3** | Remove `.idea/` from git tracking | Trivial |
| **P3** | Replace placeholder OAuth client IDs with explicit errors | Trivial |
| **P3** | Parameterize username in CI/Makefile | Small |

---

*Report generated by security audit of git-tracked source files at commit HEAD on the `feature/ai-tool-calling-integration-search` branch.*

---

# Ticket Summary - Persistent Browser Windows for Integration Authentication

## Description

Implement persistent browser window sessions for integration authentication (Confluence, Azure DevOps, ServiceNow). Browser windows now persist across application restarts, eliminating the need to extract HttpOnly cookies via JavaScript (which fails due to browser security restrictions).

This follows a Playwright-style "piggyback" authentication approach: the browser window maintains its own internal cookie store, allowing the user to log in once and have the session persist indefinitely until they manually close the window.

## Acceptance Criteria

- [x] Integration browser windows persist to database when created
- [x] Browser windows are automatically restored on app startup
- [x] Cookies are maintained automatically by the browser's internal store (no JavaScript extraction of HttpOnly cookies)
- [x] Windows can be manually closed by the user, which removes them from persistence
- [x] Database migration creates `persistent_webviews` table
- [x] Window close events are handled to update database and in-memory tracking

## Work Implemented

### 1. Database Migration for Persistent Webviews

**Files Modified:**
- `src-tauri/src/db/migrations.rs:154-167`

**Changes:**
- Added migration `013_create_persistent_webviews` to create the `persistent_webviews` table
- Table schema includes:
  - `id` (TEXT PRIMARY KEY)
  - `service` (TEXT with CHECK constraint for 'confluence', 'servicenow', 'azuredevops')
  - `webview_label` (TEXT, the Tauri window identifier)
  - `base_url` (TEXT, the integration base URL)
  - `last_active` (TEXT timestamp, defaults to now)
  - `window_x`, `window_y`, `window_width`, `window_height` (INTEGER, for future window position persistence)
  - UNIQUE constraint on `service` (one browser window per integration)

### 2. Webview Persistence on Creation

**Files Modified:**
- `src-tauri/src/commands/integrations.rs:531-591`

**Changes:**
- Modified the `authenticate_with_webview` command to persist webview state to the database after creation
- Stores service name, webview label, and base URL
- Logs the persistence operation for debugging
- Sets up a window close event handler to remove the webview from tracking and the database
- Event handler properly clones Arc fields to satisfy the `'static` lifetime requirement
- Updated the success message to inform the user that the window persists across restarts

### 3. Webview Restoration on App Startup

**Files Modified:**
- `src-tauri/src/commands/integrations.rs:793-865` (added `restore_persistent_webviews`)
- `src-tauri/src/lib.rs:60-84` (added a `.setup()` hook to call restoration)

**Changes:**
- Added a `restore_persistent_webviews` async function that:
  - Queries the `persistent_webviews` table for all saved webviews
  - Recreates each webview window by calling `authenticate_with_webview`
  - Updates the in-memory tracking map
  - Removes the row from the database if restoration fails
  - Logs all operations for debugging
- Updated `lib.rs` to call restoration in the `.setup()` hook:
  - Clones Arc fields from `AppState` for the `'static` lifetime
  - Spawns an async task to restore webviews
  - Logs warnings if restoration fails

### 4. Window Close Event Handling

**Files Modified:**
- `src-tauri/src/commands/integrations.rs:559-591`

**Changes:**
- Added an `on_window_event` listener to detect window close events
- On a `CloseRequested` event:
  - Spawns an async task to clean up
  - Removes the service from the in-memory `integration_webviews` map
  - Deletes the entry from the `persistent_webviews` database table
  - Logs all cleanup operations
- Properly handles Arc cloning to avoid lifetime issues in the spawned task

### 5. Removed Auto-Close Behavior

**Files Modified:**
- `src-tauri/src/commands/integrations.rs:606-618`

**Changes:**
- Removed automatic window closing in `extract_cookies_from_webview`
- Windows now stay open after cookie extraction
- Updated the success message to inform the user that the window persists for future use

### 6. Frontend UI Update: Removed "Complete Login" Button

**Files Modified:**
- `src/pages/Settings/Integrations.tsx:371-409` (updated webview authentication UI)
- `src/pages/Settings/Integrations.tsx:140-165` (simplified `handleConnectWebview`)
- `src/pages/Settings/Integrations.tsx:167-200` (removed the `handleCompleteWebviewLogin` function)
- `src/pages/Settings/Integrations.tsx:16-26` (removed the unused `extractCookiesFromWebviewCmd` import)
- `src/pages/Settings/Integrations.tsx:670-677` (updated authentication method comparison text)

**Changes:**
- Removed the "Complete Login" button that tried to extract cookies via JavaScript
- Updated the UI to show a success message when the browser opens, explaining persistence
- Removed the confusing two-step flow (open browser → complete login)
- New flow: click "Open Browser" → log in → leave the window open (that's it!)
- Updated the description text to explain the persistent window behavior
- Marks the integration as "connected" immediately when the browser opens
- Removed the unused function and import for cookie extraction

### 7. Unused Import Cleanup

**Files Modified:**
- `src-tauri/src/integrations/webview_auth.rs:2`
- `src-tauri/src/lib.rs:13` (added `use tauri::Manager;`)

**Changes:**
- Removed the unused `Listener` import from `webview_auth.rs`
- Added the `Manager` trait import to `lib.rs` for the `.state()` method

## Testing Needed

### Manual Testing

1. **Initial Browser Window Creation**
   - [ ] Navigate to Settings > Integrations
   - [ ] Configure a Confluence integration with a base URL
   - [ ] Click the "Open Browser" button
   - [ ] Verify a browser window opens with the Confluence login page
   - [ ] Complete login in the browser window
   - [ ] Verify the window stays open after login

2. **Window Persistence Across Restarts**
   - [ ] With the Confluence browser window open, close the main application
   - [ ] Relaunch the application
   - [ ] Verify the Confluence browser window is automatically restored
   - [ ] Verify you are still logged in (cookies maintained)
   - [ ] Navigate to different pages in Confluence to verify the session works

3. **Manual Window Close**
   - [ ] With the browser window open, manually close it (X button)
   - [ ] Restart the application
   - [ ] Verify the browser window does NOT reopen (removed from persistence)

4. **Database Verification**
   - [ ] Open the database: `sqlite3 ~/Library/Application\ Support/trcaa/data.db`
   - [ ] Run: `SELECT * FROM persistent_webviews;`
   - [ ] Verify an entry exists while the window is open
   - [ ] Close the window and verify the entry is removed

5. **Multiple Integration Windows**
   - [ ] Open a browser window for Confluence
   - [ ] Open a browser window for Azure DevOps
   - [ ] Restart the application
   - [ ] Verify both windows are restored
   - [ ] Close one window
   - [ ] Verify only that one is removed from the database
   - [ ] Restart and verify the remaining window still restores

6. **Cookie Persistence (No HttpOnly Extraction Needed)**
   - [ ] Log into the Confluence browser window
   - [ ] Close the main application
   - [ ] Relaunch the application
   - [ ] Navigate to a Confluence page that requires authentication
   - [ ] Verify you are still logged in (cookies maintained by the browser)

### Automated Testing

```bash
# Type checking
npx tsc --noEmit

# Rust compilation
cargo check --manifest-path src-tauri/Cargo.toml

# Rust tests
cargo test --manifest-path src-tauri/Cargo.toml

# Rust linting
cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings
```

### Edge Cases to Test

- Application crash while a browser window is open (verify restoration on next launch)
- Database corruption (verify graceful handling of restore failures)
- Window already exists when trying to create a duplicate (verify the existing window is focused)
- Network connectivity lost during window restoration (verify error handling)
- Multiple rapid window open/close cycles (verify database consistency)

## Architecture Notes

### Design Decision: Persistent Windows vs. Cookie Extraction

**Problem:** HttpOnly cookies cannot be accessed via JavaScript (`document.cookie`), which broke the original cookie extraction approach for Confluence and other services.

**Solution:** Instead of extracting cookies, keep the browser window alive across app restarts:
- The browser maintains its own internal cookie store (including HttpOnly cookies)
- Cookies are automatically sent with all HTTP requests from the browser
- No need for JavaScript extraction or manual token management
- Matches Playwright's approach of persistent browser contexts

### Lifecycle Flow

1. **Window Creation:** User clicks "Open Browser" → `authenticate_with_webview` creates the window → State saved to database
2. **App Running:** Window stays open, user can browse freely, cookies maintained by the browser
3. **Window Close:** User closes the window → Event handler removes it from the database and memory
4. **App Restart:** `restore_persistent_webviews` queries database → Recreates all windows → Windows resume with original cookies
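Step 4 can be sketched as a plain loop, with closures standing in for the rusqlite query results and the Tauri window builder (all names here are illustrative, not the actual code):

```rust
/// Stand-in for a row of the `persistent_webviews` table.
struct SavedWebview {
    service: String,
    webview_label: String,
    base_url: String,
}

/// Recreate each saved window on startup; rows whose restoration
/// fails are pruned so they are not retried forever on every launch.
fn restore_all(
    saved: Vec<SavedWebview>,
    mut recreate: impl FnMut(&SavedWebview) -> Result<(), String>,
    mut remove_row: impl FnMut(&str),
) -> usize {
    let mut restored = 0;
    for row in &saved {
        match recreate(row) {
            Ok(()) => restored += 1,
            // Failed restorations are removed from persistence.
            Err(_) => remove_row(&row.service),
        }
    }
    restored
}
```

In the real code, `recreate` corresponds to the `authenticate_with_webview` call and `remove_row` to the `DELETE` against `persistent_webviews`.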

### Database Schema

```sql
CREATE TABLE persistent_webviews (
    id TEXT PRIMARY KEY,
    service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
    webview_label TEXT NOT NULL,
    base_url TEXT NOT NULL,
    last_active TEXT NOT NULL DEFAULT (datetime('now')),
    window_x INTEGER,
    window_y INTEGER,
    window_width INTEGER,
    window_height INTEGER,
    UNIQUE(service)
);
```

### Future Enhancements

- [ ] Save and restore window position/size (columns already exist in the schema)
- [ ] Update the `last_active` timestamp on window focus events
- [ ] Implement a "Close All Windows" command for cleanup
- [ ] Add a visual indicator in the main UI showing which integrations have active browser windows
- [ ] Implement session timeout logic (close windows after X days of inactivity)

## Related Files

- `src-tauri/src/db/migrations.rs` - Database schema migration
- `src-tauri/src/commands/integrations.rs` - Webview persistence and restoration logic
- `src-tauri/src/integrations/webview_auth.rs` - Browser window creation
- `src-tauri/src/lib.rs` - App startup hook for restoration
- `src-tauri/src/state.rs` - AppState structure with the `integration_webviews` map

## Security Considerations

- Cookie storage remains in the browser's internal secure store (not extracted to the database)
- The database stores only window metadata (service, label, URL)
- No credential information is persisted beyond what the browser already maintains
- The audit log still tracks all integration API calls separately

## Migration Path

Users upgrading to this version will:
1. See the new database migration `013_create_persistent_webviews` applied automatically
2. Find that existing integrations continue to work (the migration is additive only)
3. Have their first opened browser window persisted for future sessions
4. Need no manual action

---

# Ticket Summary - AI Disclaimer Modal
|
||||
|
||||
## Description
|
||||
|
||||
Added a mandatory AI disclaimer warning that users must accept before creating new issues. This ensures users understand the risks and limitations of AI-assisted triage and accept responsibility for any actions taken based on AI recommendations.
|
||||
|
||||
## Acceptance Criteria
|
||||
|
||||
- [x] Disclaimer appears automatically on first visit to New Issue page
|
||||
- [x] Modal blocks interaction with page until user accepts or cancels
|
||||
- [x] Acceptance is persisted across sessions
|
||||
- [x] Clear, professional warning about AI limitations
|
||||
- [x] Covers key risks: mistakes, hallucinations, incorrect commands
|
||||
- [x] Emphasizes user responsibility and accountability
|
||||
- [x] Includes best practices for safe AI usage
|
||||
- [x] Cancel button returns user to dashboard
|
||||
- [x] Modal re-appears if user tries to create issue without accepting
|
||||
|
||||
## Work Implemented

### Frontend Changes

**File:** `src/pages/NewIssue/index.tsx`

1. **Modal Component:**
   - Full-screen overlay with backdrop
   - Centered modal dialog (max-width 2xl)
   - Scrollable content area for long disclaimer text
   - Professional styling with proper contrast

2. **Disclaimer Content:**
   - **Header:** "AI-Assisted Triage Disclaimer"
   - **Warning Section** (red background):
     - AI can provide incorrect, incomplete, or outdated information
     - AI can hallucinate false information
     - Recommendations may not apply to specific environments
     - Commands may have unintended consequences (data loss, downtime, security issues)
   - **Responsibility Section** (yellow background):
     - User is solely responsible for all actions taken
     - Must verify AI suggestions against documentation
     - Must test in non-production first
     - Must understand commands before executing
     - Must have backups and rollback plans
   - **Best Practices:**
     - Treat AI as starting point, not definitive answer
     - Consult senior engineers for critical systems
     - Review AI content for accuracy
     - Maintain change control processes
     - Document decisions
   - **Legal acknowledgment**

3. **State Management:**
   - `showDisclaimer` state controls modal visibility
   - `useEffect` hook checks localStorage on page load
   - Acceptance stored as `tftsr-ai-disclaimer-accepted` in localStorage
   - Persists across sessions and app restarts

4. **User Flow:**
   - User visits New Issue → Modal appears
   - User clicks "I Understand and Accept" → Modal closes, localStorage updated
   - User clicks "Cancel" → Navigates back to dashboard
   - User tries to create issue without accepting → Modal re-appears
   - After acceptance, modal never shows again (unless localStorage cleared)

### Technical Details

**Storage:** `localStorage.getItem("tftsr-ai-disclaimer-accepted")`
- Key: `tftsr-ai-disclaimer-accepted`
- Value: `"true"` when accepted
- Scope: Per-browser, persists across sessions

**Validation Points:**
1. Page load - Shows modal if not accepted
2. "Start Triage" button click - Re-checks acceptance before proceeding
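The storage key and the two validation points can be sketched as a small helper. This is a sketch only: the `KVStore` indirection stands in for `window.localStorage` so the logic runs outside a browser, and the function names are hypothetical (the real wiring lives in the React component).

```typescript
// Disclaimer acceptance helpers. DISCLAIMER_KEY matches the documented
// localStorage key; KVStore abstracts window.localStorage for testability.
type KVStore = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

const DISCLAIMER_KEY = "tftsr-ai-disclaimer-accepted";

function hasAcceptedDisclaimer(store: KVStore): boolean {
  // Validation point 1: checked on page load to decide whether to show the modal
  return store.getItem(DISCLAIMER_KEY) === "true";
}

function acceptDisclaimer(store: KVStore): void {
  // Called from the "I Understand and Accept" button handler
  store.setItem(DISCLAIMER_KEY, "true");
}

function canStartTriage(store: KVStore): boolean {
  // Validation point 2: re-checked when "Start Triage" is clicked
  return hasAcceptedDisclaimer(store);
}
```

In the app, `store` would simply be `window.localStorage`, which satisfies the same interface.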
**Styling:**
- Dark overlay: `bg-black/50`
- Modal: `bg-background` with border and shadow
- Red warning box: `bg-destructive/10 border-destructive/20`
- Yellow responsibility box: `bg-yellow-500/10 border-yellow-500/20`
- Scrollable content: `max-h-[60vh] overflow-y-auto`

## Testing Needed

### Manual Testing

1. **First Visit Flow:**
   - [ ] Navigate to New Issue page
   - [ ] Verify modal appears automatically
   - [ ] Verify page content is blocked/dimmed
   - [ ] Verify modal is scrollable
   - [ ] Verify all sections are visible and readable

2. **Acceptance Flow:**
   - [ ] Click "I Understand and Accept"
   - [ ] Verify modal closes
   - [ ] Verify can now create issues
   - [ ] Refresh page
   - [ ] Verify modal does NOT re-appear

3. **Cancel Flow:**
   - [ ] Clear localStorage: `localStorage.removeItem("tftsr-ai-disclaimer-accepted")`
   - [ ] Go to New Issue page
   - [ ] Click "Cancel" button
   - [ ] Verify redirected to dashboard
   - [ ] Go back to New Issue page
   - [ ] Verify modal appears again

4. **Rejection Flow:**
   - [ ] Clear localStorage
   - [ ] Go to New Issue page
   - [ ] Close modal without accepting (if possible)
   - [ ] Fill in issue details
   - [ ] Click "Start Triage"
   - [ ] Verify modal re-appears before issue creation

5. **Visual Testing:**
   - [ ] Test in light theme - verify text contrast
   - [ ] Test in dark theme - verify text contrast
   - [ ] Test on mobile viewport - verify modal fits
   - [ ] Test with very long issue title - verify modal remains on top
   - [ ] Verify warning colors are distinct (red vs yellow boxes)

6. **Accessibility:**
   - [ ] Verify modal can be navigated with keyboard
   - [ ] Verify "Accept" button can be focused and activated with Enter
   - [ ] Verify "Cancel" button can be focused
   - [ ] Verify modal traps focus (Tab doesn't leave modal)
   - [ ] Verify text is readable at different zoom levels

### Browser Testing

Test localStorage persistence across:
- [ ] Chrome/Edge
- [ ] Firefox
- [ ] Safari
- [ ] Browser restart
- [ ] Tab close and reopen

### Edge Cases

- [ ] Multiple browser tabs - verify acceptance in one tab reflects in others on reload
- [ ] Incognito/private browsing - verify modal appears every session
- [ ] localStorage quota exceeded - verify graceful degradation
- [ ] Disabled JavaScript - app won't work, but no crashes
- [ ] Fast double-click on Accept - verify no duplicate localStorage writes

## Security Considerations

**Disclaimer Bypass Risk:**
Users could theoretically bypass the disclaimer by:
1. Manually setting localStorage: `localStorage.setItem("tftsr-ai-disclaimer-accepted", "true")`
2. Using browser dev tools

**Mitigation:** This is acceptable because:
- The disclaimer is for liability protection, not security
- Users who bypass it are technical enough to understand the risks
- The disclaimer is shown prominently and is hard to miss accidentally
- Acceptance is logged client-side (could be enhanced to log server-side for audit)

**Future Enhancement:**
- Log acceptance event to backend with timestamp
- Store acceptance in database tied to user session
- Require periodic re-acceptance (e.g., every 90 days)
- Add version tracking to re-show on disclaimer updates

## Legal Notes

This disclaimer should be reviewed by legal counsel to ensure:
- Adequate liability protection
- Compliance with jurisdiction-specific requirements
- Appropriate language for organizational use
- Clear "Use at your own risk" messaging

**Recommended additions (by legal):**
- Add version number/date to disclaimer
- Log acceptance with timestamp for audit trail
- Consider adding "This is an experimental tool" if applicable
- Add specific disclaimer for any regulated environments (healthcare, finance, etc.)
41 cliff.toml Normal file
@ -0,0 +1,41 @@
[changelog]
header = """
# Changelog

All notable changes to TFTSR are documented here.
Commit types shown: feat, fix, perf, docs, refactor.
CI, chore, and build changes are excluded.

"""
body = """
{% if version -%}
## [{{ version | trim_start_matches(pat="v") }}] — {{ timestamp | date(format="%Y-%m-%d") }}

{% else -%}
## [Unreleased]

{% endif -%}
{% for group, commits in commits | group_by(attribute="group") -%}
### {{ group | upper_first }}
{% for commit in commits -%}
- {% if commit.scope %}**{{ commit.scope }}**: {% endif %}{{ commit.message | upper_first }}
{% endfor %}
{% endfor %}
"""
footer = ""
trim = true

[git]
conventional_commits = true
filter_unconventional = true
tag_pattern = "v[0-9].*"
ignore_tags = "rc|alpha|beta"
sort_commits = "oldest"
commit_parsers = [
  { message = "^feat", group = "Features" },
  { message = "^fix", group = "Bug Fixes" },
  { message = "^perf", group = "Performance" },
  { message = "^docs", group = "Documentation" },
  { message = "^refactor", group = "Refactoring" },
  { message = "^ci|^chore|^build|^test|^style", skip = true },
]
@ -29,8 +29,7 @@ TFTSR uses a Tauri 2.x architecture: a Rust backend runs natively, and a React/T
pub struct AppState {
    pub db: Arc<Mutex<rusqlite::Connection>>,
    pub settings: Arc<Mutex<AppSettings>>,
    pub app_data_dir: PathBuf, // ~/.local/share/trcaa on Linux
    pub integration_webviews: Arc<Mutex<HashMap<String, String>>>,
    pub app_data_dir: PathBuf, // ~/.local/share/tftsr on Linux
}
```

@ -47,10 +46,11 @@ All command handlers receive `State<'_, AppState>` as a Tauri-injected parameter
| `commands/analysis.rs` | Log file upload, PII detection, redaction |
| `commands/docs.rs` | RCA and post-mortem generation, document export |
| `commands/system.rs` | Ollama management, hardware probe, settings, audit log |
| `commands/integrations.rs` | Confluence / ServiceNow / ADO — OAuth2, WebView auth, tool calling |
| `commands/image.rs` | Image attachment upload, list, delete, paste |
| `commands/integrations.rs` | Confluence / ServiceNow / ADO — v0.2 stubs |
| `ai/provider.rs` | `Provider` trait + `create_provider()` factory |
| `pii/detector.rs` | Multi-pattern PII scanner with overlap resolution |
| `db/migrations.rs` | Versioned schema (14 migrations tracked in `_migrations` table) |
| `db/migrations.rs` | Versioned schema (17 migrations in `_migrations` table) |
| `db/models.rs` | All DB types — see `IssueDetail` note below |
| `docs/rca.rs` + `docs/postmortem.rs` | Markdown template builders |
| `audit/log.rs` | `write_audit_event()` — called before every external send |
@ -75,6 +75,7 @@ src-tauri/src/
│   ├── analysis.rs
│   ├── docs.rs
│   ├── system.rs
│   ├── image.rs
│   └── integrations.rs
├── pii/
│   ├── patterns.rs
@ -175,34 +176,75 @@ pub struct IssueDetail {

Use `detail.issue.title`, **not** `detail.title`.

## Incident Response Methodology

The application integrates a comprehensive incident response framework via system prompt injection. The `INCIDENT_RESPONSE_FRAMEWORK` constant in `src/lib/domainPrompts.ts` is appended to all 17 domain-specific system prompts (Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability, and others).

**5-Phase Framework:**

1. **Detection & Evidence Gathering** — Initial issue assessment, log collection, PII redaction
2. **Diagnosis & Hypothesis Testing** — AI-assisted analysis, pattern matching against known incidents
3. **Root Cause Analysis with 5-Whys** — Iterative questioning to identify underlying cause (steps 1–5)
4. **Resolution & Prevention** — Remediation planning and implementation
5. **Post-Incident Review** — Timeline-based blameless post-mortem and lessons learned

**System Prompt Injection:**

The `chat_message` command accepts an optional `system_prompt` parameter. If provided, it prepends domain expertise before the conversation history. If omitted, the framework selects the appropriate domain prompt based on the issue category. This allows:

- **Specialized expertise**: Different frameworks for Linux vs. Kubernetes vs. Network incidents
- **Flexible override**: Users can inject custom system prompts for cross-domain problems
- **Consistent methodology**: All 17 domain prompts follow the same 5-phase incident response structure

**Timeline Event Recording:**

Timeline events are recorded without blocking the triage flow at key moments:

```
Issue Creation → triage_started
      ↓
Log Upload → log_uploaded (metadata: file_name, file_size)
      ↓
Why-Level Progression → why_level_advanced (metadata: from_level → to_level)
      ↓
Root Cause Identified → root_cause_identified (metadata: root_cause, confidence)
      ↓
RCA Generated → rca_generated (metadata: doc_id, section_count)
      ↓
Postmortem Generated → postmortem_generated (metadata: doc_id, timeline_events_count)
      ↓
Document Exported → document_exported (metadata: format, file_path)
```

**Document Generation:**

RCA and Postmortem generators now use real timeline event data instead of placeholders:

- **RCA**: Incorporates timeline to show detection-to-root-cause progression
- **Postmortem**: Uses full timeline to demonstrate the complete incident lifecycle and response effectiveness

Timeline events are stored in the `timeline_events` table (indexed by issue_id and created_at for fast retrieval) and dual-written to `audit_log` for security/compliance purposes.

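The dual-write recording described above can be sketched as follows. This is an illustration only, not the actual Rust implementation: in-memory arrays stand in for the `timeline_events` and `audit_log` tables, and the function name and event shape are assumptions.

```typescript
// Dual-write timeline recording: each event goes to both the timeline store
// (chronological reporting) and the audit log (security/compliance).
interface TimelineEventRecord {
  issueId: string;
  eventType: string;
  description: string;
  metadata?: Record<string, unknown>; // event-specific JSON payload
  createdAt: string;                  // UTC timestamp
}

const timelineEvents: TimelineEventRecord[] = [];
const auditLog: TimelineEventRecord[] = [];

function recordTimelineEvent(
  issueId: string,
  eventType: string,
  description: string,
  metadata?: Record<string, unknown>,
): TimelineEventRecord {
  const event: TimelineEventRecord = {
    issueId,
    eventType,
    description,
    metadata,
    createdAt: new Date().toISOString(),
  };
  timelineEvents.push(event); // queryable timeline for document generation
  auditLog.push(event);       // audit trail entry
  return event;
}
```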
## Application Startup Sequence

```
1. Initialize tracing (RUST_LOG controls level)
2. Determine data directory (state::get_app_data_dir() or TFTSR_DATA_DIR)
3. Auto-generate or load .dbkey / .enckey (mode 0600) — see ADR-005
4. Open / create SQLCipher encrypted database
   - If plain SQLite detected (debug→release upgrade): auto-migrate + backup
5. Run DB migrations (14 schema versions)
6. Create AppState (db + settings + app_data_dir + integration_webviews)
7. Register Tauri plugins (stronghold, dialog, fs, shell, http)
8. Register all IPC command handlers via generate_handler![]
9. Start WebView with React app
2. Determine data directory (~/.local/share/tftsr or TFTSR_DATA_DIR)
3. Open / create SQLite database (run migrations)
4. Create AppState (db + settings + app_data_dir)
5. Register Tauri plugins (stronghold, dialog, fs, shell, http, cli, updater)
6. Register all 39 IPC command handlers
7. Start WebView with React app
```

## Architecture Documentation
## Image Attachments

Full architecture documentation with C4 diagrams, data flow diagrams, and Architecture Decision Records (ADRs) is available in [`docs/architecture/`](../architecture/README.md):
The app supports uploading and managing image files (screenshots, diagrams) as attachments:

| Document | Contents |
|----------|----------|
| [Architecture Overview](../architecture/README.md) | C4 diagrams, data flows, security model |
| [ADR-001](../architecture/adrs/ADR-001-tauri-desktop-framework.md) | Why Tauri over Electron |
| [ADR-002](../architecture/adrs/ADR-002-sqlcipher-encrypted-database.md) | SQLCipher encryption choices |
| [ADR-003](../architecture/adrs/ADR-003-provider-trait-pattern.md) | AI provider trait design |
| [ADR-004](../architecture/adrs/ADR-004-pii-regex-aho-corasick.md) | PII detection implementation |
| [ADR-005](../architecture/adrs/ADR-005-auto-generate-encryption-keys.md) | Key auto-generation design |
| [ADR-006](../architecture/adrs/ADR-006-zustand-state-management.md) | Frontend state management |
1. **Upload** via `upload_image_attachmentCmd()` or `upload_paste_imageCmd()` (clipboard paste)
2. **PII detection** runs automatically on upload
3. **User approval** required before image is stored
4. **Database storage** in `image_attachments` table with SHA-256 hash

## Data Flow

@ -27,12 +27,77 @@ macOS runner runs jobs **directly on the host** (no Docker container) — macOS

---

## Test Pipeline (`.woodpecker/test.yml`)
## Pre-baked Builder Images

CI build and test jobs use pre-baked Docker images pushed to the local Gitea registry
at `172.0.0.29:3000`. These images bake in all system dependencies (Tauri libs, Node.js,
Rust toolchain, cross-compilers) so that CI jobs skip package installation entirely.

| Image | Used by jobs | Contents |
|-------|-------------|----------|
| `172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22` | `rust-fmt-check`, `rust-clippy`, `rust-tests`, `build-linux-amd64` | Rust 1.88 + rustfmt + clippy + Tauri amd64 libs + Node.js 22 |
| `172.0.0.29:3000/sarman/trcaa-windows-cross:rust1.88-node22` | `build-windows-amd64` | Rust 1.88 + mingw-w64 + NSIS + Node.js 22 |
| `172.0.0.29:3000/sarman/trcaa-linux-arm64:rust1.88-node22` | `build-linux-arm64` | Rust 1.88 + aarch64 cross-toolchain + arm64 multiarch libs + Node.js 22 |

**Rebuild triggers:** Rust toolchain version bump, webkit2gtk/gtk major version change, Node.js major version change.

**How to rebuild images:**
1. Trigger `build-images.yml` via `workflow_dispatch` in the Gitea Actions UI
2. Confirm all 3 images appear in the Gitea package/container registry at `172.0.0.29:3000`
3. Only then merge workflow changes that depend on the new image contents

**Server prerequisite — insecure registry** (one-time, on 172.0.0.29):
```sh
echo '{"insecure-registries":["172.0.0.29:3000"]}' | sudo tee /etc/docker/daemon.json
sudo systemctl restart docker
```
This must be configured on every machine running an act_runner for the runner's Docker
daemon to pull from the local HTTP registry.

---

## Cargo and npm Caching

All Rust and build jobs use `actions/cache@v3` to cache downloaded package artifacts.
Gitea 1.22 implements the GitHub Actions cache API natively.

**Cargo cache** (Rust jobs):
```yaml
- name: Cache cargo registry
  uses: actions/cache@v3
  with:
    path: |
      ~/.cargo/registry/index
      ~/.cargo/registry/cache
      ~/.cargo/git/db
    key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
    restore-keys: |
      ${{ runner.os }}-cargo-
```

**npm cache** (frontend and build jobs):
```yaml
- name: Cache npm
  uses: actions/cache@v3
  with:
    path: ~/.npm
    key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-npm-
```

Cache keys for cross-compile jobs use a suffix to avoid collisions:
- Windows build: `${{ runner.os }}-cargo-windows-${{ hashFiles('**/Cargo.lock') }}`
- arm64 build: `${{ runner.os }}-cargo-arm64-${{ hashFiles('**/Cargo.lock') }}`

---

## Test Pipeline (`.gitea/workflows/test.yml`)

**Triggers:** Pull requests only.

```
Pipeline steps:
Pipeline jobs (run in parallel):
1. rust-fmt-check → cargo fmt --check
2. rust-clippy → cargo clippy -- -D warnings
3. rust-tests → cargo test (64 tests)
@ -41,28 +106,9 @@ Pipeline steps:
```

**Docker images used:**
- `rust:1.88-slim` — Rust steps (minimum for cookie_store + time + darling)
- `172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22` — Rust steps (replaces `rust:1.88-slim`)
- `node:22-alpine` — Frontend steps

**Pipeline YAML format (Woodpecker 2.x — steps list format):**
```yaml
clone:
  git:
    image: woodpeckerci/plugin-git
    network_mode: gogs_default # requires repo_trusted=1
    environment:
      - CI_REPO_CLONE_URL=http://gitea_app:3000/sarman/tftsr-devops_investigation.git

steps:
  - name: step-name # LIST format (- name:)
    image: rust:1.88-slim
    commands:
      - cargo test
```

> ⚠️ Woodpecker 2.x uses the `steps:` list format. The legacy `pipeline:` map format from
> Woodpecker 0.15.4 is no longer supported.

---

## Release Pipeline (`.gitea/workflows/auto-tag.yml`)
@ -73,14 +119,16 @@ Auto tags are created by `.gitea/workflows/auto-tag.yml` using `git tag` + `git
Release jobs are executed in the same workflow and depend on `autotag` completion.

```
Jobs (run in parallel):
build-linux-amd64   → cargo tauri build (x86_64-unknown-linux-gnu)
Jobs (run in parallel after autotag):
build-linux-amd64   → image: trcaa-linux-amd64:rust1.88-node22
                    → cargo tauri build (x86_64-unknown-linux-gnu)
                    → {.deb, .rpm, .AppImage} uploaded to Gitea release
                    → fails fast if no Linux artifacts are produced
build-windows-amd64 → cargo tauri build (x86_64-pc-windows-gnu) via mingw-w64
build-windows-amd64 → image: trcaa-windows-cross:rust1.88-node22
                    → cargo tauri build (x86_64-pc-windows-gnu) via mingw-w64
                    → {.exe, .msi} uploaded to Gitea release
                    → fails fast if no Windows artifacts are produced
build-linux-arm64   → Ubuntu 22.04 base (ports.ubuntu.com for arm64 packages)
build-linux-arm64   → image: trcaa-linux-arm64:rust1.88-node22 (ubuntu:22.04-based)
                    → cargo tauri build (aarch64-unknown-linux-gnu)
                    → {.deb, .rpm, .AppImage} uploaded to Gitea release
                    → fails fast if no Linux artifacts are produced
@ -209,6 +257,52 @@ UPDATE protect_branch SET protected=true, require_pull_request=true WHERE repo_i

---

## Changelog Generation

Changelogs are generated automatically by **git-cliff** on every release.
Configuration lives in `cliff.toml` at the repo root.

### How it works

A `changelog` job in `auto-tag.yml` runs in parallel with the build jobs, immediately
after `autotag` completes:

1. Clones the full repo history with all tags (`--depth=2147483647` — git-cliff needs
   every tag to compute version boundaries).
2. Downloads the git-cliff v2.7.0 static musl binary (~5 MB, no image change needed).
3. Runs `git-cliff --output CHANGELOG.md` to regenerate the full cumulative changelog.
4. Runs `git-cliff --latest --strip all` to produce release notes for the new tag only.
5. PATCHes the Gitea release body with those notes (replaces the static `"Release vX.Y.Z"`).
6. Commits `CHANGELOG.md` to master with `[skip ci]` appended to the message.
   The `[skip ci]` token prevents `auto-tag.yml` from re-triggering on the CHANGELOG commit.
7. Uploads `CHANGELOG.md` as a release asset (replaces any previous version).

### cliff.toml reference

| Setting | Value |
|---------|-------|
| `tag_pattern` | `v[0-9].*` |
| `ignore_tags` | `rc\|alpha\|beta` |
| `filter_unconventional` | `true` — non-conventional commits are dropped |
| Included types | `feat`, `fix`, `perf`, `docs`, `refactor` |
| Excluded types | `ci`, `chore`, `build`, `test`, `style` |
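The grouping rules in the table can be sketched as a small matcher. This mirrors what the `commit_parsers` configuration does; it is an illustration, not git-cliff's actual code.

```typescript
// Map a conventional-commit message to its changelog group, or drop it
// (skip rule, or unmatched with filter_unconventional = true).
const parsers: Array<{ pattern: RegExp; group?: string; skip?: boolean }> = [
  { pattern: /^feat/, group: "Features" },
  { pattern: /^fix/, group: "Bug Fixes" },
  { pattern: /^perf/, group: "Performance" },
  { pattern: /^docs/, group: "Documentation" },
  { pattern: /^refactor/, group: "Refactoring" },
  { pattern: /^(ci|chore|build|test|style)/, skip: true },
];

function changelogGroup(message: string): string | null {
  for (const rule of parsers) {
    if (rule.pattern.test(message)) {
      return rule.skip ? null : rule.group!;
    }
  }
  return null; // unmatched commits are dropped (filter_unconventional = true)
}
```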
### Loop prevention

The `[skip ci]` suffix on the CHANGELOG commit message is recognised by Gitea Actions
and causes the workflow to be skipped for that push. Without it, the CHANGELOG commit
would trigger `auto-tag.yml` again, incrementing the patch version forever.
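The guard amounts to a single check on the head commit message (Gitea Actions performs this natively; the sketch below only illustrates the rule):

```typescript
// A push whose commit message contains "[skip ci]" must not re-trigger
// the workflow; everything else runs normally.
function shouldTriggerWorkflow(commitMessage: string): boolean {
  return !commitMessage.includes("[skip ci]");
}
```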
### Bootstrap

The initial `CHANGELOG.md` was generated locally before the first PR:
```sh
git-cliff --config cliff.toml --output CHANGELOG.md
```
Subsequent runs are fully automated by CI.

---

## Known Issues & Fixes

### Debian Multiarch Breaks arm64 Cross-Compile (`held broken packages`)

@ -2,7 +2,7 @@

## Overview

TFTSR uses **SQLite** via `rusqlite` with the `bundled-sqlcipher` feature for AES-256 encryption in production. 11 versioned migrations are tracked in the `_migrations` table.
TFTSR uses **SQLite** via `rusqlite` with the `bundled-sqlcipher` feature for AES-256 encryption in production. 17 versioned migrations are tracked in the `_migrations` table.

**DB file location:** `{app_data_dir}/tftsr.db`

@ -38,7 +38,7 @@ pub fn init_db(data_dir: &Path) -> anyhow::Result<Connection> {

---

## Schema (11 Migrations)
## Schema (17 Migrations)

### 001 — issues

@ -211,6 +211,29 @@ CREATE TABLE integration_config (
);
```

### 012 — image_attachments (v0.2.7+)

```sql
CREATE TABLE image_attachments (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  file_name TEXT NOT NULL,
  file_path TEXT NOT NULL DEFAULT '',
  file_size INTEGER NOT NULL DEFAULT 0,
  mime_type TEXT NOT NULL DEFAULT 'image/png',
  upload_hash TEXT NOT NULL DEFAULT '',
  uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
  pii_warning_acknowledged INTEGER NOT NULL DEFAULT 1,
  is_paste INTEGER NOT NULL DEFAULT 0
);
```

**Features:**
- Image file metadata stored in database
- `upload_hash`: SHA-256 hash of file content (for deduplication)
- `pii_warning_acknowledged`: User confirmation that PII may be present
- `is_paste`: Flag for screenshots copied from clipboard

**Encryption:**
- OAuth2 tokens encrypted with AES-256-GCM
- Key derived from `TFTSR_DB_KEY` environment variable
@ -222,6 +245,51 @@ CREATE TABLE integration_config (
- Basic auth (ServiceNow): Store encrypted password
- One credential per service (enforced by UNIQUE constraint)

### 017 — timeline_events (Incident Response Timeline)

```sql
CREATE TABLE timeline_events (
  id TEXT PRIMARY KEY,
  issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
  event_type TEXT NOT NULL,
  description TEXT NOT NULL,
  metadata TEXT, -- JSON object with event-specific data
  created_at TEXT NOT NULL
);

CREATE INDEX idx_timeline_events_issue ON timeline_events(issue_id);
CREATE INDEX idx_timeline_events_time ON timeline_events(created_at);
```

**Event Types:**
- `triage_started` — Incident response begins, initial issue properties recorded
- `log_uploaded` — Log file uploaded and analyzed
- `why_level_advanced` — 5-Whys entry completed, progression to next level
- `root_cause_identified` — Root cause determined from analysis
- `rca_generated` — Root Cause Analysis document created
- `postmortem_generated` — Post-mortem document created
- `document_exported` — Document exported to file (MD or PDF)

**Metadata Structure (JSON):**
```json
{
  "triage_started": {"severity": "high", "category": "network"},
  "log_uploaded": {"file_name": "app.log", "file_size": 2048576},
  "why_level_advanced": {"from_level": 2, "to_level": 3, "question": "Why did the service timeout?"},
  "root_cause_identified": {"root_cause": "DNS resolution failure", "confidence": 0.95},
  "rca_generated": {"doc_id": "doc_abc123", "section_count": 7},
  "postmortem_generated": {"doc_id": "doc_def456", "timeline_events_count": 12},
  "document_exported": {"format": "pdf", "file_path": "/home/user/docs/rca.pdf"}
}
```

**Design Notes:**
- Timeline events are **queryable** (indexed by issue_id and created_at) for document generation
- Dual-write: Events recorded to both `timeline_events` and `audit_log` — timeline for chronological reporting, audit_log for security/compliance
- `created_at`: TEXT UTC timestamp (`YYYY-MM-DD HH:MM:SS`)
- Non-blocking writes: Timeline events recorded asynchronously at key triage moments
- Cascade delete from issues ensures cleanup

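Because `metadata` is a nullable TEXT column holding JSON, consumers have to parse it per event type. A small sketch (the helper name is hypothetical; the real consumers are the document generators):

```typescript
// Parse the metadata column (TEXT, JSON or NULL) into a plain object.
// Returns null for events without metadata.
function parseEventMetadata(metadata: string | null): Record<string, unknown> | null {
  if (metadata === null || metadata.trim() === "") return null;
  return JSON.parse(metadata) as Record<string, unknown>;
}
```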
---

## Key Design Notes
@ -266,4 +334,13 @@ pub struct AuditEntry {
    pub user_id: String,
    pub details: Option<String>,
}

pub struct TimelineEvent {
    pub id: String,
    pub issue_id: String,
    pub event_type: String,
    pub description: String,
    pub metadata: Option<String>, // JSON
    pub created_at: String,
}
```

@ -32,12 +32,14 @@
- **Ollama Management** — Hardware detection, model recommendations, in-app model management
- **Audit Trail** — Every external data send logged with SHA-256 hash
- **Domain-Specific Prompts** — 8 IT domains: Linux, Windows, Network, Kubernetes, Databases, Virtualization, Hardware, Observability
- **Image Attachments** — Upload and manage image files with PII detection and mandatory user approval

## Releases

| Version | Status | Highlights |
|---------|--------|-----------|
| v0.2.6 | 🚀 Latest | Custom REST AI gateway support, OAuth2 shell permissions, user ID tracking |
| v0.2.5 | Released | Image attachments with PII detection and approval workflow |
| v0.2.3 | Released | Confluence/ServiceNow/ADO REST API clients (19 TDD tests) |
| v0.1.1 | Released | Core application with PII detection, RCA generation |

@ -62,11 +62,27 @@ updateFiveWhyCmd(entryId: string, answer: string) → void
```
Sets or updates the answer for an existing 5-Whys entry.

### `get_timeline_events`
```typescript
getTimelineEventsCmd(issueId: string) → TimelineEvent[]
```
Retrieves all timeline events for an issue, ordered by created_at ascending.
```typescript
interface TimelineEvent {
  id: string;
  issue_id: string;
  event_type: string;  // One of: triage_started, log_uploaded, why_level_advanced, etc.
  description: string;
  metadata?: Record<string, any>;  // Event-specific JSON data
  created_at: string;  // UTC timestamp
}
```

### `add_timeline_event`
```typescript
addTimelineEventCmd(issueId: string, eventType: string, description: string) → TimelineEvent
addTimelineEventCmd(issueId: string, eventType: string, description: string, metadata?: Record<string, any>) → TimelineEvent
```
Records a timestamped event in the issue timeline.
Records a timestamped event in the issue timeline. Dual-writes to both `timeline_events` (for document generation) and `audit_log` (for security audit trail).
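A frontend call site for these commands might look like the following sketch. The `invoke` function is injected so the wrapper can be exercised outside Tauri (in the app it would come from the Tauri API package); the wrapper name and test double are assumptions.

```typescript
// Thin typed wrapper over an IPC invoke function for the timeline command.
type Invoke = (cmd: string, args: Record<string, unknown>) => Promise<unknown>;

interface TimelineEventDto {
  id: string;
  issue_id: string;
  event_type: string;
  description: string;
  metadata?: Record<string, unknown>;
  created_at: string;
}

async function getTimelineEvents(invoke: Invoke, issueId: string): Promise<TimelineEventDto[]> {
  // Command name and argument key follow the docs above.
  return (await invoke("get_timeline_events", { issueId })) as TimelineEventDto[];
}
```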
---

@ -99,6 +115,34 @@ Rewrites file content with approved redactions. Records SHA-256 in audit log. Re

---

## Image Attachment Commands

### `upload_image_attachment`
```typescript
uploadImageAttachmentCmd(issueId: string, filePath: string, piiWarningAcknowledged: boolean) → ImageAttachment
```
Uploads an image file. Computes SHA-256, stores metadata in DB. Returns `ImageAttachment` record.

### `list_image_attachments`
```typescript
listImageAttachmentsCmd(issueId: string) → ImageAttachment[]
```
Lists all image attachments for an issue.

### `delete_image_attachment`
```typescript
deleteImageAttachmentCmd(imageId: string) → void
```
Deletes an image attachment from disk and database.

### `upload_paste_image`
```typescript
uploadPasteImageCmd(issueId: string, base64Data: string, fileName: string, piiWarningAcknowledged: boolean) → ImageAttachment
```
Uploads an image from clipboard paste (base64). Returns `ImageAttachment` record.
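The upload-side hashing can be sketched as follows: the SHA-256 of the file bytes becomes the `upload_hash` used for deduplication. This uses Node's crypto module for illustration; the real implementation lives in the Rust backend, and the function names are hypothetical.

```typescript
import { createHash } from "node:crypto";

// SHA-256 of the raw file bytes, hex-encoded, as stored in upload_hash.
function uploadHash(fileBytes: Buffer): string {
  return createHash("sha256").update(fileBytes).digest("hex");
}

// Deduplication check: an upload is a duplicate if its hash is already known.
function isDuplicate(existingHashes: Set<string>, fileBytes: Buffer): boolean {
  return existingHashes.has(uploadHash(fileBytes));
}
```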
---

## AI Commands

### `analyze_logs`
@ -109,9 +153,9 @@ Sends selected (redacted) log files to the AI provider with an analysis prompt.

### `chat_message`
```typescript
chatMessageCmd(issueId: string, message: string, providerConfig: ProviderConfig) → ChatResponse
chatMessageCmd(issueId: string, message: string, providerConfig: ProviderConfig, systemPrompt?: string) → ChatResponse
```
Sends a message in the ongoing triage conversation. Domain system prompt is injected automatically on first message. AI response is parsed for why-level indicators (1–5).
Sends a message in the ongoing triage conversation. Optional `systemPrompt` parameter allows prepending domain expertise before conversation history. If not provided, the domain-specific system prompt for the issue category is injected automatically on first message. AI response is parsed for why-level indicators (1–5).
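The injection order described above (explicit `systemPrompt` overrides the domain default, and either way the system prompt precedes the conversation history) can be sketched as follows. The message shape is an assumption for illustration, not the backend's actual types.

```typescript
// Assemble the message list sent to the provider: system prompt first,
// then prior history, then the new user message.
interface ChatMsg {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildMessages(
  history: ChatMsg[],
  userMessage: string,
  domainDefault: string,
  systemPrompt?: string,
): ChatMsg[] {
  const system = systemPrompt ?? domainDefault; // explicit prompt wins
  return [
    { role: "system", content: system },
    ...history,
    { role: "user", content: userMessage },
  ];
}
```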
### `list_providers`
|
||||
```typescript
|
||||
@ -127,13 +171,13 @@ Returns the list of supported providers with their available models and configur
|
||||
```typescript
|
||||
generateRcaCmd(issueId: string) → Document
|
||||
```
|
||||
Builds an RCA Markdown document from the issue data, 5-Whys answers, and timeline.
|
||||
Builds an RCA Markdown document from the issue data, 5-Whys answers, and timeline events. Uses real incident response timeline (log uploads, why-level progression, root cause identification) instead of placeholders.
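As an illustration of the timeline-driven generation described above, a small hypothetical renderer over `timeline_events` rows. The field names mirror the columns used elsewhere in this diff; the Markdown formatting itself is not the app's actual generator:

```typescript
// Shape mirrors the timeline_events columns (event_type, description,
// created_at) read by the backend commands in this diff.
interface TimelineEvent {
  event_type: string;
  description: string;
  created_at: string;
}

// Fold timeline events into a Markdown bullet list, one line per event,
// in the chronological order the queries return them.
function renderTimeline(events: TimelineEvent[]): string {
  return events
    .map((e) => `- ${e.created_at} (${e.event_type}): ${e.description}`)
    .join("\n");
}
```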

### `generate_postmortem`

```typescript
generatePostmortemCmd(issueId: string) → Document
```

Builds a blameless post-mortem Markdown document.
Builds a blameless post-mortem Markdown document. Incorporates timeline events to show the full incident lifecycle: detection, diagnosis, resolution, and post-incident review phases.

### `update_document`

```typescript
142 eslint.config.js Normal file
@@ -0,0 +1,142 @@
import globals from "globals";
import pluginReact from "eslint-plugin-react";
import pluginReactHooks from "eslint-plugin-react-hooks";
import pluginTs from "@typescript-eslint/eslint-plugin";
import parserTs from "@typescript-eslint/parser";

export default [
  {
    files: ["src/**/*.{ts,tsx}"],
    languageOptions: {
      ecmaVersion: "latest",
      sourceType: "module",
      globals: {
        ...globals.browser,
        ...globals.node,
      },
      parser: parserTs,
      parserOptions: {
        ecmaFeatures: {
          jsx: true,
        },
        project: "./tsconfig.json",
      },
    },
    plugins: {
      react: pluginReact,
      "react-hooks": pluginReactHooks,
      "@typescript-eslint": pluginTs,
    },
    settings: {
      react: {
        version: "detect",
      },
    },
    rules: {
      ...pluginReact.configs.recommended.rules,
      ...pluginReactHooks.configs.recommended.rules,
      ...pluginTs.configs.recommended.rules,
      "no-unused-vars": "off",
      "@typescript-eslint/no-unused-vars": ["error", { argsIgnorePattern: "^_" }],
      "no-console": ["warn", { allow: ["warn", "error"] }],
      "react/react-in-jsx-scope": "off",
      "react/prop-types": "off",
      "react/no-unescaped-entities": "off",
    },
  },
  {
    files: ["tests/unit/**/*.test.{ts,tsx}"],
    languageOptions: {
      ecmaVersion: "latest",
      sourceType: "module",
      globals: {
        ...globals.browser,
        ...globals.node,
        ...globals.vitest,
      },
      parser: parserTs,
      parserOptions: {
        ecmaFeatures: {
          jsx: true,
        },
        project: "./tsconfig.json",
      },
    },
    plugins: {
      react: pluginReact,
      "react-hooks": pluginReactHooks,
      "@typescript-eslint": pluginTs,
    },
    settings: {
      react: {
        version: "detect",
      },
    },
    rules: {
      ...pluginReact.configs.recommended.rules,
      ...pluginReactHooks.configs.recommended.rules,
      ...pluginTs.configs.recommended.rules,
      "no-unused-vars": "off",
      "@typescript-eslint/no-unused-vars": ["error", { argsIgnorePattern: "^_" }],
      "no-console": ["warn", { allow: ["warn", "error"] }],
      "react/react-in-jsx-scope": "off",
      "react/prop-types": "off",
      "react/no-unescaped-entities": "off",
    },
  },
  {
    files: ["tests/e2e/**/*.ts", "tests/e2e/**/*.tsx"],
    languageOptions: {
      ecmaVersion: "latest",
      sourceType: "module",
      globals: {
        ...globals.node,
      },
      parser: parserTs,
      parserOptions: {
        ecmaFeatures: {
          jsx: false,
        },
      },
    },
    plugins: {
      "@typescript-eslint": pluginTs,
    },
    rules: {
      ...pluginTs.configs.recommended.rules,
      "no-unused-vars": "off",
      "@typescript-eslint/no-unused-vars": ["error", { argsIgnorePattern: "^_" }],
      "no-console": ["warn", { allow: ["warn", "error"] }],
    },
  },
  {
    files: ["cli/**/*.{ts,tsx}"],
    languageOptions: {
      ecmaVersion: "latest",
      sourceType: "module",
      globals: {
        ...globals.node,
      },
      parser: parserTs,
      parserOptions: {
        ecmaFeatures: {
          jsx: false,
        },
      },
    },
    plugins: {
      "@typescript-eslint": pluginTs,
    },
    rules: {
      ...pluginTs.configs.recommended.rules,
      "no-unused-vars": "off",
      "@typescript-eslint/no-unused-vars": ["error", { argsIgnorePattern: "^_" }],
      "no-console": ["warn", { allow: ["warn", "error"] }],
      "react/no-unescaped-entities": "off",
    },
  },
  {
    files: ["**/*.ts", "**/*.tsx"],
    ignores: ["dist/", "node_modules/", "src-tauri/", "target/", "coverage/", "tailwind.config.ts"],
  },
];
3181 package-lock.json generated
File diff suppressed because it is too large
@@ -1,11 +1,12 @@
{
  "name": "tftsr",
  "private": true,
  "version": "0.1.0",
  "version": "0.2.62",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "version:update": "node scripts/update-version.mjs",
    "preview": "vite preview",
    "tauri": "tauri",
    "test": "vitest",
@@ -37,11 +38,17 @@
    "@testing-library/user-event": "^14",
    "@types/react": "^18",
    "@types/react-dom": "^18",
    "@types/testing-library__react": "^10",
    "@typescript-eslint/eslint-plugin": "^8.58.1",
    "@typescript-eslint/parser": "^8.58.1",
    "@vitejs/plugin-react": "^4",
    "@vitest/coverage-v8": "^2",
    "@wdio/cli": "^9",
    "@wdio/mocha-framework": "^9",
    "autoprefixer": "^10",
    "eslint": "^9.39.4",
    "eslint-plugin-react": "^7.37.5",
    "eslint-plugin-react-hooks": "^7.0.1",
    "jsdom": "^26",
    "postcss": "^8",
    "typescript": "^5",
111 scripts/update-version.mjs Normal file
@@ -0,0 +1,111 @@
#!/usr/bin/env node

import { execSync } from 'child_process';
import { readFileSync, writeFileSync, existsSync, mkdirSync } from 'fs';
import { resolve, dirname } from 'path';
import { fileURLToPath } from 'url';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const projectRoot = resolve(__dirname, '..');

/**
 * Validate version is semver-compliant (X.Y.Z)
 */
function isValidSemver(version) {
  return /^[0-9]+\.[0-9]+\.[0-9]+$/.test(version);
}

function validateGitRepo(root) {
  if (!existsSync(resolve(root, '.git'))) {
    throw new Error(`Not a Git repository: ${root}`);
  }
}

function getVersionFromGit() {
  validateGitRepo(projectRoot);
  try {
    const output = execSync('git describe --tags --abbrev=0', {
      encoding: 'utf-8',
      cwd: projectRoot,
      shell: false
    });
    let version = output.trim();

    // Remove v prefix
    version = version.replace(/^v/, '');

    // Validate it's a valid semver
    if (!isValidSemver(version)) {
      const pkgJsonVersion = getFallbackVersion();
      console.warn(`Invalid version format "${version}" from git describe, using package.json fallback: ${pkgJsonVersion}`);
      return pkgJsonVersion;
    }

    return version;
  } catch (e) {
    const pkgJsonVersion = getFallbackVersion();
    console.warn(`Failed to get version from Git tags, using package.json fallback: ${pkgJsonVersion}`);
    return pkgJsonVersion;
  }
}

function getFallbackVersion() {
  const pkgPath = resolve(projectRoot, 'package.json');
  if (!existsSync(pkgPath)) {
    return '0.2.50';
  }
  try {
    const content = readFileSync(pkgPath, 'utf-8');
    const json = JSON.parse(content);
    return json.version || '0.2.50';
  } catch {
    return '0.2.50';
  }
}

function updatePackageJson(version) {
  const fullPath = resolve(projectRoot, 'package.json');
  if (!existsSync(fullPath)) {
    throw new Error(`File not found: ${fullPath}`);
  }

  const content = readFileSync(fullPath, 'utf-8');
  const json = JSON.parse(content);
  json.version = version;

  // Write with 2-space indentation
  writeFileSync(fullPath, JSON.stringify(json, null, 2) + '\n', 'utf-8');
  console.log(`✓ Updated package.json to ${version}`);
}

function updateTOML(path, version) {
  const fullPath = resolve(projectRoot, path);
  if (!existsSync(fullPath)) {
    throw new Error(`File not found: ${fullPath}`);
  }

  const content = readFileSync(fullPath, 'utf-8');
  const lines = content.split('\n');
  const output = [];

  for (const line of lines) {
    if (line.match(/^\s*version\s*=\s*"/)) {
      output.push(`version = "${version}"`);
    } else {
      output.push(line);
    }
  }

  writeFileSync(fullPath, output.join('\n') + '\n', 'utf-8');
  console.log(`✓ Updated ${path} to ${version}`);
}

const version = getVersionFromGit();
console.log(`Setting version to: ${version}`);

updatePackageJson(version);
updateTOML('src-tauri/Cargo.toml', version);
updateTOML('src-tauri/tauri.conf.json', version);

console.log(`✓ All version fields updated to ${version}`);
100 src-tauri/Cargo.lock generated
@@ -263,12 +263,6 @@ dependencies = [
 "constant_time_eq 0.4.2",
]

[[package]]
name = "block"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d8c1fef690941d3e7788d328517591fecc684c084084702d6ff1641e993699a"

[[package]]
name = "block-buffer"
version = "0.10.4"
@@ -526,36 +520,6 @@ dependencies = [
 "zeroize",
]

[[package]]
name = "cocoa"
version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6140449f97a6e97f9511815c5632d84c8aacf8ac271ad77c559218161a1373c"
dependencies = [
 "bitflags 1.3.2",
 "block",
 "cocoa-foundation",
 "core-foundation 0.9.4",
 "core-graphics 0.23.2",
 "foreign-types 0.5.0",
 "libc",
 "objc",
]

[[package]]
name = "cocoa-foundation"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c6234cbb2e4c785b456c0644748b1ac416dd045799740356f8363dfe00c93f7"
dependencies = [
 "bitflags 1.3.2",
 "block",
 "core-foundation 0.9.4",
 "core-graphics-types 0.1.3",
 "libc",
 "objc",
]

[[package]]
name = "color_quant"
version = "1.1.0"
@@ -684,19 +648,6 @@ version = "0.8.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "773648b94d0e5d620f64f280777445740e61fe701025087ec8b57f45c791888b"

[[package]]
name = "core-graphics"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c07782be35f9e1140080c6b96f0d44b739e2278479f64e02fdab4e32dfd8b081"
dependencies = [
 "bitflags 1.3.2",
 "core-foundation 0.9.4",
 "core-graphics-types 0.1.3",
 "foreign-types 0.5.0",
 "libc",
]

[[package]]
name = "core-graphics"
version = "0.25.0"
@@ -705,22 +656,11 @@ checksum = "064badf302c3194842cf2c5d61f56cc88e54a759313879cdf03abdd27d0c3b97"
dependencies = [
 "bitflags 2.11.0",
 "core-foundation 0.10.1",
 "core-graphics-types 0.2.0",
 "core-graphics-types",
 "foreign-types 0.5.0",
 "libc",
]

[[package]]
name = "core-graphics-types"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "45390e6114f68f718cc7a830514a96f903cccd70d02a8f6d9f643ac4ba45afaf"
dependencies = [
 "bitflags 1.3.2",
 "core-foundation 0.9.4",
 "libc",
]

[[package]]
name = "core-graphics-types"
version = "0.2.0"
@@ -2476,6 +2416,15 @@ dependencies = [
 "serde_core",
]

[[package]]
name = "infer"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cb33622da908807a06f9513c19b3c1ad50fab3e4137d82a78107d502075aa199"
dependencies = [
 "cfb",
]

[[package]]
name = "infer"
version = "0.19.0"
@@ -2892,15 +2841,6 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c41e0c4fef86961ac6d6f8a82609f55f31b05e4fce149ac5710e439df7619ba4"

[[package]]
name = "malloc_buf"
version = "0.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62bb907fe88d54d8d9ce32a3cceab4218ed2f6b7d35617cafe9adf84e43919cb"
dependencies = [
 "libc",
]

[[package]]
name = "markup5ever"
version = "0.14.1"
@@ -3216,15 +3156,6 @@ dependencies = [
 "syn 2.0.117",
]

[[package]]
name = "objc"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "915b1b472bc21c53464d6c8461c9d3af805ba1ef837e1cac254428f4a77177b1"
dependencies = [
 "malloc_buf",
]

[[package]]
name = "objc2"
version = "0.6.4"
@@ -4311,6 +4242,7 @@ dependencies = [
 "js-sys",
 "log",
 "mime",
 "mime_guess",
 "native-tls",
 "percent-encoding",
 "pin-project-lite",
@@ -5330,7 +5262,7 @@ dependencies = [
 "bitflags 2.11.0",
 "block2",
 "core-foundation 0.10.1",
 "core-graphics 0.25.0",
 "core-graphics",
 "crossbeam-channel",
 "dispatch2",
 "dlopen2",
@@ -5689,7 +5621,7 @@ dependencies = [
 "glob",
 "html5ever 0.29.1",
 "http 1.4.0",
 "infer",
 "infer 0.19.0",
 "json-patch",
 "kuchikiki",
 "log",
@@ -6207,7 +6139,7 @@ dependencies = [

[[package]]
name = "trcaa"
version = "0.1.0"
version = "0.2.62"
dependencies = [
 "aes-gcm",
 "aho-corasick",
@@ -6215,14 +6147,13 @@ dependencies = [
 "async-trait",
 "base64 0.22.1",
 "chrono",
 "cocoa",
 "dirs 5.0.1",
 "docx-rs",
 "futures",
 "hex",
 "infer 0.15.0",
 "lazy_static",
 "mockito",
 "objc",
 "printpdf",
 "rand 0.8.5",
 "regex",
@@ -6243,6 +6174,7 @@ dependencies = [
 "tokio-test",
 "tracing",
 "tracing-subscriber",
 "url",
 "urlencoding",
 "uuid",
 "warp",
@@ -1,6 +1,6 @@
[package]
name = "trcaa"
version = "0.1.0"
version = "0.2.62"
edition = "2021"

[lib]
@@ -21,7 +21,7 @@ rusqlite = { version = "0.31", features = ["bundled-sqlcipher-vendored-openssl"]
serde = { version = "1", features = ["derive"] }
serde_json = "1"
tokio = { version = "1", features = ["full"] }
reqwest = { version = "0.12", features = ["json", "stream"] }
reqwest = { version = "0.12", features = ["json", "stream", "multipart"] }
regex = "1"
aho-corasick = "1"
uuid = { version = "1", features = ["v7"] }
@@ -43,11 +43,8 @@ rand = "0.8"
lazy_static = "1.4"
warp = "0.3"
urlencoding = "2"

# Platform-specific dependencies for native cookie extraction
[target.'cfg(target_os = "macos")'.dependencies]
cocoa = "0.25"
objc = "0.2"
infer = "0.15"
url = "2.5.8"

[dev-dependencies]
tokio-test = "0.4"
@@ -56,3 +53,7 @@ mockito = "1.2"
[profile.release]
opt-level = "s"
strip = true
@@ -1,3 +1,30 @@
fn main() {
    let version = get_version_from_git();

    println!("cargo:rustc-env=APP_VERSION={version}");
    println!("cargo:rerun-if-changed=.git/refs/heads/master");
    println!("cargo:rerun-if-changed=.git/refs/tags");

    tauri_build::build()
}

fn get_version_from_git() -> String {
    if let Ok(output) = std::process::Command::new("git")
        .arg("describe")
        .arg("--tags")
        .arg("--abbrev=0")
        .output()
    {
        if output.status.success() {
            let version = String::from_utf8_lossy(&output.stdout)
                .trim()
                .trim_start_matches('v')
                .to_string();
            if !version.is_empty() {
                return version;
            }
        }
    }

    "0.2.50".to_string()
}
@@ -1 +1 @@
{"default":{"identifier":"default","description":"Default capabilities for TFTSR — least-privilege","local":true,"windows":["main"],"permissions":["core:path:default","core:event:default","core:window:default","core:app:default","core:resources:default","core:menu:default","core:tray:default","dialog:allow-open","dialog:allow-save","fs:allow-read-text-file","fs:allow-write-text-file","fs:allow-read","fs:allow-write","fs:allow-mkdir","fs:allow-app-read-recursive","fs:allow-app-write-recursive","fs:allow-temp-read-recursive","fs:allow-temp-write-recursive","fs:scope-app-recursive","fs:scope-temp-recursive","shell:allow-open","http:default"]}}
{"default":{"identifier":"default","description":"Default capabilities for TFTSR — least-privilege","local":true,"windows":["main"],"permissions":["core:path:default","core:event:default","core:window:default","core:app:default","core:resources:default","core:menu:default","core:tray:default","dialog:allow-open","dialog:allow-save","fs:allow-read-text-file","fs:allow-write-text-file","fs:allow-mkdir","fs:allow-app-read-recursive","fs:allow-app-write-recursive","fs:allow-temp-read-recursive","fs:allow-temp-write-recursive","fs:scope-app-recursive","fs:scope-temp-recursive","shell:allow-open","http:default"]}}
@@ -82,8 +82,17 @@ impl OpenAiProvider {
        let api_url = config.api_url.trim_end_matches('/');
        let url = format!("{api_url}{endpoint_path}");

        tracing::debug!(
            url = %url,
            model = %config.model,
            max_tokens = ?config.max_tokens,
            temperature = ?config.temperature,
            "OpenAI API request"
        );

        let model = config.model.trim_end_matches('.');
        let mut body = serde_json::json!({
            "model": config.model,
            "model": model,
            "messages": messages,
        });

@@ -128,11 +137,23 @@ impl OpenAiProvider {
            .header("Content-Type", "application/json")
            .json(&body)
            .send()
            .await?;
            .await;

        let resp = match resp {
            Ok(response) => response,
            Err(e) => {
                tracing::error!(url = %url, error = %e, "OpenAI API request failed");
                anyhow::bail!("OpenAI API request failed: {e}");
            }
        };

        if !resp.status().is_success() {
            let status = resp.status();
            let text = resp.text().await?;
            let text = resp
                .text()
                .await
                .unwrap_or_else(|_| "unable to read response body".to_string());
            tracing::error!(url = %url, status = %status, response = %text, "OpenAI API error response");
            anyhow::bail!("OpenAI API error {status}: {text}");
        }
@@ -165,6 +165,7 @@ pub async fn chat_message(
    issue_id: String,
    message: String,
    provider_config: ProviderConfig,
    system_prompt: Option<String>,
    app_handle: tauri::AppHandle,
    state: State<'_, AppState>,
) -> Result<ChatResponse, String> {
@@ -232,7 +233,21 @@ pub async fn chat_message(
    // Search integration sources for relevant context
    let integration_context = search_integration_sources(&message, &app_handle, &state).await;

    let mut messages = history;
    let mut messages = Vec::new();

    // Inject domain system prompt if provided
    if let Some(ref prompt) = system_prompt {
        if !prompt.is_empty() {
            messages.push(Message {
                role: "system".into(),
                content: prompt.clone(),
                tool_call_id: None,
                tool_calls: None,
            });
        }
    }

    messages.extend(history);

    // If we found integration content, add it to the conversation context
    if !integration_context.is_empty() {
@@ -97,6 +97,77 @@ pub async fn upload_log_file(
    Ok(log_file)
}

#[tauri::command]
pub async fn upload_log_file_by_content(
    issue_id: String,
    file_name: String,
    content: String,
    state: State<'_, AppState>,
) -> Result<LogFile, String> {
    let content_bytes = content.as_bytes();
    let content_hash = format!("{:x}", Sha256::digest(content_bytes));
    let file_size = content_bytes.len() as i64;

    // Determine mime type based on file extension
    let mime_type = if file_name.ends_with(".json") {
        "application/json"
    } else if file_name.ends_with(".xml") {
        "application/xml"
    } else {
        "text/plain"
    };

    // Use the file_name as the file_path for DB storage
    let log_file = LogFile::new(
        issue_id.clone(),
        file_name.clone(),
        file_name.clone(),
        file_size,
    );
    let log_file = LogFile {
        content_hash: content_hash.clone(),
        mime_type: mime_type.to_string(),
        ..log_file
    };

    let db = state.db.lock().map_err(|e| e.to_string())?;
    db.execute(
        "INSERT INTO log_files (id, issue_id, file_name, file_path, file_size, mime_type, content_hash, uploaded_at, redacted) \
         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9)",
        rusqlite::params![
            log_file.id,
            log_file.issue_id,
            log_file.file_name,
            log_file.file_path,
            log_file.file_size,
            log_file.mime_type,
            log_file.content_hash,
            log_file.uploaded_at,
            log_file.redacted as i32,
        ],
    )
    .map_err(|_| "Failed to store uploaded log metadata".to_string())?;

    // Audit
    let entry = AuditEntry::new(
        "upload_log_file".to_string(),
        "log_file".to_string(),
        log_file.id.clone(),
        serde_json::json!({ "issue_id": issue_id, "file_name": log_file.file_name }).to_string(),
    );
    if let Err(err) = crate::audit::log::write_audit_event(
        &db,
        &entry.action,
        &entry.entity_type,
        &entry.entity_id,
        &entry.details,
    ) {
        warn!(error = %err, "failed to write upload_log_file audit entry");
    }

    Ok(log_file)
}

#[tauri::command]
pub async fn detect_pii(
    log_file_id: String,
@@ -1,8 +1,8 @@
use tauri::State;

use crate::db::models::{
    AiConversation, AiMessage, Issue, IssueDetail, IssueFilter, IssueSummary, IssueUpdate, LogFile,
    ResolutionStep,
    AiConversation, AiMessage, ImageAttachment, Issue, IssueDetail, IssueFilter, IssueSummary,
    IssueUpdate, LogFile, ResolutionStep, TimelineEvent,
};
use crate::state::AppState;
@@ -100,6 +100,32 @@ pub async fn get_issue(
        .filter_map(|r| r.ok())
        .collect();

    // Load image attachments
    let mut img_stmt = db
        .prepare(
            "SELECT id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste \
             FROM image_attachments WHERE issue_id = ?1 ORDER BY uploaded_at ASC",
        )
        .map_err(|e| e.to_string())?;
    let image_attachments: Vec<ImageAttachment> = img_stmt
        .query_map([&issue_id], |row| {
            Ok(ImageAttachment {
                id: row.get(0)?,
                issue_id: row.get(1)?,
                file_name: row.get(2)?,
                file_path: row.get(3)?,
                file_size: row.get(4)?,
                mime_type: row.get(5)?,
                upload_hash: row.get(6)?,
                uploaded_at: row.get(7)?,
                pii_warning_acknowledged: row.get::<_, i32>(8)? != 0,
                is_paste: row.get::<_, i32>(9)? != 0,
            })
        })
        .map_err(|e| e.to_string())?
        .filter_map(|r| r.ok())
        .collect();

    // Load resolution steps (5-whys)
    let mut rs_stmt = db
        .prepare(
@@ -145,11 +171,35 @@ pub async fn get_issue(
        .filter_map(|r| r.ok())
        .collect();

    // Load timeline events
    let mut te_stmt = db
        .prepare(
            "SELECT id, issue_id, event_type, description, metadata, created_at \
             FROM timeline_events WHERE issue_id = ?1 ORDER BY created_at ASC",
        )
        .map_err(|e| e.to_string())?;
    let timeline_events: Vec<TimelineEvent> = te_stmt
        .query_map([&issue_id], |row| {
            Ok(TimelineEvent {
                id: row.get(0)?,
                issue_id: row.get(1)?,
                event_type: row.get(2)?,
                description: row.get(3)?,
                metadata: row.get(4)?,
                created_at: row.get(5)?,
            })
        })
        .map_err(|e| e.to_string())?
        .filter_map(|r| r.ok())
        .collect();

    Ok(IssueDetail {
        issue,
        log_files,
        image_attachments,
        resolution_steps,
        conversations,
        timeline_events,
    })
}
@@ -265,11 +315,21 @@ pub async fn delete_issue(issue_id: String, state: State<'_, AppState>) -> Resul
        .map_err(|e| e.to_string())?;
    db.execute("DELETE FROM log_files WHERE issue_id = ?1", [&issue_id])
        .map_err(|e| e.to_string())?;
    db.execute(
        "DELETE FROM image_attachments WHERE issue_id = ?1",
        [&issue_id],
    )
    .map_err(|e| e.to_string())?;
    db.execute(
        "DELETE FROM resolution_steps WHERE issue_id = ?1",
        [&issue_id],
    )
    .map_err(|e| e.to_string())?;
    db.execute(
        "DELETE FROM timeline_events WHERE issue_id = ?1",
        [&issue_id],
    )
    .map_err(|e| e.to_string())?;
    db.execute("DELETE FROM issues WHERE id = ?1", [&issue_id])
        .map_err(|e| e.to_string())?;
@@ -473,37 +533,105 @@ pub async fn update_five_why(
    Ok(())
}

const VALID_EVENT_TYPES: &[&str] = &[
    "triage_started",
    "log_uploaded",
    "why_level_advanced",
    "root_cause_identified",
    "rca_generated",
    "postmortem_generated",
    "document_exported",
];

#[tauri::command]
pub async fn add_timeline_event(
    issue_id: String,
    event_type: String,
    description: String,
    metadata: Option<String>,
    state: State<'_, AppState>,
) -> Result<(), String> {
    // Use audit_log for timeline tracking
    let db = state.db.lock().map_err(|e| e.to_string())?;
    let entry = crate::db::models::AuditEntry::new(
        event_type,
        "issue".to_string(),
) -> Result<TimelineEvent, String> {
    if !VALID_EVENT_TYPES.contains(&event_type.as_str()) {
        return Err(format!("Invalid event_type: {event_type}"));
    }

    let meta = metadata.unwrap_or_else(|| "{}".to_string());
    if meta.len() > 10240 {
        return Err("metadata exceeds maximum size of 10KB".to_string());
    }
    serde_json::from_str::<serde_json::Value>(&meta)
        .map_err(|_| "metadata must be valid JSON".to_string())?;

    let event = TimelineEvent::new(
        issue_id.clone(),
        serde_json::json!({ "description": description }).to_string(),
        event_type.clone(),
        description.clone(),
        meta,
    );

    let mut db = state.db.lock().map_err(|e| e.to_string())?;
    let tx = db.transaction().map_err(|e| e.to_string())?;

    tx.execute(
        "INSERT INTO timeline_events (id, issue_id, event_type, description, metadata, created_at) \
         VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
        rusqlite::params![
            event.id,
            event.issue_id,
            event.event_type,
            event.description,
            event.metadata,
            event.created_at,
        ],
    )
    .map_err(|e| e.to_string())?;

    crate::audit::log::write_audit_event(
        &db,
        &entry.action,
        &entry.entity_type,
        &entry.entity_id,
        &entry.details,
        &tx,
        &event_type,
        "issue",
        &issue_id,
        &serde_json::json!({ "description": description, "metadata": event.metadata }).to_string(),
    )
    .map_err(|_| "Failed to write security audit entry".to_string())?;

    // Update issue timestamp
    let now = chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string();
    db.execute(
    tx.execute(
        "UPDATE issues SET updated_at = ?1 WHERE id = ?2",
        rusqlite::params![now, issue_id],
    )
    .map_err(|e| e.to_string())?;

    Ok(())
    tx.commit().map_err(|e| e.to_string())?;

    Ok(event)
}

#[tauri::command]
pub async fn get_timeline_events(
    issue_id: String,
    state: State<'_, AppState>,
) -> Result<Vec<TimelineEvent>, String> {
    let db = state.db.lock().map_err(|e| e.to_string())?;
    let mut stmt = db
        .prepare(
            "SELECT id, issue_id, event_type, description, metadata, created_at \
             FROM timeline_events WHERE issue_id = ?1 ORDER BY created_at ASC",
        )
        .map_err(|e| e.to_string())?;
    let events = stmt
        .query_map([&issue_id], |row| {
            Ok(TimelineEvent {
                id: row.get(0)?,
                issue_id: row.get(1)?,
                event_type: row.get(2)?,
                description: row.get(3)?,
                metadata: row.get(4)?,
                created_at: row.get(5)?,
            })
        })
        .map_err(|e| e.to_string())?
        .filter_map(|r| r.ok())
        .collect();
    Ok(events)
}
608 src-tauri/src/commands/image.rs Normal file
@@ -0,0 +1,608 @@
|
||||
use base64::Engine;
use sha2::Digest;
use std::path::Path;
use tauri::State;

use crate::audit::log::write_audit_event;
use crate::db::models::{AuditEntry, ImageAttachment};
use crate::state::AppState;

const MAX_IMAGE_FILE_BYTES: u64 = 10 * 1024 * 1024;
const SUPPORTED_IMAGE_MIME_TYPES: [&str; 6] = [
    "image/png",
    "image/jpeg",
    "image/gif",
    "image/webp",
    "image/svg+xml",
    "image/bmp",
];

fn validate_image_file_path(file_path: &str) -> Result<std::path::PathBuf, String> {
    let path = Path::new(file_path);
    let canonical = std::fs::canonicalize(path).map_err(|_| "Unable to access selected file")?;
    let metadata = std::fs::metadata(&canonical).map_err(|_| "Unable to read file metadata")?;

    if !metadata.is_file() {
        return Err("Selected path is not a file".to_string());
    }

    if metadata.len() > MAX_IMAGE_FILE_BYTES {
        return Err(format!(
            "Image file exceeds maximum supported size ({} MB)",
            MAX_IMAGE_FILE_BYTES / 1024 / 1024
        ));
    }

    Ok(canonical)
}

fn is_supported_image_format(mime_type: &str) -> bool {
    SUPPORTED_IMAGE_MIME_TYPES.contains(&mime_type)
}
#[tauri::command]
pub async fn upload_image_attachment(
    issue_id: String,
    file_path: String,
    state: State<'_, AppState>,
) -> Result<ImageAttachment, String> {
    let canonical_path = validate_image_file_path(&file_path)?;
    let content =
        std::fs::read(&canonical_path).map_err(|_| "Failed to read selected image file")?;
    let content_hash = format!("{:x}", sha2::Sha256::digest(&content));
    let file_name = canonical_path
        .file_name()
        .and_then(|n| n.to_str())
        .unwrap_or("unknown")
        .to_string();
    let file_size = content.len() as i64;
    let mime_type: String = infer::get(&content)
        .map(|m| m.mime_type().to_string())
        .unwrap_or_else(|| "image/png".to_string());

    if !is_supported_image_format(mime_type.as_str()) {
        return Err(format!(
            "Unsupported image format: {}. Supported formats: {}",
            mime_type,
            SUPPORTED_IMAGE_MIME_TYPES.join(", ")
        ));
    }

    let canonical_file_path = canonical_path.to_string_lossy().to_string();
    let attachment = ImageAttachment::new(
        issue_id.clone(),
        file_name,
        canonical_file_path,
        file_size,
        mime_type,
        content_hash.clone(),
        true,
        false,
    );

    let db = state.db.lock().map_err(|e| e.to_string())?;
    db.execute(
        "INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste) \
         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
        rusqlite::params![
            attachment.id,
            attachment.issue_id,
            attachment.file_name,
            attachment.file_path,
            attachment.file_size,
            attachment.mime_type,
            attachment.upload_hash,
            attachment.uploaded_at,
            attachment.pii_warning_acknowledged as i32,
            attachment.is_paste as i32,
        ],
    )
    .map_err(|_| "Failed to store uploaded image metadata".to_string())?;

    let entry = AuditEntry::new(
        "upload_image_attachment".to_string(),
        "image_attachment".to_string(),
        attachment.id.clone(),
        serde_json::json!({
            "issue_id": issue_id,
            "file_name": attachment.file_name,
            "is_paste": false,
        })
        .to_string(),
    );
    if let Err(err) = write_audit_event(
        &db,
        &entry.action,
        &entry.entity_type,
        &entry.entity_id,
        &entry.details,
    ) {
        tracing::warn!(error = %err, "failed to write upload_image_attachment audit entry");
    }

    Ok(attachment)
}
#[tauri::command]
pub async fn upload_image_attachment_by_content(
    issue_id: String,
    file_name: String,
    base64_content: String,
    state: State<'_, AppState>,
) -> Result<ImageAttachment, String> {
    let data_part = base64_content
        .split(',')
        .nth(1)
        .ok_or("Invalid image data format - missing base64 content")?;

    let decoded = base64::engine::general_purpose::STANDARD
        .decode(data_part)
        .map_err(|_| "Failed to decode base64 image data")?;

    let content_hash = format!("{:x}", sha2::Sha256::digest(&decoded));
    let file_size = decoded.len() as i64;

    let mime_type: String = infer::get(&decoded)
        .map(|m| m.mime_type().to_string())
        .unwrap_or_else(|| "image/png".to_string());

    if !is_supported_image_format(mime_type.as_str()) {
        return Err(format!(
            "Unsupported image format: {}. Supported formats: {}",
            mime_type,
            SUPPORTED_IMAGE_MIME_TYPES.join(", ")
        ));
    }

    // Use the file_name as file_path for DB storage
    let attachment = ImageAttachment::new(
        issue_id.clone(),
        file_name.clone(),
        file_name,
        file_size,
        mime_type,
        content_hash.clone(),
        true,
        false,
    );

    let db = state.db.lock().map_err(|e| e.to_string())?;
    db.execute(
        "INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste) \
         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
        rusqlite::params![
            attachment.id,
            attachment.issue_id,
            attachment.file_name,
            attachment.file_path,
            attachment.file_size,
            attachment.mime_type,
            attachment.upload_hash,
            attachment.uploaded_at,
            attachment.pii_warning_acknowledged as i32,
            attachment.is_paste as i32,
        ],
    )
    .map_err(|_| "Failed to store uploaded image metadata".to_string())?;

    let entry = AuditEntry::new(
        "upload_image_attachment".to_string(),
        "image_attachment".to_string(),
        attachment.id.clone(),
        serde_json::json!({
            "issue_id": issue_id,
            "file_name": attachment.file_name,
            "is_paste": false,
        })
        .to_string(),
    );
    if let Err(err) = write_audit_event(
        &db,
        &entry.action,
        &entry.entity_type,
        &entry.entity_id,
        &entry.details,
    ) {
        tracing::warn!(error = %err, "failed to write upload_image_attachment audit entry");
    }

    Ok(attachment)
}
#[tauri::command]
pub async fn upload_paste_image(
    issue_id: String,
    base64_image: String,
    mime_type: String,
    state: State<'_, AppState>,
) -> Result<ImageAttachment, String> {
    if !base64_image.starts_with("data:image/") {
        return Err("Invalid image data - must be a data URL".to_string());
    }

    let data_part = base64_image
        .split(',')
        .nth(1)
        .ok_or("Invalid image data format - missing base64 content")?;

    let decoded = base64::engine::general_purpose::STANDARD
        .decode(data_part)
        .map_err(|_| "Failed to decode base64 image data")?;

    let content_hash = format!("{:x}", sha2::Sha256::digest(&decoded));
    let file_size = decoded.len() as i64;
    let file_name = format!("pasted-image-{}.png", uuid::Uuid::now_v7());

    if !is_supported_image_format(mime_type.as_str()) {
        return Err(format!(
            "Unsupported image format: {}. Supported formats: {}",
            mime_type,
            SUPPORTED_IMAGE_MIME_TYPES.join(", ")
        ));
    }

    let attachment = ImageAttachment::new(
        issue_id.clone(),
        file_name.clone(),
        String::new(),
        file_size,
        mime_type,
        content_hash,
        true,
        true,
    );

    let db = state.db.lock().map_err(|e| e.to_string())?;
    db.execute(
        "INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste) \
         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
        rusqlite::params![
            attachment.id,
            attachment.issue_id,
            attachment.file_name,
            attachment.file_path,
            attachment.file_size,
            attachment.mime_type,
            attachment.upload_hash,
            attachment.uploaded_at,
            attachment.pii_warning_acknowledged as i32,
            attachment.is_paste as i32,
        ],
    )
    .map_err(|_| "Failed to store pasted image metadata".to_string())?;

    let entry = AuditEntry::new(
        "upload_paste_image".to_string(),
        "image_attachment".to_string(),
        attachment.id.clone(),
        serde_json::json!({
            "issue_id": issue_id,
            "file_name": attachment.file_name,
            "is_paste": true,
        })
        .to_string(),
    );
    if let Err(err) = write_audit_event(
        &db,
        &entry.action,
        &entry.entity_type,
        &entry.entity_id,
        &entry.details,
    ) {
        tracing::warn!(error = %err, "failed to write upload_paste_image audit entry");
    }

    Ok(attachment)
}
#[tauri::command]
pub async fn list_image_attachments(
    issue_id: String,
    state: State<'_, AppState>,
) -> Result<Vec<ImageAttachment>, String> {
    let db = state.db.lock().map_err(|e| e.to_string())?;

    let mut stmt = db
        .prepare(
            "SELECT id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste \
             FROM image_attachments WHERE issue_id = ?1 ORDER BY uploaded_at ASC",
        )
        .map_err(|e| e.to_string())?;

    let attachments = stmt
        .query_map([&issue_id], |row| {
            Ok(ImageAttachment {
                id: row.get(0)?,
                issue_id: row.get(1)?,
                file_name: row.get(2)?,
                file_path: row.get(3)?,
                file_size: row.get(4)?,
                mime_type: row.get(5)?,
                upload_hash: row.get(6)?,
                uploaded_at: row.get(7)?,
                pii_warning_acknowledged: row.get::<_, i32>(8)? != 0,
                is_paste: row.get::<_, i32>(9)? != 0,
            })
        })
        .map_err(|e| e.to_string())?
        .filter_map(|r| r.ok())
        .collect();

    Ok(attachments)
}
#[tauri::command]
pub async fn delete_image_attachment(
    attachment_id: String,
    state: State<'_, AppState>,
) -> Result<(), String> {
    let db = state.db.lock().map_err(|e| e.to_string())?;

    let affected = db
        .execute(
            "DELETE FROM image_attachments WHERE id = ?1",
            [&attachment_id],
        )
        .map_err(|e| e.to_string())?;

    if affected == 0 {
        return Err("Image attachment not found".to_string());
    }

    Ok(())
}
#[tauri::command]
pub async fn upload_file_to_datastore(
    provider_config: serde_json::Value,
    file_path: String,
    _state: State<'_, AppState>,
) -> Result<String, String> {
    use reqwest::multipart::Form;

    let canonical_path = validate_image_file_path(&file_path)?;
    let content =
        std::fs::read(&canonical_path).map_err(|_| "Failed to read file for datastore upload")?;

    let file_name = canonical_path
        .file_name()
        .and_then(|n| n.to_str())
        .unwrap_or("unknown")
        .to_string();

    let _file_size = content.len() as i64;

    // Extract API URL and auth header from provider config
    let api_url = provider_config
        .get("api_url")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing api_url")?
        .to_string();

    // Extract use_datastore_upload flag
    let use_datastore = provider_config
        .get("use_datastore_upload")
        .and_then(|v| v.as_bool())
        .unwrap_or(false);

    if !use_datastore {
        return Err("use_datastore_upload is not enabled for this provider".to_string());
    }

    // Get datastore ID from custom_endpoint_path (stored as datastore ID)
    let datastore_id = provider_config
        .get("custom_endpoint_path")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing datastore ID in custom_endpoint_path")?
        .to_string();

    // Build upload endpoint: POST /api/v2/upload/<DATASTORE-ID>
    let api_url = api_url.trim_end_matches('/');
    let upload_url = format!("{api_url}/upload/{datastore_id}");

    // Read auth header and value
    let auth_header = provider_config
        .get("custom_auth_header")
        .and_then(|v| v.as_str())
        .unwrap_or("x-generic-api-key");

    let auth_prefix = provider_config
        .get("custom_auth_prefix")
        .and_then(|v| v.as_str())
        .unwrap_or("");

    let api_key = provider_config
        .get("api_key")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing api_key")?;

    let auth_value = format!("{auth_prefix}{api_key}");

    let client = reqwest::Client::new();

    // Create multipart form
    let part = reqwest::multipart::Part::bytes(content)
        .file_name(file_name)
        .mime_str("application/octet-stream")
        .map_err(|e| format!("Failed to create multipart part: {e}"))?;

    let form = Form::new().part("file", part);

    let resp = client
        .post(&upload_url)
        .header(auth_header, auth_value)
        .multipart(form)
        .send()
        .await
        .map_err(|e| format!("Upload request failed: {e}"))?;

    if !resp.status().is_success() {
        let status = resp.status();
        let text = resp
            .text()
            .await
            .unwrap_or_else(|_| "unable to read response".to_string());
        return Err(format!("Datastore upload error {status}: {text}"));
    }

    // Parse response to get file ID
    let json = resp
        .json::<serde_json::Value>()
        .await
        .map_err(|e| format!("Failed to parse upload response: {e}"))?;

    // Response should have file_id or id field
    let file_id = json
        .get("file_id")
        .or_else(|| json.get("id"))
        .and_then(|v| v.as_str())
        .ok_or_else(|| {
            format!(
                "Response missing file_id: {}",
                serde_json::to_string_pretty(&json).unwrap_or_default()
            )
        })?
        .to_string();

    Ok(file_id)
}
/// Upload any file (not just images) to GenAI datastore
#[tauri::command]
pub async fn upload_file_to_datastore_any(
    provider_config: serde_json::Value,
    file_path: String,
    _state: State<'_, AppState>,
) -> Result<String, String> {
    use reqwest::multipart::Form;

    // Validate file exists and is accessible
    let path = Path::new(&file_path);
    let canonical = std::fs::canonicalize(path).map_err(|_| "Unable to access selected file")?;
    let metadata = std::fs::metadata(&canonical).map_err(|_| "Unable to read file metadata")?;

    if !metadata.is_file() {
        return Err("Selected path is not a file".to_string());
    }

    let content =
        std::fs::read(&canonical).map_err(|_| "Failed to read file for datastore upload")?;

    let file_name = canonical
        .file_name()
        .and_then(|n| n.to_str())
        .unwrap_or("unknown")
        .to_string();

    let _file_size = content.len() as i64;

    // Extract API URL and auth header from provider config
    let api_url = provider_config
        .get("api_url")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing api_url")?
        .to_string();

    // Extract use_datastore_upload flag
    let use_datastore = provider_config
        .get("use_datastore_upload")
        .and_then(|v| v.as_bool())
        .unwrap_or(false);

    if !use_datastore {
        return Err("use_datastore_upload is not enabled for this provider".to_string());
    }

    // Get datastore ID from custom_endpoint_path (stored as datastore ID)
    let datastore_id = provider_config
        .get("custom_endpoint_path")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing datastore ID in custom_endpoint_path")?
        .to_string();

    // Build upload endpoint: POST /api/v2/upload/<DATASTORE-ID>
    let api_url = api_url.trim_end_matches('/');
    let upload_url = format!("{api_url}/upload/{datastore_id}");

    // Read auth header and value
    let auth_header = provider_config
        .get("custom_auth_header")
        .and_then(|v| v.as_str())
        .unwrap_or("x-generic-api-key");

    let auth_prefix = provider_config
        .get("custom_auth_prefix")
        .and_then(|v| v.as_str())
        .unwrap_or("");

    let api_key = provider_config
        .get("api_key")
        .and_then(|v| v.as_str())
        .ok_or("Provider config missing api_key")?;

    let auth_value = format!("{auth_prefix}{api_key}");

    let client = reqwest::Client::new();

    // Create multipart form
    let part = reqwest::multipart::Part::bytes(content)
        .file_name(file_name)
        .mime_str("application/octet-stream")
        .map_err(|e| format!("Failed to create multipart part: {e}"))?;

    let form = Form::new().part("file", part);

    let resp = client
        .post(&upload_url)
        .header(auth_header, auth_value)
        .multipart(form)
        .send()
        .await
        .map_err(|e| format!("Upload request failed: {e}"))?;

    if !resp.status().is_success() {
        let status = resp.status();
        let text = resp
            .text()
            .await
            .unwrap_or_else(|_| "unable to read response".to_string());
        return Err(format!("Datastore upload error {status}: {text}"));
    }

    // Parse response to get file ID
    let json = resp
        .json::<serde_json::Value>()
        .await
        .map_err(|e| format!("Failed to parse upload response: {e}"))?;

    // Response should have file_id or id field
    let file_id = json
        .get("file_id")
        .or_else(|| json.get("id"))
        .and_then(|v| v.as_str())
        .ok_or_else(|| {
            format!(
                "Response missing file_id: {}",
                serde_json::to_string_pretty(&json).unwrap_or_default()
            )
        })?
        .to_string();

    Ok(file_id)
}
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_is_supported_image_format() {
        assert!(is_supported_image_format("image/png"));
        assert!(is_supported_image_format("image/jpeg"));
        assert!(is_supported_image_format("image/gif"));
        assert!(is_supported_image_format("image/webp"));
        assert!(is_supported_image_format("image/svg+xml"));
        assert!(is_supported_image_format("image/bmp"));
        assert!(!is_supported_image_format("text/plain"));
    }
}
@@ -2,5 +2,6 @@ pub mod ai;
 pub mod analysis;
 pub mod db;
 pub mod docs;
+pub mod image;
 pub mod integrations;
 pub mod system;
@@ -4,6 +4,7 @@ use crate::ollama::{
     OllamaStatus,
 };
 use crate::state::{AppSettings, AppState, ProviderConfig};
+use std::env;

 // --- Ollama commands ---
@@ -158,8 +159,8 @@ pub async fn save_ai_provider(
     db.execute(
         "INSERT OR REPLACE INTO ai_providers
          (id, name, provider_type, api_url, encrypted_api_key, model, max_tokens, temperature,
-         custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id, updated_at)
-         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12, ?13, datetime('now'))",
+         custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id, use_datastore_upload, updated_at)
+         VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12, ?13, ?14, datetime('now'))",
         rusqlite::params![
             uuid::Uuid::now_v7().to_string(),
             provider.name,
@@ -174,6 +175,7 @@ pub async fn save_ai_provider(
             provider.custom_auth_prefix,
             provider.api_format,
             provider.user_id,
+            provider.use_datastore_upload,
         ],
     )
     .map_err(|e| format!("Failed to save AI provider: {e}"))?;
@@ -191,7 +193,7 @@ pub async fn load_ai_providers(
     let mut stmt = db
         .prepare(
             "SELECT name, provider_type, api_url, encrypted_api_key, model, max_tokens, temperature,
-             custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id
+             custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id, use_datastore_upload
              FROM ai_providers
              ORDER BY name",
         )
@@ -214,6 +216,7 @@ pub async fn load_ai_providers(
                 row.get::<_, Option<String>>(9)?, // custom_auth_prefix
                 row.get::<_, Option<String>>(10)?, // api_format
                 row.get::<_, Option<String>>(11)?, // user_id
+                row.get::<_, Option<bool>>(12)?, // use_datastore_upload
             ))
         })
         .map_err(|e| e.to_string())?
@@ -232,6 +235,7 @@ pub async fn load_ai_providers(
                 custom_auth_prefix,
                 api_format,
                 user_id,
+                use_datastore_upload,
             )| {
                 // Decrypt the API key
                 let api_key = crate::integrations::auth::decrypt_token(&encrypted_key).ok()?;
@@ -250,6 +254,7 @@ pub async fn load_ai_providers(
                     api_format,
                     session_id: None, // Session IDs are not persisted
                     user_id,
+                    use_datastore_upload,
                 })
             },
         )
@@ -271,3 +276,11 @@ pub async fn delete_ai_provider(

     Ok(())
 }
+
+/// Get the application version from build-time environment
+#[tauri::command]
+pub async fn get_app_version() -> Result<String, String> {
+    env::var("APP_VERSION")
+        .or_else(|_| env::var("CARGO_PKG_VERSION"))
+        .map_err(|e| format!("Failed to get version: {e}"))
+}
@@ -156,18 +156,18 @@ pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {
             ALTER TABLE audit_log ADD COLUMN entry_hash TEXT NOT NULL DEFAULT '';",
         ),
         (
-            "013_create_persistent_webviews",
-            "CREATE TABLE IF NOT EXISTS persistent_webviews (
+            "013_image_attachments",
+            "CREATE TABLE IF NOT EXISTS image_attachments (
                 id TEXT PRIMARY KEY,
-                service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
-                webview_label TEXT NOT NULL,
-                base_url TEXT NOT NULL,
-                last_active TEXT NOT NULL DEFAULT (datetime('now')),
-                window_x INTEGER,
-                window_y INTEGER,
-                window_width INTEGER,
-                window_height INTEGER,
-                UNIQUE(service)
+                issue_id TEXT NOT NULL REFERENCES issues(id) ON DELETE CASCADE,
+                file_name TEXT NOT NULL,
+                file_path TEXT NOT NULL DEFAULT '',
+                file_size INTEGER NOT NULL DEFAULT 0,
+                mime_type TEXT NOT NULL DEFAULT 'image/png',
+                upload_hash TEXT NOT NULL DEFAULT '',
+                uploaded_at TEXT NOT NULL DEFAULT (datetime('now')),
+                pii_warning_acknowledged INTEGER NOT NULL DEFAULT 1,
+                is_paste INTEGER NOT NULL DEFAULT 0
             );",
         ),
         (
@@ -186,10 +186,33 @@ pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {
                 custom_auth_prefix TEXT,
                 api_format TEXT,
                 user_id TEXT,
+                use_datastore_upload INTEGER,
                 created_at TEXT NOT NULL DEFAULT (datetime('now')),
                 updated_at TEXT NOT NULL DEFAULT (datetime('now'))
             );",
         ),
+        (
+            "015_add_use_datastore_upload",
+            "ALTER TABLE ai_providers ADD COLUMN use_datastore_upload INTEGER DEFAULT 0",
+        ),
+        (
+            "016_add_created_at",
+            "ALTER TABLE ai_providers ADD COLUMN created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%d %H:%M:%S', 'now'))",
+        ),
+        (
+            "017_create_timeline_events",
+            "CREATE TABLE IF NOT EXISTS timeline_events (
+                id TEXT PRIMARY KEY,
+                issue_id TEXT NOT NULL,
+                event_type TEXT NOT NULL,
+                description TEXT NOT NULL DEFAULT '',
+                metadata TEXT NOT NULL DEFAULT '{}',
+                created_at TEXT NOT NULL,
+                FOREIGN KEY (issue_id) REFERENCES issues(id) ON DELETE CASCADE
+            );
+            CREATE INDEX idx_timeline_events_issue ON timeline_events(issue_id);
+            CREATE INDEX idx_timeline_events_time ON timeline_events(created_at);",
+        ),
     ];

     for (name, sql) in migrations {
@@ -200,10 +223,27 @@ pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {

         if !already_applied {
             // FTS5 virtual table creation can be skipped if FTS5 is not compiled in
-            if let Err(e) = conn.execute_batch(sql) {
-                if name.contains("fts") {
+            // Also handle column-already-exists errors for migrations 015-016
+            if name.contains("fts") {
+                if let Err(e) = conn.execute_batch(sql) {
                     tracing::warn!("FTS5 not available, skipping: {e}");
-                } else {
+                }
+            } else if name.ends_with("_add_use_datastore_upload")
+                || name.ends_with("_add_created_at")
+            {
+                // Use execute for ALTER TABLE (SQLite only allows one statement per command)
+                // Skip error if column already exists (SQLITE_ERROR with "duplicate column name")
+                if let Err(e) = conn.execute(sql, []) {
+                    let err_str = e.to_string();
+                    if err_str.contains("duplicate column name") {
+                        tracing::info!("Column may already exist, skipping migration {name}: {e}");
+                    } else {
+                        return Err(e.into());
+                    }
+                }
+            } else {
+                // Use execute_batch for other migrations (FTS5, CREATE TABLE, etc.)
+                if let Err(e) = conn.execute_batch(sql) {
                     return Err(e.into());
                 }
             }
@@ -227,21 +267,21 @@ mod tests {
     }

     #[test]
-    fn test_create_credentials_table() {
+    fn test_create_image_attachments_table() {
         let conn = setup_test_db();

         // Verify table exists
         let count: i64 = conn
             .query_row(
-                "SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='credentials'",
+                "SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='image_attachments'",
                 [],
                 |r| r.get(0),
             )
             .unwrap();
         assert_eq!(count, 1);

         // Verify columns
-        let mut stmt = conn.prepare("PRAGMA table_info(credentials)").unwrap();
+        let mut stmt = conn
+            .prepare("PRAGMA table_info(image_attachments)")
+            .unwrap();
         let columns: Vec<String> = stmt
             .query_map([], |row| row.get::<_, String>(1))
             .unwrap()
@@ -249,11 +289,15 @@ mod tests {
             .unwrap();

         assert!(columns.contains(&"id".to_string()));
-        assert!(columns.contains(&"service".to_string()));
-        assert!(columns.contains(&"token_hash".to_string()));
-        assert!(columns.contains(&"encrypted_token".to_string()));
-        assert!(columns.contains(&"created_at".to_string()));
-        assert!(columns.contains(&"expires_at".to_string()));
+        assert!(columns.contains(&"issue_id".to_string()));
+        assert!(columns.contains(&"file_name".to_string()));
+        assert!(columns.contains(&"file_path".to_string()));
+        assert!(columns.contains(&"file_size".to_string()));
+        assert!(columns.contains(&"mime_type".to_string()));
+        assert!(columns.contains(&"upload_hash".to_string()));
+        assert!(columns.contains(&"uploaded_at".to_string()));
+        assert!(columns.contains(&"pii_warning_acknowledged".to_string()));
+        assert!(columns.contains(&"is_paste".to_string()));
     }

     #[test]
@@ -424,4 +468,326 @@ mod tests {

        assert_eq!(count, 1);
    }

    #[test]
    fn test_store_and_retrieve_image_attachment() {
        let conn = setup_test_db();

        // Create an issue first (required for foreign key)
        let now = chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string();
        conn.execute(
            "INSERT INTO issues (id, title, description, severity, status, category, source, created_at, updated_at, resolved_at, assigned_to, tags)
             VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12)",
            rusqlite::params![
                "test-issue-1",
                "Test Issue",
                "Test description",
                "medium",
                "open",
                "test",
                "manual",
                now.clone(),
                now.clone(),
                None::<Option<String>>,
                "",
                "[]",
            ],
        )
        .unwrap();

        // Now insert the image attachment
        conn.execute(
            "INSERT INTO image_attachments (id, issue_id, file_name, file_path, file_size, mime_type, upload_hash, uploaded_at, pii_warning_acknowledged, is_paste)
             VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10)",
            rusqlite::params![
                "test-img-1",
                "test-issue-1",
                "screenshot.png",
                "/path/to/screenshot.png",
                102400,
                "image/png",
                "abc123hash",
                now,
                1,
                0,
            ],
        )
        .unwrap();

        let (id, issue_id, file_name, mime_type, is_paste): (String, String, String, String, i32) = conn
            .query_row(
                "SELECT id, issue_id, file_name, mime_type, is_paste FROM image_attachments WHERE id = ?1",
                ["test-img-1"],
                |r| Ok((r.get(0)?, r.get(1)?, r.get(2)?, r.get(3)?, r.get(4)?)),
            )
            .unwrap();

        assert_eq!(id, "test-img-1");
        assert_eq!(issue_id, "test-issue-1");
        assert_eq!(file_name, "screenshot.png");
        assert_eq!(mime_type, "image/png");
        assert_eq!(is_paste, 0);
    }

    #[test]
    fn test_create_ai_providers_table() {
        let conn = setup_test_db();

        let count: i64 = conn
            .query_row(
                "SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='ai_providers'",
                [],
                |r| r.get(0),
            )
            .unwrap();
        assert_eq!(count, 1);

        let mut stmt = conn.prepare("PRAGMA table_info(ai_providers)").unwrap();
        let columns: Vec<String> = stmt
            .query_map([], |row| row.get::<_, String>(1))
            .unwrap()
            .collect::<Result<Vec<_>, _>>()
            .unwrap();

        assert!(columns.contains(&"id".to_string()));
        assert!(columns.contains(&"name".to_string()));
        assert!(columns.contains(&"provider_type".to_string()));
        assert!(columns.contains(&"api_url".to_string()));
        assert!(columns.contains(&"encrypted_api_key".to_string()));
        assert!(columns.contains(&"model".to_string()));
        assert!(columns.contains(&"max_tokens".to_string()));
        assert!(columns.contains(&"temperature".to_string()));
        assert!(columns.contains(&"custom_endpoint_path".to_string()));
        assert!(columns.contains(&"custom_auth_header".to_string()));
        assert!(columns.contains(&"custom_auth_prefix".to_string()));
        assert!(columns.contains(&"api_format".to_string()));
        assert!(columns.contains(&"user_id".to_string()));
        assert!(columns.contains(&"use_datastore_upload".to_string()));
        assert!(columns.contains(&"created_at".to_string()));
        assert!(columns.contains(&"updated_at".to_string()));
    }

    #[test]
    fn test_store_and_retrieve_ai_provider() {
        let conn = setup_test_db();

        conn.execute(
            "INSERT INTO ai_providers (id, name, provider_type, api_url, encrypted_api_key, model)
             VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
            rusqlite::params![
                "test-provider-1",
                "My OpenAI",
                "openai",
                "https://api.openai.com/v1",
                "encrypted_key_123",
                "gpt-4o"
            ],
        )
        .unwrap();

        let (name, provider_type, api_url, encrypted_key, model): (String, String, String, String, String) = conn
            .query_row(
                "SELECT name, provider_type, api_url, encrypted_api_key, model FROM ai_providers WHERE name = ?1",
                ["My OpenAI"],
                |r| Ok((r.get(0)?, r.get(1)?, r.get(2)?, r.get(3)?, r.get(4)?)),
            )
            .unwrap();

        assert_eq!(name, "My OpenAI");
        assert_eq!(provider_type, "openai");
        assert_eq!(api_url, "https://api.openai.com/v1");
        assert_eq!(encrypted_key, "encrypted_key_123");
        assert_eq!(model, "gpt-4o");
    }

    #[test]
    fn test_add_missing_columns_to_existing_table() {
        let conn = Connection::open_in_memory().unwrap();

        // Simulate existing table without use_datastore_upload and created_at
        conn.execute_batch(
            "CREATE TABLE IF NOT EXISTS ai_providers (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL UNIQUE,
                provider_type TEXT NOT NULL,
                api_url TEXT NOT NULL,
                encrypted_api_key TEXT NOT NULL,
                model TEXT NOT NULL,
                max_tokens INTEGER,
                temperature REAL,
                custom_endpoint_path TEXT,
                custom_auth_header TEXT,
                custom_auth_prefix TEXT,
                api_format TEXT,
                user_id TEXT,
                updated_at TEXT NOT NULL DEFAULT (datetime('now'))
            );",
        )
        .unwrap();

        // Verify columns BEFORE migration
        let mut stmt = conn.prepare("PRAGMA table_info(ai_providers)").unwrap();
        let columns: Vec<String> = stmt
            .query_map([], |row| row.get::<_, String>(1))
            .unwrap()
            .collect::<Result<Vec<_>, _>>()
            .unwrap();

        assert!(columns.contains(&"name".to_string()));
        assert!(columns.contains(&"model".to_string()));
        assert!(!columns.contains(&"use_datastore_upload".to_string()));
        assert!(!columns.contains(&"created_at".to_string()));

        // Run migrations (should apply 015 to add missing columns)
        run_migrations(&conn).unwrap();

        // Verify columns AFTER migration
        let mut stmt = conn.prepare("PRAGMA table_info(ai_providers)").unwrap();
        let columns: Vec<String> = stmt
            .query_map([], |row| row.get::<_, String>(1))
            .unwrap()
            .collect::<Result<Vec<_>, _>>()
            .unwrap();

        assert!(columns.contains(&"name".to_string()));
        assert!(columns.contains(&"model".to_string()));
        assert!(columns.contains(&"use_datastore_upload".to_string()));
        assert!(columns.contains(&"created_at".to_string()));

        // Verify data integrity - existing rows should have default values
        conn.execute(
            "INSERT INTO ai_providers (id, name, provider_type, api_url, encrypted_api_key, model)
             VALUES (?, ?, ?, ?, ?, ?)",
            rusqlite::params![
                "test-provider-2",
                "Test Provider",
                "openai",
                "https://api.example.com",
|
||||
"encrypted_key_456",
|
||||
"gpt-3.5-turbo"
|
||||
],
|
||||
)
|
||||
.unwrap();
|
||||
|
||||
let (name, use_datastore_upload, created_at): (String, bool, String) = conn
|
||||
.query_row(
|
||||
"SELECT name, use_datastore_upload, created_at FROM ai_providers WHERE name = ?1",
|
||||
["Test Provider"],
|
||||
|r| Ok((r.get(0)?, r.get(1)?, r.get(2)?)),
|
||||
)
|
||||
.unwrap();
|
||||
|
||||
assert_eq!(name, "Test Provider");
|
||||
assert!(!use_datastore_upload);
|
||||
assert!(created_at.len() > 0);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_idempotent_add_missing_columns() {
|
||||
let conn = Connection::open_in_memory().unwrap();
|
||||
|
||||
// Create table with both columns already present (simulating prior migration run)
|
||||
conn.execute_batch(
|
||||
"CREATE TABLE IF NOT EXISTS ai_providers (
|
||||
id TEXT PRIMARY KEY,
|
||||
name TEXT NOT NULL UNIQUE,
|
||||
provider_type TEXT NOT NULL,
|
||||
api_url TEXT NOT NULL,
|
||||
encrypted_api_key TEXT NOT NULL,
|
||||
model TEXT NOT NULL,
|
||||
max_tokens INTEGER,
|
||||
temperature REAL,
|
||||
custom_endpoint_path TEXT,
|
||||
custom_auth_header TEXT,
|
||||
custom_auth_prefix TEXT,
|
||||
api_format TEXT,
|
||||
user_id TEXT,
|
||||
use_datastore_upload INTEGER DEFAULT 0,
|
||||
created_at TEXT NOT NULL DEFAULT (datetime('now')),
|
||||
updated_at TEXT NOT NULL DEFAULT (datetime('now'))
|
||||
);",
|
||||
)
|
||||
.unwrap();
|
||||
|
||||
// Should not fail even though columns already exist
|
||||
run_migrations(&conn).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_timeline_events_table_exists() {
|
||||
let conn = setup_test_db();
|
||||
let count: i64 = conn
|
||||
.query_row(
|
||||
"SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='timeline_events'",
|
||||
[],
|
||||
|r| r.get(0),
|
||||
)
|
||||
.unwrap();
|
||||
assert_eq!(count, 1);
|
||||
|
||||
let mut stmt = conn.prepare("PRAGMA table_info(timeline_events)").unwrap();
|
||||
let columns: Vec<String> = stmt
|
||||
.query_map([], |row| row.get::<_, String>(1))
|
||||
.unwrap()
|
||||
.collect::<Result<Vec<_>, _>>()
|
||||
.unwrap();
|
||||
|
||||
assert!(columns.contains(&"id".to_string()));
|
||||
assert!(columns.contains(&"issue_id".to_string()));
|
||||
assert!(columns.contains(&"event_type".to_string()));
|
||||
assert!(columns.contains(&"description".to_string()));
|
||||
assert!(columns.contains(&"metadata".to_string()));
|
||||
assert!(columns.contains(&"created_at".to_string()));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_timeline_events_cascade_delete() {
|
||||
let conn = setup_test_db();
|
||||
conn.execute("PRAGMA foreign_keys = ON", []).unwrap();
|
||||
|
||||
let now = chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string();
|
||||
conn.execute(
|
||||
"INSERT INTO issues (id, title, created_at, updated_at) VALUES (?1, ?2, ?3, ?4)",
|
||||
rusqlite::params!["issue-1", "Test Issue", now, now],
|
||||
)
|
||||
.unwrap();
|
||||
|
||||
conn.execute(
|
||||
"INSERT INTO timeline_events (id, issue_id, event_type, description, metadata, created_at) VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
|
||||
rusqlite::params!["te-1", "issue-1", "triage_started", "Started triage", "{}", "2025-01-15 10:00:00 UTC"],
|
||||
)
|
||||
.unwrap();
|
||||
|
||||
// Verify event exists
|
||||
let count: i64 = conn
|
||||
.query_row("SELECT COUNT(*) FROM timeline_events", [], |r| r.get(0))
|
||||
.unwrap();
|
||||
assert_eq!(count, 1);
|
||||
|
||||
// Delete issue — cascade should remove timeline event
|
||||
conn.execute("DELETE FROM issues WHERE id = 'issue-1'", [])
|
||||
.unwrap();
|
||||
|
||||
let count: i64 = conn
|
||||
.query_row("SELECT COUNT(*) FROM timeline_events", [], |r| r.get(0))
|
||||
.unwrap();
|
||||
assert_eq!(count, 0);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_timeline_events_indexes() {
|
||||
let conn = setup_test_db();
|
||||
let mut stmt = conn
|
||||
.prepare(
|
||||
"SELECT name FROM sqlite_master WHERE type='index' AND tbl_name='timeline_events'",
|
||||
)
|
||||
.unwrap();
|
||||
let indexes: Vec<String> = stmt
|
||||
.query_map([], |row| row.get(0))
|
||||
.unwrap()
|
||||
.filter_map(|r| r.ok())
|
||||
.collect();
|
||||
assert!(indexes.contains(&"idx_timeline_events_issue".to_string()));
|
||||
assert!(indexes.contains(&"idx_timeline_events_time".to_string()));
|
||||
}
|
||||
}
|
||||
|
||||
@ -44,8 +44,10 @@ impl Issue {
pub struct IssueDetail {
    pub issue: Issue,
    pub log_files: Vec<LogFile>,
    pub image_attachments: Vec<ImageAttachment>,
    pub resolution_steps: Vec<ResolutionStep>,
    pub conversations: Vec<AiConversation>,
    pub timeline_events: Vec<TimelineEvent>,
}

/// Lightweight row returned by list/search commands.
@ -120,9 +122,31 @@ pub struct FiveWhyEntry {
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TimelineEvent {
    pub id: String,
    pub issue_id: String,
    pub event_type: String,
    pub description: String,
    pub metadata: String,
    pub created_at: String,
}

impl TimelineEvent {
    pub fn new(
        issue_id: String,
        event_type: String,
        description: String,
        metadata: String,
    ) -> Self {
        TimelineEvent {
            id: Uuid::now_v7().to_string(),
            issue_id,
            event_type,
            description,
            metadata,
            created_at: chrono::Utc::now()
                .format("%Y-%m-%d %H:%M:%S UTC")
                .to_string(),
        }
    }
}

// ─── Log File ───────────────────────────────────────────────────────────────
@ -392,3 +416,46 @@ impl IntegrationConfig {
        }
    }
}

// ─── Image Attachment ────────────────────────────────────────────────────────────

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ImageAttachment {
    pub id: String,
    pub issue_id: String,
    pub file_name: String,
    pub file_path: String,
    pub file_size: i64,
    pub mime_type: String,
    pub upload_hash: String,
    pub uploaded_at: String,
    pub pii_warning_acknowledged: bool,
    pub is_paste: bool,
}

impl ImageAttachment {
    #[allow(clippy::too_many_arguments)]
    pub fn new(
        issue_id: String,
        file_name: String,
        file_path: String,
        file_size: i64,
        mime_type: String,
        upload_hash: String,
        pii_warning_acknowledged: bool,
        is_paste: bool,
    ) -> Self {
        ImageAttachment {
            id: Uuid::now_v7().to_string(),
            issue_id,
            file_name,
            file_path,
            file_size,
            mime_type,
            upload_hash,
            uploaded_at: chrono::Utc::now().format("%Y-%m-%d %H:%M:%S").to_string(),
            pii_warning_acknowledged,
            is_paste,
        }
    }
}

@ -1,4 +1,5 @@
use crate::db::models::IssueDetail;
use crate::docs::rca::{calculate_duration, format_event_type};

pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {
    let issue = &detail.issue;
@ -51,7 +52,16 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {

    // Impact
    md.push_str("## Impact\n\n");
    if detail.timeline_events.len() >= 2 {
        let first = &detail.timeline_events[0].created_at;
        let last = &detail.timeline_events[detail.timeline_events.len() - 1].created_at;
        md.push_str(&format!(
            "- **Duration:** {}\n",
            calculate_duration(first, last)
        ));
    } else {
        md.push_str("- **Duration:** _[How long did the incident last?]_\n");
    }
    md.push_str("- **Users Affected:** _[Number/percentage of affected users]_\n");
    md.push_str("- **Revenue Impact:** _[Financial impact, if applicable]_\n");
    md.push_str("- **SLA Impact:** _[Were any SLAs breached?]_\n\n");
@ -67,7 +77,19 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {
    if let Some(ref resolved) = issue.resolved_at {
        md.push_str(&format!("| {resolved} | Issue resolved |\n"));
    }
    if detail.timeline_events.is_empty() {
        md.push_str("| _HH:MM_ | _[Add additional timeline events]_ |\n");
    } else {
        for event in &detail.timeline_events {
            md.push_str(&format!(
                "| {} | {} - {} |\n",
                event.created_at,
                format_event_type(&event.event_type),
                event.description
            ));
        }
    }
    md.push('\n');

    // Root Cause Analysis
    md.push_str("## Root Cause Analysis\n\n");
@ -114,6 +136,19 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {

    // What Went Well
    md.push_str("## What Went Well\n\n");
    if !detail.resolution_steps.is_empty() {
        md.push_str(&format!(
            "- Systematic 5-whys analysis conducted ({} steps completed)\n",
            detail.resolution_steps.len()
        ));
    }
    if detail
        .timeline_events
        .iter()
        .any(|e| e.event_type == "root_cause_identified")
    {
        md.push_str("- Root cause was identified during triage\n");
    }
    md.push_str("- _[e.g., Quick detection through existing alerts]_\n");
    md.push_str("- _[e.g., Effective cross-team collaboration]_\n");
    md.push_str("- _[e.g., Smooth communication with stakeholders]_\n\n");
@ -158,7 +193,7 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {
#[cfg(test)]
mod tests {
    use super::*;
    use crate::db::models::{Issue, IssueDetail, ResolutionStep, TimelineEvent};

    fn make_test_detail() -> IssueDetail {
        IssueDetail {
@ -177,6 +212,7 @@ mod tests {
                tags: "[]".to_string(),
            },
            log_files: vec![],
            image_attachments: vec![],
            resolution_steps: vec![ResolutionStep {
                id: "rs-pm-1".to_string(),
                issue_id: "pm-456".to_string(),
@ -187,6 +223,7 @@ mod tests {
                created_at: "2025-02-10 09:00:00".to_string(),
            }],
            conversations: vec![],
            timeline_events: vec![],
        }
    }

@ -245,4 +282,76 @@ mod tests {
        assert!(md.contains("| Priority | Action | Owner | Due Date | Status |"));
        assert!(md.contains("| P0 |"));
    }

    #[test]
    fn test_postmortem_timeline_with_real_events() {
        let mut detail = make_test_detail();
        detail.timeline_events = vec![
            TimelineEvent {
                id: "te-1".to_string(),
                issue_id: "pm-456".to_string(),
                event_type: "triage_started".to_string(),
                description: "Triage initiated".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-02-10 08:05:00 UTC".to_string(),
            },
            TimelineEvent {
                id: "te-2".to_string(),
                issue_id: "pm-456".to_string(),
                event_type: "root_cause_identified".to_string(),
                description: "Certificate expiry confirmed".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-02-10 10:30:00 UTC".to_string(),
            },
        ];
        let md = generate_postmortem_markdown(&detail);
        assert!(md.contains("## Timeline"));
        assert!(md.contains("| 2025-02-10 08:05:00 UTC | Triage Started - Triage initiated |"));
        assert!(md.contains(
            "| 2025-02-10 10:30:00 UTC | Root Cause Identified - Certificate expiry confirmed |"
        ));
        assert!(!md.contains("_[Add additional timeline events]_"));
    }

    #[test]
    fn test_postmortem_impact_with_duration() {
        let mut detail = make_test_detail();
        detail.timeline_events = vec![
            TimelineEvent {
                id: "te-1".to_string(),
                issue_id: "pm-456".to_string(),
                event_type: "triage_started".to_string(),
                description: "Triage initiated".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-02-10 08:00:00 UTC".to_string(),
            },
            TimelineEvent {
                id: "te-2".to_string(),
                issue_id: "pm-456".to_string(),
                event_type: "root_cause_identified".to_string(),
                description: "Found it".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-02-10 10:30:00 UTC".to_string(),
            },
        ];
        let md = generate_postmortem_markdown(&detail);
        assert!(md.contains("**Duration:** 2h 30m"));
        assert!(!md.contains("_[How long did the incident last?]_"));
    }

    #[test]
    fn test_postmortem_what_went_well_with_steps() {
        let mut detail = make_test_detail();
        detail.timeline_events = vec![TimelineEvent {
            id: "te-1".to_string(),
            issue_id: "pm-456".to_string(),
            event_type: "root_cause_identified".to_string(),
            description: "Root cause found".to_string(),
            metadata: "{}".to_string(),
            created_at: "2025-02-10 10:00:00 UTC".to_string(),
        }];
        let md = generate_postmortem_markdown(&detail);
        assert!(md.contains("Systematic 5-whys analysis conducted (1 steps completed)"));
        assert!(md.contains("Root cause was identified during triage"));
    }
}

@ -1,5 +1,48 @@
use crate::db::models::IssueDetail;

pub fn format_event_type(event_type: &str) -> &str {
    match event_type {
        "triage_started" => "Triage Started",
        "log_uploaded" => "Log File Uploaded",
        "why_level_advanced" => "Why Level Advanced",
        "root_cause_identified" => "Root Cause Identified",
        "rca_generated" => "RCA Document Generated",
        "postmortem_generated" => "Post-Mortem Generated",
        "document_exported" => "Document Exported",
        other => other,
    }
}

pub fn calculate_duration(start: &str, end: &str) -> String {
    let fmt = "%Y-%m-%d %H:%M:%S UTC";
    let start_dt = match chrono::NaiveDateTime::parse_from_str(start, fmt) {
        Ok(dt) => dt,
        Err(_) => return "N/A".to_string(),
    };
    let end_dt = match chrono::NaiveDateTime::parse_from_str(end, fmt) {
        Ok(dt) => dt,
        Err(_) => return "N/A".to_string(),
    };

    let duration = end_dt.signed_duration_since(start_dt);
    let total_minutes = duration.num_minutes();
    if total_minutes < 0 {
        return "N/A".to_string();
    }

    let days = total_minutes / (24 * 60);
    let hours = (total_minutes % (24 * 60)) / 60;
    let minutes = total_minutes % 60;

    if days > 0 {
        format!("{days}d {hours}h")
    } else if hours > 0 {
        format!("{hours}h {minutes}m")
    } else {
        format!("{minutes}m")
    }
}

pub fn generate_rca_markdown(detail: &IssueDetail) -> String {
    let issue = &detail.issue;

@ -57,6 +100,52 @@ pub fn generate_rca_markdown(detail: &IssueDetail) -> String {
        md.push_str("\n\n");
    }

    // Incident Timeline
    md.push_str("## Incident Timeline\n\n");
    if detail.timeline_events.is_empty() {
        md.push_str("_No timeline events recorded._\n\n");
    } else {
        md.push_str("| Time (UTC) | Event | Description |\n");
        md.push_str("|------------|-------|-------------|\n");
        for event in &detail.timeline_events {
            md.push_str(&format!(
                "| {} | {} | {} |\n",
                event.created_at,
                format_event_type(&event.event_type),
                event.description
            ));
        }
        md.push('\n');
    }

    // Incident Metrics
    md.push_str("## Incident Metrics\n\n");
    md.push_str(&format!(
        "- **Total Events:** {}\n",
        detail.timeline_events.len()
    ));
    if detail.timeline_events.len() >= 2 {
        let first = &detail.timeline_events[0].created_at;
        let last = &detail.timeline_events[detail.timeline_events.len() - 1].created_at;
        md.push_str(&format!(
            "- **Incident Duration:** {}\n",
            calculate_duration(first, last)
        ));
    } else {
        md.push_str("- **Incident Duration:** N/A\n");
    }
    let root_cause_event = detail
        .timeline_events
        .iter()
        .find(|e| e.event_type == "root_cause_identified");
    if let (Some(first), Some(rc)) = (detail.timeline_events.first(), root_cause_event) {
        md.push_str(&format!(
            "- **Time to Root Cause:** {}\n",
            calculate_duration(&first.created_at, &rc.created_at)
        ));
    }
    md.push('\n');

    // 5 Whys Analysis
    md.push_str("## 5 Whys Analysis\n\n");
    if detail.resolution_steps.is_empty() {
@ -143,7 +232,7 @@ pub fn generate_rca_markdown(detail: &IssueDetail) -> String {
#[cfg(test)]
mod tests {
    use super::*;
    use crate::db::models::{Issue, IssueDetail, LogFile, ResolutionStep, TimelineEvent};

    fn make_test_detail() -> IssueDetail {
        IssueDetail {
@ -172,6 +261,7 @@ mod tests {
                uploaded_at: "2025-01-15 10:30:00".to_string(),
                redacted: false,
            }],
            image_attachments: vec![],
            resolution_steps: vec![
                ResolutionStep {
                    id: "rs-1".to_string(),
@ -193,6 +283,7 @@ mod tests {
                },
            ],
            conversations: vec![],
            timeline_events: vec![],
        }
    }

@ -246,4 +337,135 @@ mod tests {
        let md = generate_rca_markdown(&detail);
        assert!(md.contains("Unassigned"));
    }

    #[test]
    fn test_rca_timeline_section_with_events() {
        let mut detail = make_test_detail();
        detail.timeline_events = vec![
            TimelineEvent {
                id: "te-1".to_string(),
                issue_id: "test-123".to_string(),
                event_type: "triage_started".to_string(),
                description: "Triage initiated by oncall".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-01-15 10:00:00 UTC".to_string(),
            },
            TimelineEvent {
                id: "te-2".to_string(),
                issue_id: "test-123".to_string(),
                event_type: "log_uploaded".to_string(),
                description: "app.log uploaded".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-01-15 10:30:00 UTC".to_string(),
            },
            TimelineEvent {
                id: "te-3".to_string(),
                issue_id: "test-123".to_string(),
                event_type: "root_cause_identified".to_string(),
                description: "Connection pool leak found".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-01-15 12:15:00 UTC".to_string(),
            },
        ];
        let md = generate_rca_markdown(&detail);
        assert!(md.contains("## Incident Timeline"));
        assert!(md.contains("| Time (UTC) | Event | Description |"));
        assert!(md
            .contains("| 2025-01-15 10:00:00 UTC | Triage Started | Triage initiated by oncall |"));
        assert!(md.contains("| 2025-01-15 10:30:00 UTC | Log File Uploaded | app.log uploaded |"));
        assert!(md.contains(
            "| 2025-01-15 12:15:00 UTC | Root Cause Identified | Connection pool leak found |"
        ));
    }

    #[test]
    fn test_rca_timeline_section_empty() {
        let detail = make_test_detail();
        let md = generate_rca_markdown(&detail);
        assert!(md.contains("## Incident Timeline"));
        assert!(md.contains("_No timeline events recorded._"));
    }

    #[test]
    fn test_rca_metrics_section() {
        let mut detail = make_test_detail();
        detail.timeline_events = vec![
            TimelineEvent {
                id: "te-1".to_string(),
                issue_id: "test-123".to_string(),
                event_type: "triage_started".to_string(),
                description: "Triage started".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-01-15 10:00:00 UTC".to_string(),
            },
            TimelineEvent {
                id: "te-2".to_string(),
                issue_id: "test-123".to_string(),
                event_type: "root_cause_identified".to_string(),
                description: "Root cause found".to_string(),
                metadata: "{}".to_string(),
                created_at: "2025-01-15 12:15:00 UTC".to_string(),
            },
        ];
        let md = generate_rca_markdown(&detail);
        assert!(md.contains("## Incident Metrics"));
        assert!(md.contains("**Total Events:** 2"));
        assert!(md.contains("**Incident Duration:** 2h 15m"));
        assert!(md.contains("**Time to Root Cause:** 2h 15m"));
    }

    #[test]
    fn test_calculate_duration_hours_minutes() {
        assert_eq!(
            calculate_duration("2025-01-15 10:00:00 UTC", "2025-01-15 12:15:00 UTC"),
            "2h 15m"
        );
    }

    #[test]
    fn test_calculate_duration_days() {
        assert_eq!(
            calculate_duration("2025-01-15 10:00:00 UTC", "2025-01-18 11:00:00 UTC"),
            "3d 1h"
        );
    }

    #[test]
    fn test_calculate_duration_minutes_only() {
        assert_eq!(
            calculate_duration("2025-01-15 10:00:00 UTC", "2025-01-15 10:45:00 UTC"),
            "45m"
        );
    }

    #[test]
    fn test_calculate_duration_invalid() {
        assert_eq!(calculate_duration("bad-date", "also-bad"), "N/A");
    }

    #[test]
    fn test_format_event_type_known() {
        assert_eq!(format_event_type("triage_started"), "Triage Started");
        assert_eq!(format_event_type("log_uploaded"), "Log File Uploaded");
        assert_eq!(
            format_event_type("why_level_advanced"),
            "Why Level Advanced"
        );
        assert_eq!(
            format_event_type("root_cause_identified"),
            "Root Cause Identified"
        );
        assert_eq!(format_event_type("rca_generated"), "RCA Document Generated");
        assert_eq!(
            format_event_type("postmortem_generated"),
            "Post-Mortem Generated"
        );
        assert_eq!(format_event_type("document_exported"), "Document Exported");
    }

    #[test]
    fn test_format_event_type_unknown() {
        assert_eq!(format_event_type("custom_event"), "custom_event");
        assert_eq!(format_event_type(""), "");
    }
}

@ -503,25 +503,31 @@ mod tests {

    #[test]
    fn test_encrypt_decrypt_roundtrip() {
        // Use a deterministic key derived from the test function name.
        // This avoids env var conflicts with parallel tests.
        let test_key = "test-key-encrypt-decrypt-roundtrip-12345";
        let key_material = derive_aes_key_from_str(test_key).unwrap();

        let original = "my-secret-token-12345";
        let encrypted = encrypt_token_with_key(original, &key_material).unwrap();
        let decrypted = decrypt_token_with_key(&encrypted, &key_material).unwrap();
        assert_eq!(original, decrypted);
    }

    #[test]
    fn test_encrypt_produces_different_output_each_time() {
        // Use a deterministic key derived from the test function name
        let test_key = "test-key-encrypt-different-67890";
        let key_material = derive_aes_key_from_str(test_key).unwrap();

        let token = "same-token";
        let enc1 = encrypt_token_with_key(token, &key_material).unwrap();
        let enc2 = encrypt_token_with_key(token, &key_material).unwrap();
        // Different nonces mean different ciphertext
        assert_ne!(enc1, enc2);
        // But both decrypt to the same value
        assert_eq!(decrypt_token_with_key(&enc1, &key_material).unwrap(), token);
        assert_eq!(decrypt_token_with_key(&enc2, &key_material).unwrap(), token);
    }

    #[test]
@ -623,10 +629,82 @@ mod tests {

    #[test]
    fn test_derive_aes_key_is_stable_for_same_input() {
        // Use the deterministic helper to avoid env var race conditions in parallel tests
        let k1 = derive_aes_key_from_str("stable-test-key").unwrap();
        let k2 = derive_aes_key_from_str("stable-test-key").unwrap();
        assert_eq!(k1, k2);
    }

    // Test helper functions that accept the key directly (bypassing the env var)
    #[cfg(test)]
    fn derive_aes_key_from_str(key: &str) -> Result<[u8; 32], String> {
        let digest = sha2::Sha256::digest(key.as_bytes());
        let mut key_bytes = [0u8; 32];
        key_bytes.copy_from_slice(&digest);
        Ok(key_bytes)
    }

    #[cfg(test)]
    fn encrypt_token_with_key(token: &str, key_bytes: &[u8; 32]) -> Result<String, String> {
        use aes_gcm::{
            aead::{Aead, KeyInit},
            Aes256Gcm, Nonce,
        };
        use rand::{thread_rng, RngCore};

        let cipher = Aes256Gcm::new_from_slice(key_bytes)
            .map_err(|e| format!("Failed to create cipher: {e}"))?;

        // Generate a random nonce
        let mut nonce_bytes = [0u8; 12];
        thread_rng().fill_bytes(&mut nonce_bytes);
        let nonce = Nonce::from_slice(&nonce_bytes);

        // Encrypt
        let ciphertext = cipher
            .encrypt(nonce, token.as_bytes())
            .map_err(|e| format!("Encryption failed: {e}"))?;

        // Prepend the nonce to the ciphertext
        let mut result = nonce_bytes.to_vec();
        result.extend_from_slice(&ciphertext);

        // Base64 encode
        use base64::engine::general_purpose::STANDARD;
        use base64::Engine;
        Ok(STANDARD.encode(&result))
    }

    #[cfg(test)]
    fn decrypt_token_with_key(encrypted: &str, key_bytes: &[u8; 32]) -> Result<String, String> {
        use aes_gcm::{
            aead::{Aead, KeyInit},
            Aes256Gcm, Nonce,
        };

        // Base64 decode
        use base64::engine::general_purpose::STANDARD;
        use base64::Engine;
        let data = STANDARD
            .decode(encrypted)
            .map_err(|e| format!("Base64 decode failed: {e}"))?;

        if data.len() < 12 {
            return Err("Invalid encrypted data: too short".to_string());
        }

        // Extract the nonce (first 12 bytes) and ciphertext (rest)
        let nonce = Nonce::from_slice(&data[..12]);
        let ciphertext = &data[12..];

        let cipher = Aes256Gcm::new_from_slice(key_bytes)
            .map_err(|e| format!("Failed to create cipher: {e}"))?;

        // Decrypt
        let plaintext = cipher
            .decrypt(nonce, ciphertext)
            .map_err(|e| format!("Decryption failed: {e}"))?;

        String::from_utf8(plaintext).map_err(|e| format!("Invalid UTF-8: {e}"))
    }
}

@ -1,4 +1,40 @@
|
||||
use super::confluence_search::SearchResult;
|
||||
use crate::integrations::query_expansion::expand_query;
|
||||
|
||||
const MAX_EXPANDED_QUERIES: usize = 3;
|
||||
|
||||
fn escape_wiql(s: &str) -> String {
|
||||
s.replace('\'', "''")
|
||||
.replace('"', "\\\"")
|
||||
.replace('\\', "\\\\")
|
||||
.replace('(', "\\(")
|
||||
.replace(')', "\\)")
|
||||
.replace(';', "\\;")
|
||||
.replace('=', "\\=")
|
||||
}
|
||||
|
||||
/// Basic HTML tag stripping to prevent XSS in excerpts
|
||||
fn strip_html_tags(html: &str) -> String {
|
||||
let mut result = String::new();
|
||||
let mut in_tag = false;
|
||||
|
||||
for ch in html.chars() {
|
||||
match ch {
|
||||
'<' => in_tag = true,
|
||||
'>' => in_tag = false,
|
||||
_ if !in_tag => result.push(ch),
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up whitespace
|
||||
result
|
||||
.split_whitespace()
|
||||
.collect::<Vec<_>>()
|
||||
.join(" ")
|
||||
.trim()
|
||||
.to_string()
|
||||
}
|
||||
|
||||
/// Search Azure DevOps Wiki for content matching the query
|
||||
pub async fn search_wiki(
|
||||
@@ -10,90 +46,94 @@ pub async fn search_wiki(
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();

// Use Azure DevOps Search API
let search_url = format!(
"{}/_apis/search/wikisearchresults?api-version=7.0",
org_url.trim_end_matches('/')
);
let expanded_queries = expand_query(query);

let search_body = serde_json::json!({
"searchText": query,
"$top": 5,
"filters": {
"ProjectFilters": [project]
let mut all_results = Vec::new();

for expanded_query in expanded_queries.iter().take(MAX_EXPANDED_QUERIES) {
// Use Azure DevOps Search API
let search_url = format!(
"{}/_apis/search/wikisearchresults?api-version=7.0",
org_url.trim_end_matches('/')
);

let search_body = serde_json::json!({
"searchText": expanded_query,
"$top": 5,
"filters": {
"ProjectFilters": [project]
}
});

tracing::info!("Searching Azure DevOps Wiki with query: {}", expanded_query);

let resp = client
.post(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&search_body)
.send()
.await
.map_err(|e| format!("Azure DevOps wiki search failed: {e}"))?;

if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
tracing::warn!("Azure DevOps wiki search failed with status {status}: {text}");
continue;
}
});

tracing::info!("Searching Azure DevOps Wiki: {}", search_url);
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ADO wiki search response: {e}"))?;

let resp = client
.post(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&search_body)
.send()
.await
.map_err(|e| format!("Azure DevOps wiki search failed: {e}"))?;
if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(MAX_EXPANDED_QUERIES) {
let title = item["fileName"].as_str().unwrap_or("Untitled").to_string();

if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!(
"Azure DevOps wiki search failed with status {status}: {text}"
));
}
let path = item["path"].as_str().unwrap_or("");
let url = format!(
"{}/_wiki/wikis/{}/{}",
org_url.trim_end_matches('/'),
project,
path
);

let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ADO wiki search response: {e}"))?;
let excerpt = strip_html_tags(item["content"].as_str().unwrap_or(""))
.chars()
.take(300)
.collect::<String>();

let mut results = Vec::new();

if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(3) {
let title = item["fileName"].as_str().unwrap_or("Untitled").to_string();

let path = item["path"].as_str().unwrap_or("");
let url = format!(
"{}/_wiki/wikis/{}/{}",
org_url.trim_end_matches('/'),
project,
path
);

let excerpt = item["content"]
.as_str()
.unwrap_or("")
.chars()
.take(300)
.collect::<String>();

// Fetch full wiki page content
let content = if let Some(wiki_id) = item["wiki"]["id"].as_str() {
if let Some(page_path) = item["path"].as_str() {
fetch_wiki_page(org_url, wiki_id, page_path, &cookie_header)
.await
.ok()
// Fetch full wiki page content
let content = if let Some(wiki_id) = item["wiki"]["id"].as_str() {
if let Some(page_path) = item["path"].as_str() {
fetch_wiki_page(org_url, wiki_id, page_path, &cookie_header)
.await
.ok()
} else {
None
}
} else {
None
}
} else {
None
};
};

results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Azure DevOps".to_string(),
});
all_results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Azure DevOps".to_string(),
});
}
}
}

Ok(results)
all_results.sort_by(|a, b| a.url.cmp(&b.url));
all_results.dedup_by(|a, b| a.url == b.url);

Ok(all_results)
}

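The sort-then-dedup tail added to `search_wiki` relies on `Vec::dedup_by` removing only *adjacent* duplicates, which is why the sort by URL comes first; a minimal sketch with hypothetical URLs:

```rust
#[derive(Debug)]
struct SearchResult {
    url: String,
}

fn main() {
    // dedup_by only collapses adjacent duplicates, so sort by URL first
    // to make equal URLs neighbors.
    let mut all_results = vec![
        SearchResult { url: "https://dev.example/wiki/B".to_string() },
        SearchResult { url: "https://dev.example/wiki/A".to_string() },
        SearchResult { url: "https://dev.example/wiki/B".to_string() },
    ];
    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    let urls: Vec<_> = all_results.iter().map(|r| r.url.as_str()).collect();
    assert_eq!(urls, ["https://dev.example/wiki/A", "https://dev.example/wiki/B"]);
}
```

One side effect of this approach: results are returned in URL order rather than relevance order.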
/// Fetch full wiki page content
@@ -151,55 +191,68 @@ pub async fn search_work_items(
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();

// Use WIQL (Work Item Query Language)
let wiql_url = format!(
"{}/_apis/wit/wiql?api-version=7.0",
org_url.trim_end_matches('/')
);
let expanded_queries = expand_query(query);

let wiql_query = format!(
"SELECT [System.Id], [System.Title], [System.Description], [System.State] FROM WorkItems WHERE [System.TeamProject] = '{project}' AND ([System.Title] CONTAINS '{query}' OR [System.Description] CONTAINS '{query}') ORDER BY [System.ChangedDate] DESC"
);
let mut all_results = Vec::new();

let wiql_body = serde_json::json!({
"query": wiql_query
});
for expanded_query in expanded_queries.iter().take(MAX_EXPANDED_QUERIES) {
// Use WIQL (Work Item Query Language)
let wiql_url = format!(
"{}/_apis/wit/wiql?api-version=7.0",
org_url.trim_end_matches('/')
);

tracing::info!("Searching Azure DevOps work items");
let safe_query = escape_wiql(expanded_query);
let wiql_query = format!(
"SELECT [System.Id], [System.Title], [System.Description], [System.State] FROM WorkItems WHERE [System.TeamProject] = '{project}' AND ([System.Title] ~ '{safe_query}' OR [System.Description] ~ '{safe_query}') ORDER BY [System.ChangedDate] DESC"
);

let resp = client
.post(&wiql_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&wiql_body)
.send()
.await
.map_err(|e| format!("ADO work item search failed: {e}"))?;
let wiql_body = serde_json::json!({
"query": wiql_query
});

if !resp.status().is_success() {
return Ok(Vec::new()); // Don't fail if work item search fails
}
tracing::info!(
"Searching Azure DevOps work items with query: {}",
expanded_query
);

let json: serde_json::Value = resp
.json()
.await
.map_err(|_| "Failed to parse work item response".to_string())?;
let resp = client
.post(&wiql_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&wiql_body)
.send()
.await
.map_err(|e| format!("ADO work item search failed: {e}"))?;

let mut results = Vec::new();
if !resp.status().is_success() {
continue; // Don't fail if work item search fails
}

if let Some(work_items) = json["workItems"].as_array() {
// Fetch details for top 3 work items
for item in work_items.iter().take(3) {
if let Some(id) = item["id"].as_i64() {
if let Ok(work_item) = fetch_work_item_details(org_url, id, &cookie_header).await {
results.push(work_item);
let json: serde_json::Value = resp
.json()
.await
.map_err(|_| "Failed to parse work item response".to_string())?;

if let Some(work_items) = json["workItems"].as_array() {
// Fetch details for top 3 work items
for item in work_items.iter().take(MAX_EXPANDED_QUERIES) {
if let Some(id) = item["id"].as_i64() {
if let Ok(work_item) =
fetch_work_item_details(org_url, id, &cookie_header).await
{
all_results.push(work_item);
}
}
}
}
}

Ok(results)
all_results.sort_by(|a, b| a.url.cmp(&b.url));
all_results.dedup_by(|a, b| a.url == b.url);

Ok(all_results)
}

/// Fetch work item details
@@ -263,3 +316,53 @@ async fn fetch_work_item_details(
        source: "Azure DevOps".to_string(),
    })
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_escape_wiql_escapes_single_quotes() {
        assert_eq!(escape_wiql("test'single"), "test''single");
    }

    #[test]
    fn test_escape_wiql_escapes_double_quotes() {
        assert_eq!(escape_wiql("test\"double"), "test\\\\\"double");
    }

    #[test]
    fn test_escape_wiql_escapes_backslash() {
        assert_eq!(escape_wiql("test\\backslash"), r#"test\\backslash"#);
    }

    #[test]
    fn test_escape_wiql_escapes_parens() {
        assert_eq!(escape_wiql("test(paren"), r#"test\(paren"#);
        assert_eq!(escape_wiql("test)paren"), r#"test\)paren"#);
    }

    #[test]
    fn test_escape_wiql_escapes_semicolon() {
        assert_eq!(escape_wiql("test;semi"), r#"test\;semi"#);
    }

    #[test]
    fn test_escape_wiql_escapes_equals() {
        assert_eq!(escape_wiql("test=equal"), r#"test\=equal"#);
    }

    #[test]
    fn test_escape_wiql_no_special_chars() {
        assert_eq!(escape_wiql("simple query"), "simple query");
    }

    #[test]
    fn test_strip_html_tags() {
        let html = "<p>Hello <strong>world</strong>!</p>";
        assert_eq!(strip_html_tags(html), "Hello world!");

        let html2 = "<div><h1>Title</h1><p>Content</p></div>";
        assert_eq!(strip_html_tags(html2), "TitleContent");
    }
}

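The body of `escape_wiql` is not shown in this hunk, but the tests above pin down its behavior. One hypothetical reconstruction consistent with all of them escapes the double quote *before* doubling backslashes (so `"` ends up as `\\"` and a literal `\` as `\\`), then escapes the remaining metacharacters:

```rust
// Hypothetical reconstruction of escape_wiql, inferred only from the tests
// in the diff above -- not the actual committed implementation.
// Order matters: the quote is escaped before backslashes are doubled.
fn escape_wiql(s: &str) -> String {
    s.replace('\'', "''")
        .replace('"', "\\\"")
        .replace('\\', "\\\\")
        .replace('(', "\\(")
        .replace(')', "\\)")
        .replace(';', "\\;")
        .replace('=', "\\=")
}

fn main() {
    // Mirrors the test cases in the hunk above.
    assert_eq!(escape_wiql("test'single"), "test''single");
    assert_eq!(escape_wiql("test\"double"), "test\\\\\"double");
    assert_eq!(escape_wiql("test\\backslash"), r#"test\\backslash"#);
    assert_eq!(escape_wiql("test(paren"), r#"test\(paren"#);
    assert_eq!(escape_wiql("simple query"), "simple query");
}
```

Doubling the single quote is the load-bearing part for WIQL, since the query value is interpolated between single quotes in the `format!` call above.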
@@ -1,4 +1,9 @@
use serde::{Deserialize, Serialize};
use url::Url;

use super::query_expansion::expand_query;

const MAX_EXPANDED_QUERIES: usize = 3;

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SearchResult {
@@ -6,10 +11,36 @@ pub struct SearchResult {
    pub url: String,
    pub excerpt: String,
    pub content: Option<String>,
    pub source: String, // "confluence", "servicenow", "azuredevops"
    pub source: String,
}

fn canonicalize_url(url: &str) -> String {
    Url::parse(url)
        .ok()
        .map(|u| {
            let mut u = u.clone();
            u.set_fragment(None);
            u.set_query(None);
            u.to_string()
        })
        .unwrap_or_else(|| url.to_string())
}

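The `canonicalize_url` helper above parses with the `url` crate and falls back to the raw string when parsing fails. A std-only simplified stand-in (it just cuts at the first `?` or `#`, with no parse-failure fallback) shows how the canonical form drives deduplication:

```rust
// Simplified, std-only stand-in for the url-crate-based canonicalize_url
// in the diff: strip the query string and fragment so the same page
// reached via different links compares equal.
fn canonicalize_url(url: &str) -> String {
    let end = url.find(|c| c == '?' || c == '#').unwrap_or(url.len());
    url[..end].to_string()
}

fn main() {
    // Results that differ only in query/fragment collapse together.
    let mut urls = vec![
        "https://example.com/page?param=value".to_string(),
        "https://example.com/page#section".to_string(),
        "https://example.com/other".to_string(),
    ];
    urls.sort_by_key(|u| canonicalize_url(u));
    urls.dedup_by(|a, b| canonicalize_url(a) == canonicalize_url(b));
    assert_eq!(urls.len(), 2);
}
```

The real helper additionally benefits from the `url` crate's normalization (scheme/host lowercasing, default-port removal), which this sketch does not attempt.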
fn escape_cql(s: &str) -> String {
    s.replace('"', "\\\"")
        .replace(')', "\\)")
        .replace('(', "\\(")
        .replace('~', "\\~")
        .replace('&', "\\&")
        .replace('|', "\\|")
        .replace('+', "\\+")
        .replace('-', "\\-")
}

/// Search Confluence for content matching the query
///
/// This function expands the user query with related terms, synonyms, and variations
/// to improve search coverage across Confluence spaces.
pub async fn search_confluence(
    base_url: &str,
    query: &str,
@@ -18,86 +49,89 @@ pub async fn search_confluence(
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();

// Use Confluence CQL search
let search_url = format!(
"{}/rest/api/search?cql=text~\"{}\"&limit=5",
base_url.trim_end_matches('/'),
urlencoding::encode(query)
);
let expanded_queries = expand_query(query);

tracing::info!("Searching Confluence: {}", search_url);
let mut all_results = Vec::new();

let resp = client
.get(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Confluence search request failed: {e}"))?;
for expanded_query in expanded_queries.iter().take(MAX_EXPANDED_QUERIES) {
let safe_query = escape_cql(expanded_query);
let search_url = format!(
"{}/rest/api/search?cql=text~\"{}\"&limit=5",
base_url.trim_end_matches('/'),
urlencoding::encode(&safe_query)
);

if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!(
"Confluence search failed with status {status}: {text}"
));
}
tracing::info!(
"Searching Confluence with expanded query: {}",
expanded_query
);

let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Confluence search response: {e}"))?;
let resp = client
.get(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Confluence search request failed: {e}"))?;

let mut results = Vec::new();
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
tracing::warn!("Confluence search failed with status {status}: {text}");
continue;
}

if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(3) {
// Take top 3 results
let title = item["title"].as_str().unwrap_or("Untitled").to_string();
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Confluence search response: {e}"))?;

let id = item["content"]["id"].as_str();
let space_key = item["content"]["space"]["key"].as_str();
if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(MAX_EXPANDED_QUERIES) {
let title = item["title"].as_str().unwrap_or("Untitled").to_string();

// Build URL
let url = if let (Some(id_str), Some(space)) = (id, space_key) {
format!(
"{}/display/{}/{}",
base_url.trim_end_matches('/'),
space,
id_str
)
} else {
base_url.to_string()
};
let id = item["content"]["id"].as_str();
let space_key = item["content"]["space"]["key"].as_str();

// Get excerpt from search result
let excerpt = item["excerpt"]
.as_str()
.unwrap_or("")
.to_string()
.replace("<span class=\"highlight\">", "")
.replace("</span>", "");
let url = if let (Some(id_str), Some(space)) = (id, space_key) {
format!(
"{}/display/{}/{}",
base_url.trim_end_matches('/'),
space,
id_str
)
} else {
base_url.to_string()
};

// Fetch full page content
let content = if let Some(content_id) = id {
fetch_page_content(base_url, content_id, &cookie_header)
.await
.ok()
} else {
None
};
let excerpt = strip_html_tags(item["excerpt"].as_str().unwrap_or(""))
.chars()
.take(300)
.collect::<String>();

results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Confluence".to_string(),
});
let content = if let Some(content_id) = id {
fetch_page_content(base_url, content_id, &cookie_header)
.await
.ok()
} else {
None
};

all_results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Confluence".to_string(),
});
}
}
}

Ok(results)
all_results.sort_by(|a, b| canonicalize_url(&a.url).cmp(&canonicalize_url(&b.url)));
all_results.dedup_by(|a, b| canonicalize_url(&a.url) == canonicalize_url(&b.url));

Ok(all_results)
}

/// Fetch full content of a Confluence page
@@ -185,4 +219,43 @@ mod tests {
        let html2 = "<div><h1>Title</h1><p>Content</p></div>";
        assert_eq!(strip_html_tags(html2), "TitleContent");
    }

    #[test]
    fn test_escape_cql_escapes_special_chars() {
        assert_eq!(escape_cql("test\"quote"), r#"test\"quote"#);
        assert_eq!(escape_cql("test(paren"), r#"test\(paren"#);
        assert_eq!(escape_cql("test)paren"), r#"test\)paren"#);
        assert_eq!(escape_cql("test~tilde"), r#"test\~tilde"#);
        assert_eq!(escape_cql("test&and"), r#"test\&and"#);
        assert_eq!(escape_cql("test|or"), r#"test\|or"#);
        assert_eq!(escape_cql("test+plus"), r#"test\+plus"#);
        assert_eq!(escape_cql("test-minus"), r#"test\-minus"#);
    }

    #[test]
    fn test_escape_cql_no_special_chars() {
        assert_eq!(escape_cql("simple query"), "simple query");
    }

    #[test]
    fn test_canonicalize_url_removes_fragment() {
        assert_eq!(
            canonicalize_url("https://example.com/page#section"),
            "https://example.com/page"
        );
    }

    #[test]
    fn test_canonicalize_url_removes_query() {
        assert_eq!(
            canonicalize_url("https://example.com/page?param=value"),
            "https://example.com/page"
        );
    }

    #[test]
    fn test_canonicalize_url_handles_malformed() {
        // Malformed URLs fall back to original
        assert_eq!(canonicalize_url("not a url"), "not a url");
    }
}

@@ -4,6 +4,7 @@ pub mod azuredevops_search;
pub mod callback_server;
pub mod confluence;
pub mod confluence_search;
pub mod query_expansion;
pub mod servicenow;
pub mod servicenow_search;
pub mod webview_auth;

290
src-tauri/src/integrations/query_expansion.rs
Normal file
@@ -0,0 +1,290 @@
/// Query expansion module for integration search
///
/// This module provides functionality to expand user queries with related terms,
/// synonyms, and variations to improve search results across integrations like
/// Confluence, ServiceNow, and Azure DevOps.
use std::collections::HashSet;

/// Product name synonyms for common product variations
/// Maps common abbreviations/variants to their full names for search expansion
fn get_product_synonyms(query: &str) -> Vec<String> {
    let mut synonyms = Vec::new();

    // VESTA NXT related synonyms
    if query.to_lowercase().contains("vesta") || query.to_lowercase().contains("vnxt") {
        synonyms.extend(vec![
            "VESTA NXT".to_string(),
            "Vesta NXT".to_string(),
            "VNXT".to_string(),
            "vnxt".to_string(),
            "Vesta".to_string(),
            "vesta".to_string(),
            "VNX".to_string(),
            "vnx".to_string(),
        ]);
    }

    // Version number patterns (e.g., 1.0.12, 1.1.9)
    if query.contains('.') {
        // Extract version-like patterns and add variations
        let version_parts: Vec<&str> = query.split('.').collect();
        if version_parts.len() >= 2 {
            // Add variations without dots
            let version_no_dots = version_parts.join("");
            synonyms.push(version_no_dots);

            // Add partial versions
            if version_parts.len() >= 2 {
                synonyms.push(version_parts[0..2].join("."));
            }
            if version_parts.len() >= 3 {
                synonyms.push(version_parts[0..3].join("."));
            }
        }
    }

    // Common upgrade-related terms
    if query.to_lowercase().contains("upgrade") || query.to_lowercase().contains("update") {
        synonyms.extend(vec![
            "upgrade".to_string(),
            "update".to_string(),
            "migration".to_string(),
            "patch".to_string(),
            "version".to_string(),
            "install".to_string(),
            "installation".to_string(),
        ]);
    }

    // Remove duplicates and empty strings
    synonyms.sort();
    synonyms.dedup();
    synonyms.retain(|s| !s.is_empty());

    synonyms
}

/// Expand a search query with related terms for better search coverage
///
/// This function takes a user query and expands it with:
/// - Product name synonyms (e.g., "VNXT" -> "VESTA NXT", "Vesta NXT")
/// - Version number variations
/// - Related terms based on query content
///
/// # Arguments
/// * `query` - The original user query
///
/// # Returns
/// A vector of query strings to search, with the original query first
/// followed by expanded variations. Returns empty only if input is empty or
/// whitespace-only. Otherwise, always returns at least the original query.
pub fn expand_query(query: &str) -> Vec<String> {
    if query.trim().is_empty() {
        return Vec::new();
    }

    let mut expanded = vec![query.to_string()];

    // Get product synonyms
    let product_synonyms = get_product_synonyms(query);
    expanded.extend(product_synonyms);

    // Extract keywords from query for additional expansion
    let keywords = extract_keywords(query);

    // Add keyword variations
    for keyword in keywords.iter().take(5) {
        if !expanded.contains(keyword) {
            expanded.push(keyword.clone());
        }
    }

    // Add common related terms based on query content
    let query_lower = query.to_lowercase();

    if query_lower.contains("confluence") || query_lower.contains("documentation") {
        expanded.push("docs".to_string());
        expanded.push("manual".to_string());
        expanded.push("guide".to_string());
    }

    if query_lower.contains("deploy") || query_lower.contains("deployment") {
        expanded.push("deploy".to_string());
        expanded.push("deployment".to_string());
        expanded.push("release".to_string());
        expanded.push("build".to_string());
    }

    if query_lower.contains("kubernetes") || query_lower.contains("k8s") {
        expanded.push("kubernetes".to_string());
        expanded.push("k8s".to_string());
        expanded.push("pod".to_string());
        expanded.push("container".to_string());
    }

    // Remove duplicates and empty strings
    expanded.sort();
    expanded.dedup();
    expanded.retain(|s| !s.is_empty());

    expanded
}

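Each integration caps the list returned by `expand_query` at `MAX_EXPANDED_QUERIES` (3) before issuing requests. A minimal sketch of that contract, with a stub expansion standing in for the real `expand_query` (the stub's variant rules are invented for illustration):

```rust
const MAX_EXPANDED_QUERIES: usize = 3;

// Stub standing in for expand_query: keep the original query, add
// lowercased word variants, deduplicate, and return nothing for blank input.
fn expand_query(query: &str) -> Vec<String> {
    if query.trim().is_empty() {
        return Vec::new();
    }
    let mut expanded = vec![query.to_string()];
    for word in query.split_whitespace() {
        expanded.push(word.to_lowercase());
    }
    expanded.sort();
    expanded.dedup();
    expanded
}

fn main() {
    // Each search integration only issues the first MAX_EXPANDED_QUERIES variants.
    let queries = expand_query("Upgrade Widget 1.0.12");
    let issued: Vec<_> = queries.iter().take(MAX_EXPANDED_QUERIES).collect();
    assert!(issued.len() <= MAX_EXPANDED_QUERIES);

    // Blank input short-circuits, so the calling loops run zero iterations.
    assert!(expand_query("   ").is_empty());
}
```

Note that the final `sort()`/`dedup()` pass (in the stub and in the real function) reorders the list, so the variants actually issued are the first three in sorted order, not necessarily the original query plus its two best synonyms.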
/// Extract important keywords from a search query
///
/// This function removes stop words and extracts meaningful terms
/// for search expansion.
///
/// # Arguments
/// * `query` - The original user query
///
/// # Returns
/// A vector of extracted keywords
fn extract_keywords(query: &str) -> Vec<String> {
    let stop_words: HashSet<&str> = [
        "how", "do", "i", "the", "a", "an", "is", "are", "was", "were", "be", "been", "being",
        "have", "has", "had", "having", "do", "does", "did", "doing", "will", "would", "should",
        "could", "can", "may", "might", "must", "to", "from", "in", "on", "at", "by", "for",
        "with", "about", "as", "of", "or", "and", "but", "not", "what", "when", "where", "which",
        "who", "this", "that", "these", "those", "if", "then", "else", "for", "while", "until",
        "against", "between", "into", "through", "during", "before", "after", "above", "below",
        "up", "down", "out", "off", "over", "under", "again", "further", "then", "once", "here",
        "there", "why", "where", "all", "any", "both", "each", "few", "more", "most", "other",
        "some", "such", "no", "nor", "only", "own", "same", "so", "than", "too", "very", "can",
        "just", "should", "now",
    ]
    .into_iter()
    .collect();

    let mut keywords = Vec::new();
    let mut remaining = query.to_string();

    while !remaining.is_empty() {
        // Skip leading whitespace
        if remaining.starts_with(char::is_whitespace) {
            remaining = remaining.trim_start().to_string();
            continue;
        }

        // Try to extract version number (e.g., 1.0.12, 1.1.9)
        if remaining.starts_with(|c: char| c.is_ascii_digit()) {
            let mut end_pos = 0;
            let mut dot_count = 0;

            for (i, c) in remaining.chars().enumerate() {
                if c.is_ascii_digit() {
                    end_pos = i + 1;
                } else if c == '.' {
                    end_pos = i + 1;
                    dot_count += 1;
                } else {
                    break;
                }
            }

            // Only extract if we have at least 2 dots (e.g., 1.0.12)
            if dot_count >= 2 && end_pos > 0 {
                let version = remaining[..end_pos].to_string();
                keywords.push(version.clone());
                remaining = remaining[end_pos..].to_string();
                continue;
            }
        }

        // Find word boundary - split on whitespace or non-alphanumeric
        let mut split_pos = remaining.len();
        for (i, c) in remaining.chars().enumerate() {
            if c.is_whitespace() || !c.is_alphanumeric() {
                split_pos = i;
                break;
            }
        }

        // If split_pos is 0, the string starts with a non-alphanumeric character
        // Skip it and continue
        if split_pos == 0 {
            remaining = remaining[1..].to_string();
            continue;
        }

        let word = remaining[..split_pos].to_lowercase();
        remaining = remaining[split_pos..].to_string();

        // Skip empty words, single chars, and stop words
        if word.is_empty() || word.len() < 2 || stop_words.contains(word.as_str()) {
            continue;
        }

        // Add numeric words with 3+ digits
        if word.chars().all(|c| c.is_ascii_digit()) && word.len() >= 3 {
            keywords.push(word.clone());
            continue;
        }

        // Add words with at least one alphabetic character
        if word.chars().any(|c| c.is_alphabetic()) {
            keywords.push(word.clone());
        }
    }

    keywords.sort();
    keywords.dedup();

    keywords
}

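The version-number scan inside `extract_keywords` can be isolated into a small helper, a minimal sketch of the same dot-counting logic: consume digits and dots from the front, and treat the run as a version token only when it contains at least two dots.

```rust
// Minimal sketch of the version-number scan in extract_keywords above:
// consume leading digits and dots; accept the run as a version token
// only when it has >= 2 dots (e.g. "1.0.12", but not "1.9").
fn leading_version(s: &str) -> Option<&str> {
    let mut end_pos = 0;
    let mut dot_count = 0;
    for (i, c) in s.chars().enumerate() {
        if c.is_ascii_digit() {
            end_pos = i + 1;
        } else if c == '.' {
            end_pos = i + 1;
            dot_count += 1;
        } else {
            break;
        }
    }
    (dot_count >= 2 && end_pos > 0).then(|| &s[..end_pos])
}

fn main() {
    assert_eq!(leading_version("1.0.12 upgrade"), Some("1.0.12"));
    // A single dot is not enough to count as a version token.
    assert_eq!(leading_version("1.9 notes"), None);
}
```

As in the original, a trailing dot is included in the token (`"1.0.12."` scans to `"1.0.12."`); the two-dot threshold is what keeps plain decimals like `1.9` out.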
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_expand_query_with_product_synonyms() {
        let query = "upgrade vesta nxt to 1.1.9";
        let expanded = expand_query(query);

        // Should contain original query
        assert!(expanded.contains(&query.to_string()));

        // Should contain product synonyms
        assert!(expanded
            .iter()
            .any(|s| s.contains("VNXT") || s.contains("vnxt")));
    }

    #[test]
    fn test_expand_query_with_version_numbers() {
        let query = "version 1.0.12";
        let expanded = expand_query(query);

        // Should contain original query
        assert!(expanded.contains(&query.to_string()));
    }

    #[test]
    fn test_extract_keywords() {
        let query = "How do I upgrade VESTA NXT from 1.0.12 to 1.1.9?";
        let keywords = extract_keywords(query);

        assert!(keywords.contains(&"upgrade".to_string()));
        assert!(keywords.contains(&"vesta".to_string()));
        assert!(keywords.contains(&"nxt".to_string()));
        assert!(keywords.contains(&"1.0.12".to_string()));
        assert!(keywords.contains(&"1.1.9".to_string()));
    }

    #[test]
    fn test_product_synonyms() {
        let synonyms = get_product_synonyms("vesta nxt upgrade");

        // Should contain VNXT synonym
        assert!(synonyms
            .iter()
            .any(|s| s.contains("VNXT") || s.contains("vnxt")));
    }

    #[test]
    fn test_empty_query() {
        let expanded = expand_query("");
        assert!(expanded.is_empty() || expanded.contains(&"".to_string()));
    }
}
@@ -1,4 +1,7 @@
use super::confluence_search::SearchResult;
use crate::integrations::query_expansion::expand_query;

const MAX_EXPANDED_QUERIES: usize = 3;

/// Search ServiceNow Knowledge Base for content matching the query
pub async fn search_servicenow(
@ -9,82 +12,88 @@ pub async fn search_servicenow(
|
||||
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
// Search Knowledge Base articles
|
||||
let search_url = format!(
|
||||
"{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=5",
|
||||
instance_url.trim_end_matches('/'),
|
||||
urlencoding::encode(query),
|
||||
urlencoding::encode(query)
|
||||
);
|
||||
let expanded_queries = expand_query(query);
|
||||
|
||||
tracing::info!("Searching ServiceNow: {}", search_url);
|
||||
let mut all_results = Vec::new();
|
||||
|
||||
let resp = client
|
||||
.get(&search_url)
|
||||
.header("Cookie", &cookie_header)
|
||||
.header("Accept", "application/json")
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("ServiceNow search request failed: {e}"))?;
|
||||
for expanded_query in expanded_queries.iter().take(MAX_EXPANDED_QUERIES) {
|
||||
// Search Knowledge Base articles
|
||||
let search_url = format!(
|
||||
"{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=5",
|
||||
instance_url.trim_end_matches('/'),
|
||||
urlencoding::encode(expanded_query),
|
||||
urlencoding::encode(expanded_query)
|
||||
);
|
||||
|
||||
if !resp.status().is_success() {
|
||||
let status = resp.status();
|
||||
let text = resp.text().await.unwrap_or_default();
|
||||
return Err(format!(
|
||||
"ServiceNow search failed with status {status}: {text}"
|
||||
));
|
||||
}
|
||||
tracing::info!("Searching ServiceNow with query: {}", expanded_query);
|
||||
|
||||
let json: serde_json::Value = resp
|
||||
.json()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to parse ServiceNow search response: {e}"))?;
|
||||
let resp = client
|
||||
.get(&search_url)
|
||||
.header("Cookie", &cookie_header)
|
||||
.header("Accept", "application/json")
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("ServiceNow search request failed: {e}"))?;
|
||||
|
||||
let mut results = Vec::new();
|
||||
if !resp.status().is_success() {
|
||||
let status = resp.status();
|
||||
let text = resp.text().await.unwrap_or_default();
|
||||
tracing::warn!("ServiceNow search failed with status {status}: {text}");
|
||||
continue;
|
||||
}
|
||||
|
||||
if let Some(result_array) = json["result"].as_array() {
|
||||
for item in result_array.iter().take(3) {
|
||||
// Take top 3 results
|
||||
let title = item["short_description"]
|
||||
.as_str()
|
||||
.unwrap_or("Untitled")
|
||||
            .to_string();
        let json: serde_json::Value = resp
            .json()
            .await
            .map_err(|e| format!("Failed to parse ServiceNow search response: {e}"))?;

        if let Some(result_array) = json["result"].as_array() {
            for item in result_array.iter().take(MAX_EXPANDED_QUERIES) {
                // Take top 3 results
                let title = item["short_description"]
                    .as_str()
                    .unwrap_or("Untitled")
                    .to_string();
                let sys_id = item["sys_id"].as_str().unwrap_or("").to_string();
                let url = format!(
                    "{}/kb_view.do?sysparm_article={}",
                    instance_url.trim_end_matches('/'),
                    sys_id
                );

                let excerpt = item["text"]
                    .as_str()
                    .unwrap_or("")
                    .chars()
                    .take(300)
                    .collect::<String>();

                // Get full article content
                let content = item["text"].as_str().map(|text| {
                    if text.len() > 3000 {
                        format!("{}...", &text[..3000])
                    } else {
                        text.to_string()
                    }
                });

                all_results.push(SearchResult {
                    title,
                    url,
                    excerpt,
                    content,
                    source: "ServiceNow".to_string(),
                });
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    Ok(all_results)
}
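Reviewer note: this patch repeatedly truncates article text with `format!("{}...", &text[..3000])`. Byte-indexed slicing panics if byte 3000 lands inside a multi-byte UTF-8 character, which can happen with KB articles containing non-ASCII text. A minimal boundary-safe alternative (the helper name is ours, not from the patch):

```rust
/// Truncate to at most `max_bytes`, backing up to a UTF-8 character
/// boundary so the slice can never panic mid-codepoint.
fn truncate_utf8(text: &str, max_bytes: usize) -> String {
    if text.len() <= max_bytes {
        return text.to_string();
    }
    let mut end = max_bytes;
    // `is_char_boundary` is O(1); at most 3 steps back are ever needed.
    while !text.is_char_boundary(end) {
        end -= 1;
    }
    format!("{}...", &text[..end])
}

fn main() {
    // ASCII: cut exactly at the limit.
    println!("{}", truncate_utf8("abcdef", 4)); // abcd...
    // Multi-byte: "é" is 2 bytes, so a naive &s[..3] would panic here.
    println!("{}", truncate_utf8("ééé", 3));
}
```

The same pattern would apply to every `&text[..3000]` and `&clean_description[..3000]` site in this diff.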

/// Search ServiceNow Incidents for related issues
@@ -96,68 +105,78 @@ pub async fn search_incidents(
    let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
    let client = reqwest::Client::new();

    let expanded_queries = expand_query(query);

    let mut all_results = Vec::new();

    for expanded_query in expanded_queries.iter().take(MAX_EXPANDED_QUERIES) {
        // Search incidents
        let search_url = format!(
            "{}/api/now/table/incident?sysparm_query=short_descriptionLIKE{}^ORdescriptionLIKE{}&sysparm_limit=3&sysparm_display_value=true",
            instance_url.trim_end_matches('/'),
            urlencoding::encode(expanded_query),
            urlencoding::encode(expanded_query)
        );

        tracing::info!(
            "Searching ServiceNow incidents with query: {}",
            expanded_query
        );

        let resp = client
            .get(&search_url)
            .header("Cookie", &cookie_header)
            .header("Accept", "application/json")
            .send()
            .await
            .map_err(|e| format!("ServiceNow incident search failed: {e}"))?;

        if !resp.status().is_success() {
            continue; // Don't fail if incident search fails
        }

        let json: serde_json::Value = resp
            .json()
            .await
            .map_err(|_| "Failed to parse incident response".to_string())?;

        if let Some(result_array) = json["result"].as_array() {
            for item in result_array.iter() {
                let number = item["number"].as_str().unwrap_or("Unknown");
                let title = format!(
                    "Incident {}: {}",
                    number,
                    item["short_description"].as_str().unwrap_or("No title")
                );

                let sys_id = item["sys_id"].as_str().unwrap_or("");
                let url = format!(
                    "{}/incident.do?sys_id={}",
                    instance_url.trim_end_matches('/'),
                    sys_id
                );

                let description = item["description"].as_str().unwrap_or("").to_string();
                let resolution = item["close_notes"].as_str().unwrap_or("").to_string();

                let content = format!("Description: {description}\nResolution: {resolution}");

                let excerpt = content.chars().take(200).collect::<String>();

                all_results.push(SearchResult {
                    title,
                    url,
                    excerpt,
                    content: Some(content),
                    source: "ServiceNow".to_string(),
                });
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    Ok(all_results)
}
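The `sort_by` / `dedup_by` pair added at the end of every search function is the standard Vec dedup idiom: `Vec::dedup_by` only removes *consecutive* duplicates, which is why the sort on URL must come first. A minimal sketch of the same pattern (the `Hit` struct is ours, standing in for `SearchResult`):

```rust
#[derive(Debug, Clone, PartialEq)]
struct Hit {
    url: String,
    title: String,
}

/// Sort so equal URLs become adjacent, then drop consecutive duplicates,
/// keeping the first occurrence of each URL.
fn dedup_by_url(mut hits: Vec<Hit>) -> Vec<Hit> {
    hits.sort_by(|a, b| a.url.cmp(&b.url));
    hits.dedup_by(|a, b| a.url == b.url);
    hits
}

fn main() {
    let hits = vec![
        Hit { url: "https://kb/2".into(), title: "B".into() },
        Hit { url: "https://kb/1".into(), title: "A".into() },
        Hit { url: "https://kb/2".into(), title: "B again".into() },
    ];
    println!("{:?}", dedup_by_url(hits)); // two hits remain, ordered by URL
}
```

One tradeoff worth noting: sorting by URL discards whatever relevance ordering each backend returned; a `HashSet` of seen URLs would dedup while preserving the original order, at the cost of an extra allocation.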

@@ -6,6 +6,7 @@ use serde_json::Value;
use tauri::WebviewWindow;

use super::confluence_search::SearchResult;
use crate::integrations::query_expansion::expand_query;

/// Execute an HTTP request from within the webview context
/// This automatically includes all cookies (including HttpOnly) from the authenticated session
@@ -123,106 +124,113 @@ pub async fn search_confluence_webview<R: tauri::Runtime>(
    base_url: &str,
    query: &str,
) -> Result<Vec<SearchResult>, String> {
    let expanded_queries = expand_query(query);

    let mut all_results = Vec::new();

    for expanded_query in expanded_queries.iter().take(3) {
        // Extract keywords from the query for better search
        // Remove common words and extract important terms
        let keywords = extract_keywords(expanded_query);

        // Build CQL query with OR logic for keywords
        let cql = if keywords.len() > 1 {
            // Multiple keywords - search for any of them
            let keyword_conditions: Vec<String> =
                keywords.iter().map(|k| format!("text ~ \"{k}\"")).collect();
            keyword_conditions.join(" OR ")
        } else if !keywords.is_empty() {
            // Single keyword
            let keyword = &keywords[0];
            format!("text ~ \"{keyword}\"")
        } else {
            // Fallback to expanded query
            format!("text ~ \"{expanded_query}\"")
        };

        let search_url = format!(
            "{}/rest/api/search?cql={}&limit=10",
            base_url.trim_end_matches('/'),
            urlencoding::encode(&cql)
        );

        tracing::info!("Executing Confluence search via webview with CQL: {}", cql);

        let response = fetch_from_webview(webview_window, &search_url, "GET", None).await?;

        if let Some(results_array) = response.get("results").and_then(|v| v.as_array()) {
            for item in results_array.iter().take(5) {
                let title = item["title"].as_str().unwrap_or("Untitled").to_string();
                let content_id = item["content"]["id"].as_str();
                let space_key = item["content"]["space"]["key"].as_str();

                let url = if let (Some(id), Some(space)) = (content_id, space_key) {
                    format!(
                        "{}/display/{}/{}",
                        base_url.trim_end_matches('/'),
                        space,
                        id
                    )
                } else {
                    base_url.to_string()
                };

                let excerpt = item["excerpt"]
                    .as_str()
                    .unwrap_or("")
                    .replace("<span class=\"highlight\">", "")
                    .replace("</span>", "");

                // Fetch full page content
                let content = if let Some(id) = content_id {
                    let content_url = format!(
                        "{}/rest/api/content/{id}?expand=body.storage",
                        base_url.trim_end_matches('/')
                    );
                    if let Ok(content_resp) =
                        fetch_from_webview(webview_window, &content_url, "GET", None).await
                    {
                        if let Some(body) = content_resp
                            .get("body")
                            .and_then(|b| b.get("storage"))
                            .and_then(|s| s.get("value"))
                            .and_then(|v| v.as_str())
                        {
                            let text = strip_html_simple(body);
                            Some(if text.len() > 3000 {
                                format!("{}...", &text[..3000])
                            } else {
                                text
                            })
                        } else {
                            None
                        }
                    } else {
                        None
                    }
                } else {
                    None
                };

                all_results.push(SearchResult {
                    title,
                    url,
                    excerpt: excerpt.chars().take(300).collect(),
                    content,
                    source: "Confluence".to_string(),
                });
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    tracing::info!(
        "Confluence webview search returned {} results",
        all_results.len()
    );
    Ok(all_results)
}
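The CQL construction above (multiple keywords joined with OR, single keyword quoted, raw query as fallback) can be isolated as a pure function, which makes the three branches easy to check. A sketch under the same assumptions, with the function name ours:

```rust
/// Build a Confluence CQL text clause from extracted keywords,
/// falling back to the raw query when no keywords survive filtering.
fn build_cql(keywords: &[String], fallback: &str) -> String {
    if keywords.len() > 1 {
        // Multiple keywords: match any of them.
        keywords
            .iter()
            .map(|k| format!("text ~ \"{k}\""))
            .collect::<Vec<_>>()
            .join(" OR ")
    } else if let Some(keyword) = keywords.first() {
        // Single keyword.
        format!("text ~ \"{keyword}\"")
    } else {
        // Fallback to the raw query.
        format!("text ~ \"{fallback}\"")
    }
}

fn main() {
    let kws = vec!["timeout".to_string(), "proxy".to_string()];
    println!("{}", build_cql(&kws, "query"));
}
```

Note that neither this sketch nor the patch escapes embedded double quotes in a keyword, so a keyword containing `"` would produce malformed CQL.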

/// Extract keywords from a search query
@@ -296,92 +304,99 @@ pub async fn search_servicenow_webview<R: tauri::Runtime>(
    instance_url: &str,
    query: &str,
) -> Result<Vec<SearchResult>, String> {
    let expanded_queries = expand_query(query);

    let mut all_results = Vec::new();

    for expanded_query in expanded_queries.iter().take(3) {
        // Search knowledge base
        let kb_url = format!(
            "{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=3",
            instance_url.trim_end_matches('/'),
            urlencoding::encode(expanded_query),
            urlencoding::encode(expanded_query)
        );

        tracing::info!("Executing ServiceNow KB search via webview with expanded query");

        if let Ok(kb_response) = fetch_from_webview(webview_window, &kb_url, "GET", None).await {
            if let Some(kb_array) = kb_response.get("result").and_then(|v| v.as_array()) {
                for item in kb_array {
                    let title = item["short_description"]
                        .as_str()
                        .unwrap_or("Untitled")
                        .to_string();
                    let sys_id = item["sys_id"].as_str().unwrap_or("");
                    let url = format!(
                        "{}/kb_view.do?sysparm_article={sys_id}",
                        instance_url.trim_end_matches('/')
                    );
                    let text = item["text"].as_str().unwrap_or("");
                    let excerpt = text.chars().take(300).collect();
                    let content = Some(if text.len() > 3000 {
                        format!("{}...", &text[..3000])
                    } else {
                        text.to_string()
                    });

                    all_results.push(SearchResult {
                        title,
                        url,
                        excerpt,
                        content,
                        source: "ServiceNow".to_string(),
                    });
                }
            }
        }

        // Search incidents
        let inc_url = format!(
            "{}/api/now/table/incident?sysparm_query=short_descriptionLIKE{}^ORdescriptionLIKE{}&sysparm_limit=3&sysparm_display_value=true",
            instance_url.trim_end_matches('/'),
            urlencoding::encode(expanded_query),
            urlencoding::encode(expanded_query)
        );

        if let Ok(inc_response) = fetch_from_webview(webview_window, &inc_url, "GET", None).await {
            if let Some(inc_array) = inc_response.get("result").and_then(|v| v.as_array()) {
                for item in inc_array {
                    let number = item["number"].as_str().unwrap_or("Unknown");
                    let title = format!(
                        "Incident {}: {}",
                        number,
                        item["short_description"].as_str().unwrap_or("No title")
                    );
                    let sys_id = item["sys_id"].as_str().unwrap_or("");
                    let url = format!(
                        "{}/incident.do?sys_id={sys_id}",
                        instance_url.trim_end_matches('/')
                    );
                    let description = item["description"].as_str().unwrap_or("");
                    let resolution = item["close_notes"].as_str().unwrap_or("");
                    let content = format!("Description: {description}\nResolution: {resolution}");
                    let excerpt = content.chars().take(200).collect();

                    all_results.push(SearchResult {
                        title,
                        url,
                        excerpt,
                        content: Some(content),
                        source: "ServiceNow".to_string(),
                    });
                }
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    tracing::info!(
        "ServiceNow webview search returned {} results",
        all_results.len()
    );
    Ok(all_results)
}

/// Search Azure DevOps wiki using webview fetch
@@ -391,82 +406,89 @@ pub async fn search_azuredevops_wiki_webview<R: tauri::Runtime>(
    project: &str,
    query: &str,
) -> Result<Vec<SearchResult>, String> {
    let expanded_queries = expand_query(query);

    let mut all_results = Vec::new();

    for expanded_query in expanded_queries.iter().take(3) {
        // Extract keywords for better search
        let keywords = extract_keywords(expanded_query);

        let search_text = if !keywords.is_empty() {
            keywords.join(" ")
        } else {
            expanded_query.clone()
        };

        // Azure DevOps wiki search API
        let search_url = format!(
            "{}/{}/_apis/wiki/wikis?api-version=7.0",
            org_url.trim_end_matches('/'),
            urlencoding::encode(project)
        );

        tracing::info!(
            "Executing Azure DevOps wiki search via webview for: {}",
            search_text
        );

        // First, get list of wikis
        let wikis_response = fetch_from_webview(webview_window, &search_url, "GET", None).await?;

        if let Some(wikis_array) = wikis_response.get("value").and_then(|v| v.as_array()) {
            // Search each wiki
            for wiki in wikis_array.iter().take(3) {
                let wiki_id = wiki["id"].as_str().unwrap_or("");

                if wiki_id.is_empty() {
                    continue;
                }

                // Search wiki pages
                let pages_url = format!(
                    "{}/{}/_apis/wiki/wikis/{}/pages?recursionLevel=Full&includeContent=true&api-version=7.0",
                    org_url.trim_end_matches('/'),
                    urlencoding::encode(project),
                    urlencoding::encode(wiki_id)
                );

                if let Ok(pages_response) =
                    fetch_from_webview(webview_window, &pages_url, "GET", None).await
                {
                    // Try to get "page" field, or use the response itself if it's the page object
                    if let Some(page) = pages_response.get("page") {
                        search_page_recursive(
                            page,
                            &search_text,
                            org_url,
                            project,
                            wiki_id,
                            &mut all_results,
                        );
                    } else {
                        // Response might be the page object itself
                        search_page_recursive(
                            &pages_response,
                            &search_text,
                            org_url,
                            project,
                            wiki_id,
                            &mut all_results,
                        );
                    }
                }
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    tracing::info!(
        "Azure DevOps wiki webview search returned {} results",
        all_results.len()
    );
    Ok(all_results)
}
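Every function in this change fans out over `expand_query(query)` and caps the expansions with `.take(3)` (or `MAX_EXPANDED_QUERIES`). The implementation of `crate::integrations::query_expansion::expand_query` is not part of this diff; the callers only assume it returns a `Vec<String>` of query variants. Purely as a hypothetical stand-in illustrating that contract (the expansion rules here are invented, not the project's):

```rust
/// Hypothetical stand-in for `query_expansion::expand_query` -- the real
/// implementation is not shown in this diff. Callers only assume it
/// returns a list of query variants, original query first.
fn expand_query(query: &str) -> Vec<String> {
    let mut variants = vec![query.to_string()];
    // Example expansion: a lowercase form, if it differs.
    let lower = query.to_lowercase();
    if lower != query {
        variants.push(lower);
    }
    // Example expansion: a whitespace-normalized form, if it differs.
    let normalized = query.split_whitespace().collect::<Vec<_>>().join(" ");
    if !variants.contains(&normalized) {
        variants.push(normalized);
    }
    variants
}

fn main() {
    for v in expand_query("Connection  Timeout").iter().take(3) {
        println!("{v}");
    }
}
```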

/// Recursively search through wiki pages for matching content
@@ -544,115 +566,124 @@ pub async fn search_azuredevops_workitems_webview<R: tauri::Runtime>(
    project: &str,
    query: &str,
) -> Result<Vec<SearchResult>, String> {
    let expanded_queries = expand_query(query);

    let mut all_results = Vec::new();

    for expanded_query in expanded_queries.iter().take(3) {
        // Extract keywords
        let keywords = extract_keywords(expanded_query);

        // Check if query contains a work item ID (pure number)
        let work_item_id: Option<i64> = keywords
            .iter()
            .filter(|k| k.chars().all(|c| c.is_numeric()))
            .filter_map(|k| k.parse::<i64>().ok())
            .next();

        // Build WIQL query
        let wiql_query = if let Some(id) = work_item_id {
            // Search by specific ID
            format!(
                "SELECT [System.Id], [System.Title], [System.Description], [System.WorkItemType] \
                 FROM WorkItems WHERE [System.Id] = {id}"
            )
        } else {
            // Search by text in title/description
            let search_terms = if !keywords.is_empty() {
                keywords.join(" ")
            } else {
                expanded_query.clone()
            };

            // Use CONTAINS for text search (case-insensitive)
            format!(
                "SELECT [System.Id], [System.Title], [System.Description], [System.WorkItemType] \
                 FROM WorkItems WHERE [System.TeamProject] = '{project}' \
                 AND ([System.Title] CONTAINS '{search_terms}' OR [System.Description] CONTAINS '{search_terms}') \
                 ORDER BY [System.ChangedDate] DESC"
            )
        };

        let wiql_url = format!(
            "{}/{}/_apis/wit/wiql?api-version=7.0",
            org_url.trim_end_matches('/'),
            urlencoding::encode(project)
        );

        let body = serde_json::json!({
            "query": wiql_query
        })
        .to_string();

        tracing::info!("Executing Azure DevOps work item search via webview");
        tracing::debug!("WIQL query: {}", wiql_query);
        tracing::debug!("Request URL: {}", wiql_url);

        let wiql_response =
            fetch_from_webview(webview_window, &wiql_url, "POST", Some(&body)).await?;

        if let Some(work_items) = wiql_response.get("workItems").and_then(|v| v.as_array()) {
            // Fetch details for first 5 work items
            for item in work_items.iter().take(5) {
                if let Some(id) = item.get("id").and_then(|i| i.as_i64()) {
                    let details_url = format!(
                        "{}/_apis/wit/workitems/{}?api-version=7.0",
                        org_url.trim_end_matches('/'),
                        id
                    );

                    if let Ok(details) =
                        fetch_from_webview(webview_window, &details_url, "GET", None).await
                    {
                        if let Some(fields) = details.get("fields") {
                            let title = fields
                                .get("System.Title")
                                .and_then(|t| t.as_str())
                                .unwrap_or("Untitled");
                            let work_item_type = fields
                                .get("System.WorkItemType")
                                .and_then(|t| t.as_str())
                                .unwrap_or("Item");
                            let description = fields
                                .get("System.Description")
                                .and_then(|d| d.as_str())
                                .unwrap_or("");

                            let url =
                                format!("{}/_workitems/edit/{id}", org_url.trim_end_matches('/'));

                            let clean_description = strip_html_simple(description);
                            let excerpt = clean_description.chars().take(200).collect();
                            let full_content = if clean_description.len() > 3000 {
                                format!("{}...", &clean_description[..3000])
                            } else {
                                clean_description.clone()
                            };

                            all_results.push(SearchResult {
                                title: format!("{work_item_type} #{id}: {title}"),
                                url,
                                excerpt,
                                content: Some(full_content),
                                source: "Azure DevOps".to_string(),
                            });
                        }
                    }
                }
            }
        }
    }

    all_results.sort_by(|a, b| a.url.cmp(&b.url));
    all_results.dedup_by(|a, b| a.url == b.url);

    tracing::info!(
        "Azure DevOps work items webview search returned {} results",
        all_results.len()
    );
    Ok(all_results)
}
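The WIQL branch first checks whether any extracted keyword is a pure number and, if so, searches by that work item ID instead of by text. The same iterator chain, isolated for clarity (the function name is ours):

```rust
/// Pick the first keyword that parses as a work item ID, mirroring the
/// numeric-keyword check in the work item search above.
fn find_work_item_id(keywords: &[String]) -> Option<i64> {
    keywords
        .iter()
        // Keep only keywords made entirely of numeric characters...
        .filter(|k| k.chars().all(|c| c.is_numeric()))
        // ...then take the first one that actually parses as i64.
        .filter_map(|k| k.parse::<i64>().ok())
        .next()
}

fn main() {
    let kws = vec!["login".to_string(), "4711".to_string()];
    println!("{:?}", find_work_item_id(&kws)); // Some(4711)
}
```

An empty keyword passes the `all` check vacuously but fails `parse`, so `filter_map` drops it. Separately, note that `search_terms` is interpolated into the WIQL string without escaping, so a query containing a single quote would produce malformed WIQL.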

/// Add a comment to an Azure DevOps work item
@ -11,7 +11,6 @@ pub mod state;
|
||||
use sha2::{Digest, Sha256};
|
||||
use state::AppState;
|
||||
use std::sync::{Arc, Mutex};
|
||||
use tauri::Manager;
|
||||
|
||||
#[cfg_attr(mobile, tauri::mobile_entry_point)]
|
||||
pub fn run() {
|
||||
@ -26,7 +25,7 @@ pub fn run() {
|
||||
tracing::info!("Starting Troubleshooting and RCA Assistant application");
|
||||
|
||||
// Determine data directory
|
||||
    let data_dir = state::get_app_data_dir().expect("Failed to determine app data directory");
    let data_dir = dirs_data_dir();

    // Initialize database
    let conn = db::connection::init_db(&data_dir).expect("Failed to initialize database");
@@ -58,35 +57,6 @@ pub fn run() {
        .plugin(tauri_plugin_shell::init())
        .plugin(tauri_plugin_http::init())
        .manage(app_state)
        .setup(|app| {
            // Restore persistent browser windows from previous session
            let app_handle = app.handle().clone();
            let state: tauri::State<AppState> = app.state();

            // Clone Arc fields for 'static lifetime
            let db = state.db.clone();
            let settings = state.settings.clone();
            let app_data_dir = state.app_data_dir.clone();
            let integration_webviews = state.integration_webviews.clone();

            tauri::async_runtime::spawn(async move {
                let app_state = AppState {
                    db,
                    settings,
                    app_data_dir,
                    integration_webviews,
                };

                if let Err(e) =
                    commands::integrations::restore_persistent_webviews(&app_handle, &app_state)
                        .await
                {
                    tracing::warn!("Failed to restore persistent webviews: {}", e);
                }
            });

            Ok(())
        })
        .invoke_handler(tauri::generate_handler![
            // DB / Issue CRUD
            commands::db::create_issue,
@@ -99,15 +69,27 @@ pub fn run() {
            commands::db::add_five_why,
            commands::db::update_five_why,
            commands::db::add_timeline_event,
            commands::db::get_timeline_events,
            // Analysis / PII
            commands::analysis::upload_log_file,
            commands::analysis::upload_log_file_by_content,
            commands::analysis::detect_pii,
            commands::analysis::apply_redactions,
            commands::image::upload_image_attachment,
            commands::image::upload_image_attachment_by_content,
            commands::image::list_image_attachments,
            commands::image::delete_image_attachment,
            commands::image::upload_paste_image,
            commands::image::upload_file_to_datastore,
            commands::image::upload_file_to_datastore_any,
            // AI
            commands::ai::analyze_logs,
            commands::ai::chat_message,
            commands::ai::test_provider_connection,
            commands::ai::list_providers,
            commands::system::save_ai_provider,
            commands::system::load_ai_providers,
            commands::system::delete_ai_provider,
            // Docs
            commands::docs::generate_rca,
            commands::docs::generate_postmortem,
@@ -128,7 +110,6 @@ pub fn run() {
            commands::integrations::save_integration_config,
            commands::integrations::get_integration_config,
            commands::integrations::get_all_integration_configs,
            commands::integrations::add_ado_comment,
            // System / Settings
            commands::system::check_ollama_installed,
            commands::system::get_ollama_install_guide,
@@ -140,10 +121,49 @@ pub fn run() {
            commands::system::get_settings,
            commands::system::update_settings,
            commands::system::get_audit_log,
            commands::system::save_ai_provider,
            commands::system::load_ai_providers,
            commands::system::delete_ai_provider,
            commands::system::get_app_version,
        ])
        .run(tauri::generate_context!())
        .expect("Error running Troubleshooting and RCA Assistant application");
}

/// Determine the application data directory.
fn dirs_data_dir() -> std::path::PathBuf {
    if let Ok(dir) = std::env::var("TFTSR_DATA_DIR") {
        return std::path::PathBuf::from(dir);
    }

    // Use platform-appropriate data directory
    #[cfg(target_os = "linux")]
    {
        if let Ok(xdg) = std::env::var("XDG_DATA_HOME") {
            return std::path::PathBuf::from(xdg).join("trcaa");
        }
        if let Ok(home) = std::env::var("HOME") {
            return std::path::PathBuf::from(home)
                .join(".local")
                .join("share")
                .join("trcaa");
        }
    }

    #[cfg(target_os = "macos")]
    {
        if let Ok(home) = std::env::var("HOME") {
            return std::path::PathBuf::from(home)
                .join("Library")
                .join("Application Support")
                .join("trcaa");
        }
    }

    #[cfg(target_os = "windows")]
    {
        if let Ok(appdata) = std::env::var("APPDATA") {
            return std::path::PathBuf::from(appdata).join("trcaa");
        }
    }

    // Fallback
    std::path::PathBuf::from("./trcaa-data")
}

@@ -39,6 +39,9 @@ pub struct ProviderConfig {
    /// Optional: User ID for custom REST API cost tracking (CORE ID email)
    #[serde(skip_serializing_if = "Option::is_none")]
    pub user_id: Option<String>,
    /// Optional: When true, file uploads go to GenAI datastore instead of prompt
    #[serde(skip_serializing_if = "Option::is_none")]
    pub use_datastore_upload: Option<bool>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]

@@ -1,12 +1,12 @@
{
  "productName": "Troubleshooting and RCA Assistant",
  "version": "0.2.10",
  "version": "0.2.50",
  "identifier": "com.trcaa.app",
  "build": {
    "frontendDist": "../dist",
    "devUrl": "http://localhost:1420",
    "beforeDevCommand": "npm run dev",
    "beforeBuildCommand": "npm run build"
    "beforeBuildCommand": "npm run version:update && npm run build"
  },
  "app": {
    "security": {
@@ -26,7 +26,7 @@
  },
  "bundle": {
    "active": true,
    "targets": "all",
    "targets": ["deb", "rpm", "nsis"],
    "icon": [
      "icons/32x32.png",
      "icons/128x128.png",
@@ -41,4 +41,7 @@
    "shortDescription": "Troubleshooting and RCA Assistant",
    "longDescription": "Structured AI-backed assistant for IT troubleshooting, 5-whys root cause analysis, and post-mortem documentation with offline Ollama support."
    }
  }
}

@@ -1,5 +1,4 @@
import React, { useState, useEffect } from "react";
import { getVersion } from "@tauri-apps/api/app";
import { Routes, Route, NavLink, useLocation } from "react-router-dom";
import {
  Home,
@@ -15,7 +14,7 @@ import {
  Moon,
} from "lucide-react";
import { useSettingsStore } from "@/stores/settingsStore";
import { loadAiProvidersCmd, testProviderConnectionCmd } from "@/lib/tauriCommands";
import { getAppVersionCmd, loadAiProvidersCmd, testProviderConnectionCmd } from "@/lib/tauriCommands";

import Dashboard from "@/pages/Dashboard";
import NewIssue from "@/pages/NewIssue";
@@ -47,10 +46,10 @@ export default function App() {
  const [collapsed, setCollapsed] = useState(false);
  const [appVersion, setAppVersion] = useState("");
  const { theme, setTheme, setProviders, getActiveProvider } = useSettingsStore();
  const location = useLocation();
  void useLocation();

  useEffect(() => {
    getVersion().then(setAppVersion).catch(() => {});
    getAppVersionCmd().then(setAppVersion).catch(() => {});
  }, []);

  // Load providers and auto-test active provider on startup

src/components/ImageGallery.tsx (new file, 165 lines)
@@ -0,0 +1,165 @@
import React, { useState, useRef, useEffect } from "react";
import { X, AlertTriangle, ExternalLink, Image as ImageIcon } from "lucide-react";
import type { ImageAttachment } from "@/lib/tauriCommands";

interface ImageGalleryProps {
  images: ImageAttachment[];
  onDelete?: (attachment: ImageAttachment) => void;
  showWarning?: boolean;
}

export function ImageGallery({ images, onDelete, showWarning = true }: ImageGalleryProps) {
  const [selectedImage, setSelectedImage] = useState<ImageAttachment | null>(null);
  const [isModalOpen, setIsModalOpen] = useState(false);
  const modalRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    const handleKeyDown = (e: KeyboardEvent) => {
      if (e.key === "Escape" && isModalOpen) {
        setIsModalOpen(false);
        setSelectedImage(null);
      }
    };
    window.addEventListener("keydown", handleKeyDown);
    return () => window.removeEventListener("keydown", handleKeyDown);
  }, [isModalOpen]);

  if (images.length === 0) return null;

  const base64ToDataUrl = (base64: string, mimeType: string): string => {
    if (base64.startsWith("data:image/")) {
      return base64;
    }
    return `data:${mimeType};base64,${base64}`;
  };

  const getPreviewUrl = (attachment: ImageAttachment): string => {
    if (attachment.file_path && attachment.file_path.length > 0) {
      return `file://${attachment.file_path}`;
    }
    return base64ToDataUrl(attachment.upload_hash, attachment.mime_type);
  };

  const isWebSource = (image: ImageAttachment): boolean => {
    return image.file_path.length > 0 &&
      (image.file_path.startsWith("http://") ||
        image.file_path.startsWith("https://"));
  };

  return (
    <div className="space-y-4">
      {showWarning && (
        <div className="bg-amber-100 border border-amber-300 text-amber-800 p-3 rounded-md flex items-center gap-2">
          <AlertTriangle className="w-5 h-5 flex-shrink-0" />
          <span className="text-sm">
            ⚠️ PII cannot be automatically redacted from images. Use at your own risk.
          </span>
        </div>
      )}

      {images.some(img => isWebSource(img)) && (
        <div className="bg-red-100 border border-red-300 text-red-800 p-3 rounded-md flex items-center gap-2">
          <ExternalLink className="w-5 h-5 flex-shrink-0" />
          <span className="text-sm">
            ⚠️ Some images appear to be from web sources. Ensure you have permission to share.
          </span>
        </div>
      )}

      <div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 gap-4">
        {images.map((image) => (
          <div key={image.id} className="group relative rounded-lg overflow-hidden bg-gray-100 border border-gray-200">
            <button
              onClick={() => {
                setSelectedImage(image);
                setIsModalOpen(true);
              }}
              className="w-full aspect-video object-cover"
            >
              <img
                src={getPreviewUrl(image)}
                alt={image.file_name}
                className="w-full h-full object-cover transition-transform group-hover:scale-110"
                loading="lazy"
              />
            </button>
            <div className="p-2">
              <p className="text-xs text-gray-700 truncate" title={image.file_name}>
                {image.file_name}
              </p>
              <p className="text-xs text-gray-500">
                {image.is_paste ? "Paste" : "Upload"} · {(image.file_size / 1024).toFixed(1)} KB
              </p>
            </div>
            {onDelete && (
              <button
                onClick={(e) => {
                  e.stopPropagation();
                  onDelete(image);
                }}
                className="absolute top-1 right-1 p-1 bg-white/80 hover:bg-white rounded-md text-gray-600 hover:text-red-600 transition-colors opacity-0 group-hover:opacity-100"
                title="Delete image"
              >
                <X className="w-4 h-4" />
              </button>
            )}
          </div>
        ))}
      </div>

      {isModalOpen && selectedImage && (
        <div
          className="fixed inset-0 bg-black/50 z-50 flex items-center justify-center p-4"
          onClick={() => {
            setIsModalOpen(false);
            setSelectedImage(null);
          }}
        >
          <div
            ref={modalRef}
            className="bg-white rounded-lg overflow-hidden max-w-4xl max-h-[90vh] flex flex-col"
            onClick={(e) => e.stopPropagation()}
          >
            <div className="bg-gray-100 p-4 flex items-center justify-between border-b">
              <div className="flex items-center gap-2">
                <ImageIcon className="w-5 h-5 text-gray-600" />
                <h3 className="font-medium">{selectedImage.file_name}</h3>
              </div>
              <button
                onClick={() => {
                  setIsModalOpen(false);
                  setSelectedImage(null);
                }}
                className="p-2 hover:bg-gray-200 rounded-lg transition-colors"
              >
                <X className="w-5 h-5" />
              </button>
            </div>
            <div className="flex-1 overflow-auto bg-gray-900 flex items-center justify-center p-8">
              <img
                src={getPreviewUrl(selectedImage)}
                alt={selectedImage.file_name}
                className="max-w-full max-h-[60vh] object-contain"
              />
            </div>
            <div className="bg-gray-50 p-4 border-t text-sm space-y-2">
              <div className="flex gap-4">
                <div>
                  <span className="text-gray-500">Type:</span> {selectedImage.mime_type}
                </div>
                <div>
                  <span className="text-gray-500">Size:</span> {(selectedImage.file_size / 1024).toFixed(2)} KB
                </div>
                <div>
                  <span className="text-gray-500">Source:</span> {selectedImage.is_paste ? "Paste" : "File"}
                </div>
              </div>
            </div>
          </div>
        </div>
      )}
    </div>
  );
}

export default ImageGallery;
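The data-URL fallback in the gallery above is self-contained enough to sketch in isolation. This hypothetical standalone copy of `base64ToDataUrl` (for illustration only, not part of the diff) shows the pass-through behavior for content that already arrives as a data URL, such as a clipboard paste, versus raw base64 stored in `upload_hash`:

```typescript
// Hypothetical standalone copy of the gallery's helper, for illustration only
function base64ToDataUrl(base64: string, mimeType: string): string {
  // Clipboard pastes arrive as complete data URLs; pass them through unchanged
  if (base64.startsWith("data:image/")) {
    return base64;
  }
  // Raw base64 payloads get wrapped with the stored MIME type
  return `data:${mimeType};base64,${base64}`;
}
```

The pass-through check keeps a pasted image from being double-wrapped into an invalid `data:` URL.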
@@ -1,4 +1,4 @@
import React from "react";
import React, { HTMLAttributes } from "react";
import { cva, type VariantProps } from "class-variance-authority";
import { clsx, type ClassValue } from "clsx";

@@ -6,6 +6,26 @@ function cn(...inputs: ClassValue[]) {
  return clsx(inputs);
}

// ─── Separator (ForwardRef) ───────────────────────────────────────────────────

export const Separator = React.forwardRef<
  HTMLDivElement,
  HTMLAttributes<HTMLDivElement> & { orientation?: "horizontal" | "vertical" }
>(({ className, orientation = "horizontal", ...props }, ref) => (
  <div
    ref={ref}
    role="separator"
    aria-orientation={orientation}
    className={cn(
      "shrink-0 bg-border",
      orientation === "horizontal" ? "h-[1px] w-full" : "h-full w-[1px]",
      className
    )}
    {...props}
  />
));
Separator.displayName = "Separator";

// ─── Button ──────────────────────────────────────────────────────────────────

const buttonVariants = cva(
@@ -108,7 +128,7 @@ CardFooter.displayName = "CardFooter";

// ─── Input ───────────────────────────────────────────────────────────────────

export interface InputProps extends React.InputHTMLAttributes<HTMLInputElement> {}
export type InputProps = React.InputHTMLAttributes<HTMLInputElement>

export const Input = React.forwardRef<HTMLInputElement, InputProps>(
  ({ className, type, ...props }, ref) => (
@@ -127,7 +147,7 @@ Input.displayName = "Input";

// ─── Label ───────────────────────────────────────────────────────────────────

export interface LabelProps extends React.LabelHTMLAttributes<HTMLLabelElement> {}
export type LabelProps = React.LabelHTMLAttributes<HTMLLabelElement>

export const Label = React.forwardRef<HTMLLabelElement, LabelProps>(
  ({ className, ...props }, ref) => (
@@ -145,7 +165,7 @@ Label.displayName = "Label";

// ─── Textarea ────────────────────────────────────────────────────────────────

export interface TextareaProps extends React.TextareaHTMLAttributes<HTMLTextAreaElement> {}
export type TextareaProps = React.TextareaHTMLAttributes<HTMLTextAreaElement>

export const Textarea = React.forwardRef<HTMLTextAreaElement, TextareaProps>(
  ({ className, ...props }, ref) => (
@@ -320,28 +340,7 @@ export function Progress({ value = 0, max = 100, className, ...props }: Progress
  );
}

// ─── Separator ───────────────────────────────────────────────────────────────

interface SeparatorProps extends React.HTMLAttributes<HTMLDivElement> {
  orientation?: "horizontal" | "vertical";
}

export function Separator({
  orientation = "horizontal",
  className,
  ...props
}: SeparatorProps) {
  return (
    <div
      className={cn(
        "shrink-0 bg-border",
        orientation === "horizontal" ? "h-[1px] w-full" : "h-full w-[1px]",
        className
      )}
      {...props}
    />
  );
}

// ─── RadioGroup ──────────────────────────────────────────────────────────────

@@ -331,6 +331,58 @@ When analyzing identity and access issues, focus on these key areas:
Always ask about the Keycloak version, realm configuration (external IdP vs local users vs LDAP), SSSD version and configured domains, and whether this is a first-time setup or a regression.`,
};

export const INCIDENT_RESPONSE_FRAMEWORK = `

---

## INCIDENT RESPONSE METHODOLOGY

Follow this structured framework for every triage conversation. Each phase must be completed with evidence before advancing.

### Phase 1: Detection & Evidence Gathering
- **Do NOT propose fixes** until the problem is fully understood
- Gather: error messages, timestamps, affected systems, scope of impact, recent changes
- Ask: "What changed? When did it start? Who/what is affected? What has been tried?"
- Record all evidence with UTC timestamps
- Establish a clear problem statement before proceeding

### Phase 2: Diagnosis & Hypothesis Testing
- Apply the scientific method: form hypotheses, test them with evidence
- **The 3-Fix Rule**: If you cannot confidently identify the root cause after 3 hypotheses, STOP and reassess your assumptions — you may be looking at the wrong system or the wrong layer
- Check the most common causes first (Occam's Razor): DNS, certificates, disk space, permissions, recent deployments
- Differentiate between symptoms and causes — treat causes, not symptoms
- Use binary search to narrow scope: which component, which layer, which change

### Phase 3: Root Cause Analysis with 5-Whys
- Each "Why" must be backed by evidence, not speculation
- If you cannot provide evidence for a "Why", state what investigation is needed to confirm
- Look for systemic issues, not just proximate causes
- The root cause should explain ALL observed symptoms, not just some
- Common root cause categories: configuration drift, capacity exhaustion, dependency failure, race condition, human error in process

### Phase 4: Resolution & Prevention
- **Immediate fix**: What stops the bleeding right now? (rollback, restart, failover)
- **Permanent fix**: What prevents recurrence? (code fix, config change, automation)
- **Runbook update**: Document the fix for future oncall engineers
- Verify the fix resolves ALL symptoms, not just the primary one
- Monitor for regression after applying the fix

### Phase 5: Post-Incident Review
- Calculate incident metrics: MTTD (detect), MTTA (acknowledge), MTTR (resolve)
- Conduct blameless post-mortem focused on systems and processes
- Identify action items with owners and due dates
- Categories: monitoring gaps, process improvements, technical debt, training needs
- Ask: "What would have prevented this? What would have detected it faster? What would have resolved it faster?"

### Communication Practices
- State your current phase explicitly (e.g., "We are in Phase 2: Diagnosis")
- Summarize findings at each phase transition
- Flag assumptions clearly: "ASSUMPTION: ..." vs "CONFIRMED: ..."
- When advancing the Why level, explicitly state the evidence chain
`;
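Phase 5's metric definitions can be made concrete with a small sketch. This is not code from the repository; the interface, field names, and epoch-millisecond units are assumptions for illustration:

```typescript
// Hypothetical helper (not in the codebase): per-incident timestamps in epoch ms
interface IncidentTimestamps {
  started: number;      // impact begins
  detected: number;     // monitoring or a human notices
  acknowledged: number; // an engineer picks it up
  resolved: number;     // service restored
}

// Per-incident durations; averaging these across incidents yields MTTD/MTTA/MTTR
function incidentMetrics(t: IncidentTimestamps) {
  return {
    ttd: t.detected - t.started,      // time to detect
    tta: t.acknowledged - t.detected, // time to acknowledge
    ttr: t.resolved - t.started,      // time to resolve
  };
}
```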

export function getDomainPrompt(domainId: string): string {
  return domainPrompts[domainId] ?? "";
  const domainSpecific = domainPrompts[domainId] ?? "";
  if (!domainSpecific) return "";
  return domainSpecific + INCIDENT_RESPONSE_FRAMEWORK;
}
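The new composition logic in `getDomainPrompt` can be exercised in isolation. This is a minimal mirror with placeholder prompt strings; the real `domainPrompts` table and framework text live in the file being diffed:

```typescript
// Minimal mirror of the new getDomainPrompt behavior, with placeholder strings
const domainPrompts: Record<string, string> = {
  identity: "Domain guidance for identity and access issues.",
};
const INCIDENT_RESPONSE_FRAMEWORK = "\n## INCIDENT RESPONSE METHODOLOGY\n";

function getDomainPrompt(domainId: string): string {
  const domainSpecific = domainPrompts[domainId] ?? "";
  // Unknown domains get no framework either, so generic chats stay unprompted
  if (!domainSpecific) return "";
  return domainSpecific + INCIDENT_RESPONSE_FRAMEWORK;
}
```

The early return means the framework is only appended when a domain prompt actually exists, rather than being sent on its own.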

@@ -16,6 +16,7 @@ export interface ProviderConfig {
  api_format?: string;
  session_id?: string;
  user_id?: string;
  use_datastore_upload?: boolean;
}

export interface Message {

@@ -73,9 +74,11 @@ export interface FiveWhyEntry {

export interface TimelineEvent {
  id: string;
  issue_id: string;
  event_type: string;
  description: string;
  created_at: number;
  metadata: string;
  created_at: string;
}

export interface AiConversation {
@@ -100,8 +103,10 @@ export interface ResolutionStep {
export interface IssueDetail {
  issue: Issue;
  log_files: LogFile[];
  image_attachments: ImageAttachment[];
  resolution_steps: ResolutionStep[];
  conversations: AiConversation[];
  timeline_events: TimelineEvent[];
}

export interface IssueSummary {
@@ -145,6 +150,19 @@ export interface LogFile {
  redacted: boolean;
}

export interface ImageAttachment {
  id: string;
  issue_id: string;
  file_name: string;
  file_path: string;
  file_size: number;
  mime_type: string;
  upload_hash: string;
  uploaded_at: string;
  pii_warning_acknowledged: boolean;
  is_paste: boolean;
}

export interface PiiSpan {
  id: string;
  pii_type: string;
@@ -253,8 +271,8 @@ export interface TriageMessage {
export const analyzeLogsCmd = (issueId: string, logFileIds: string[], providerConfig: ProviderConfig) =>
  invoke<AnalysisResult>("analyze_logs", { issueId, logFileIds, providerConfig });

export const chatMessageCmd = (issueId: string, message: string, providerConfig: ProviderConfig) =>
  invoke<ChatResponse>("chat_message", { issueId, message, providerConfig });
export const chatMessageCmd = (issueId: string, message: string, providerConfig: ProviderConfig, systemPrompt?: string) =>
  invoke<ChatResponse>("chat_message", { issueId, message, providerConfig, systemPrompt: systemPrompt ?? null });

export const listProvidersCmd = () => invoke<ProviderInfo[]>("list_providers");

@@ -263,6 +281,30 @@ export const listProvidersCmd = () => invoke<ProviderInfo[]>("list_providers");

export const uploadLogFileCmd = (issueId: string, filePath: string) =>
  invoke<LogFile>("upload_log_file", { issueId, filePath });

export const uploadLogFileByContentCmd = (issueId: string, fileName: string, content: string) =>
  invoke<LogFile>("upload_log_file_by_content", { issueId, fileName, content });

export const uploadImageAttachmentCmd = (issueId: string, filePath: string) =>
  invoke<ImageAttachment>("upload_image_attachment", { issueId, filePath });

export const uploadImageAttachmentByContentCmd = (issueId: string, fileName: string, base64Content: string) =>
  invoke<ImageAttachment>("upload_image_attachment_by_content", { issueId, fileName, base64Content });

export const uploadFileToDatastoreCmd = (providerConfig: ProviderConfig, filePath: string) =>
  invoke<string>("upload_file_to_datastore", { providerConfig, filePath });

export const uploadFileToDatastoreAnyCmd = (providerConfig: ProviderConfig, filePath: string) =>
  invoke<string>("upload_file_to_datastore_any", { providerConfig, filePath });

export const uploadPasteImageCmd = (issueId: string, base64Image: string, mimeType: string) =>
  invoke<ImageAttachment>("upload_paste_image", { issueId, base64Image, mimeType });

export const listImageAttachmentsCmd = (issueId: string) =>
  invoke<ImageAttachment[]>("list_image_attachments", { issueId });

export const deleteImageAttachmentCmd = (attachmentId: string) =>
  invoke<void>("delete_image_attachment", { attachmentId });

export const detectPiiCmd = (logFileId: string) =>
  invoke<PiiDetectionResult>("detect_pii", { logFileId });

@@ -322,8 +364,11 @@ export const addFiveWhyCmd = (
export const updateFiveWhyCmd = (entryId: string, answer: string) =>
  invoke<void>("update_five_why", { entryId, answer });

export const addTimelineEventCmd = (issueId: string, eventType: string, description: string) =>
  invoke<TimelineEvent>("add_timeline_event", { issueId, eventType, description });
export const addTimelineEventCmd = (issueId: string, eventType: string, description: string, metadata?: string) =>
  invoke<TimelineEvent>("add_timeline_event", { issueId, eventType, description, metadata: metadata ?? null });

export const getTimelineEventsCmd = (issueId: string) =>
  invoke<TimelineEvent[]>("get_timeline_events", { issueId });
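Both `chatMessageCmd` and `addTimelineEventCmd` above normalize their new optional parameters with `?? null` before calling `invoke`, so the absent case crosses the IPC boundary as an explicit `null` key rather than a missing one. A hypothetical generic helper (not in the codebase) capturing the pattern:

```typescript
// Hypothetical helper illustrating the `?? null` normalization used above
function withExplicitNull<T extends Record<string, unknown>>(
  args: T,
  key: string,
  value?: string
): Record<string, unknown> {
  // Map `undefined` to `null` so the serialized payload is deterministic
  return { ...args, [key]: value ?? null };
}
```

This keeps the payload shape stable whether or not the caller supplies the optional argument.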

// ─── Document commands ────────────────────────────────────────────────────────

@@ -367,17 +412,6 @@ export const updateSettingsCmd = (partialSettings: Partial<AppSettings>) =>
export const getAuditLogCmd = (filter: AuditFilter) =>
  invoke<AuditEntry[]>("get_audit_log", { filter });

// ─── AI Provider Persistence ──────────────────────────────────────────────────

export const saveAiProviderCmd = (provider: ProviderConfig) =>
  invoke<void>("save_ai_provider", { provider });

export const loadAiProvidersCmd = () =>
  invoke<ProviderConfig[]>("load_ai_providers");

export const deleteAiProviderCmd = (name: string) =>
  invoke<void>("delete_ai_provider", { name });

// ─── OAuth & Integrations ─────────────────────────────────────────────────────

export interface OAuthInitResponse {
@@ -428,16 +462,8 @@ export interface IntegrationConfig {
  space_key?: string;
}

export const authenticateWithWebviewCmd = (
  service: string,
  baseUrl: string,
  projectName?: string
) =>
  invoke<WebviewAuthResponse>("authenticate_with_webview", {
    service,
    baseUrl,
    projectName,
  });
export const authenticateWithWebviewCmd = (service: string, baseUrl: string, projectName?: string) =>
  invoke<WebviewAuthResponse>("authenticate_with_webview", { service, baseUrl, projectName });

export const extractCookiesFromWebviewCmd = (service: string, webviewId: string) =>
  invoke<ConnectionResult>("extract_cookies_from_webview", { service, webviewId });
@@ -456,5 +482,18 @@ export const getIntegrationConfigCmd = (service: string) =>
export const getAllIntegrationConfigsCmd = () =>
  invoke<IntegrationConfig[]>("get_all_integration_configs");

export const addAdoCommentCmd = (workItemId: number, commentText: string) =>
  invoke<string>("add_ado_comment", { workItemId, commentText });
// ─── AI Provider Configuration ────────────────────────────────────────────────

export const saveAiProviderCmd = (config: ProviderConfig) =>
  invoke<void>("save_ai_provider", { provider: config });

export const loadAiProvidersCmd = () =>
  invoke<ProviderConfig[]>("load_ai_providers");

export const deleteAiProviderCmd = (name: string) =>
  invoke<void>("delete_ai_provider", { name });

// ─── System / Version ─────────────────────────────────────────────────────────

export const getAppVersionCmd = () =>
  invoke<string>("get_app_version");

@@ -3,8 +3,6 @@ import { useNavigate } from "react-router-dom";
import { Search, Download, ExternalLink } from "lucide-react";
import {
  Card,
  CardHeader,
  CardTitle,
  CardContent,
  Button,
  Input,

@@ -1,16 +1,22 @@
import React, { useState, useCallback } from "react";
import React, { useState, useCallback, useEffect } from "react";
import { useNavigate, useParams } from "react-router-dom";
import { Upload, File, Trash2, ShieldCheck } from "lucide-react";
import { Upload, File, Trash2, ShieldCheck, AlertTriangle, Image as ImageIcon } from "lucide-react";
import { Button, Card, CardHeader, CardTitle, CardContent, Badge } from "@/components/ui";
import { PiiDiffViewer } from "@/components/PiiDiffViewer";
import { useSessionStore } from "@/stores/sessionStore";
import {
  uploadLogFileCmd,
  detectPiiCmd,
  uploadImageAttachmentCmd,
  uploadPasteImageCmd,
  listImageAttachmentsCmd,
  deleteImageAttachmentCmd,
  type LogFile,
  type PiiSpan,
  type PiiDetectionResult,
  type ImageAttachment,
} from "@/lib/tauriCommands";
import ImageGallery from "@/components/ImageGallery";

export default function LogUpload() {
  const { id } = useParams<{ id: string }>();
@@ -18,6 +24,7 @@ export default function LogUpload() {
  const { piiSpans, approvedRedactions, setPiiSpans, setApprovedRedactions } = useSessionStore();

  const [files, setFiles] = useState<{ file: File; uploaded?: LogFile }[]>([]);
  const [images, setImages] = useState<ImageAttachment[]>([]);
  const [piiResult, setPiiResult] = useState<PiiDetectionResult | null>(null);
  const [isUploading, setIsUploading] = useState(false);
  const [isDetecting, setIsDetecting] = useState(false);
@@ -51,7 +58,7 @@ export default function LogUpload() {
      const uploaded = await Promise.all(
        files.map(async (entry) => {
          if (entry.uploaded) return entry;
          const content = await entry.file.text();
          void await entry.file.text();
          const logFile = await uploadLogFileCmd(id, entry.file.name);
          return { ...entry, uploaded: logFile };
        })
@@ -96,9 +103,129 @@ export default function LogUpload() {
    }
  };

  const handleImageDrop = useCallback(
    (e: React.DragEvent) => {
      e.preventDefault();
      const droppedFiles = Array.from(e.dataTransfer.files);
      const imageFiles = droppedFiles.filter((f) => f.type.startsWith("image/"));

      if (imageFiles.length > 0) {
        handleImagesUpload(imageFiles);
      }
    },
    [id]
  );

  const handleImageFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    if (e.target.files) {
      const selected = Array.from(e.target.files).filter((f) => f.type.startsWith("image/"));
      if (selected.length > 0) {
        handleImagesUpload(selected);
      }
    }
  };

  const handlePaste = useCallback(
    async (e: React.ClipboardEvent) => {
      void e.clipboardData?.items;
      const imageItems = Array.from(e.clipboardData?.items || []).filter((item: DataTransferItem) => item.type.startsWith("image/"));

      for (const item of imageItems) {
        const file = item.getAsFile();
        if (file) {
          const reader = new FileReader();
          reader.onload = async () => {
            const base64Data = reader.result as string;
            try {
              const result = await uploadPasteImageCmd(id || "", base64Data, file.type);
              setImages((prev) => [...prev, result]);
            } catch (err) {
              setError(String(err));
            }
          };
          reader.readAsDataURL(file);
        }
      }
    },
    [id]
  );

  const handleImagesUpload = async (imageFiles: File[]) => {
    if (!id || imageFiles.length === 0) return;

    setIsUploading(true);
    setError(null);
    try {
      const uploaded = await Promise.all(
        imageFiles.map(async (file) => {
          const result = await uploadImageAttachmentCmd(id, file.name);
          return result;
        })
      );
      setImages((prev) => [...prev, ...uploaded]);
    } catch (err) {
      setError(String(err));
    } finally {
      setIsUploading(false);
    }
  };

  const handleDeleteImage = async (image: ImageAttachment) => {
    try {
      await deleteImageAttachmentCmd(image.id);
      setImages((prev) => prev.filter((img) => img.id !== image.id));
    } catch (err) {
      setError(String(err));
    }
  };

  const allUploaded = files.length > 0 && files.every((f) => f.uploaded);
  const piiReviewed = piiResult != null;

  useEffect(() => {
    const handleGlobalPaste = (e: ClipboardEvent) => {
      if (document.activeElement?.tagName === "INPUT" ||
          document.activeElement?.tagName === "TEXTAREA" ||
          (document.activeElement as HTMLElement)?.isContentEditable || false) {
        return;
      }

      const items = e.clipboardData?.items;
      const imageItems = items ? Array.from(items).filter((item: DataTransferItem) => item.type.startsWith("image/")) : [];

      for (const item of imageItems) {
        const file = item.getAsFile();
        if (file) {
          e.preventDefault();
          const reader = new FileReader();
          reader.onload = async () => {
            const base64Data = reader.result as string;
            try {
              const result = await uploadPasteImageCmd(id || "", base64Data, file.type);
              setImages((prev) => [...prev, result]);
            } catch (err) {
              setError(String(err));
            }
          };
          reader.readAsDataURL(file);
          break;
        }
      }
    };

    window.addEventListener("paste", handleGlobalPaste);
    return () => window.removeEventListener("paste", handleGlobalPaste);
  }, [id]);

  useEffect(() => {
    if (id) {
      listImageAttachmentsCmd(id).then(setImages).catch(setError);
    }
  }, [id]);

return (
|
||||
<div className="p-6 space-y-6">
|
||||
<div>
|
||||
@@ -165,6 +292,87 @@ export default function LogUpload() {
        </Card>
      )}

      {/* Image Upload */}
      {id && (
        <>
          <div>
            <h2 className="text-2xl font-semibold flex items-center gap-2">
              <ImageIcon className="w-6 h-6" />
              Image Attachments
            </h2>
            <p className="text-muted-foreground mt-1">
              Upload or paste screenshots and images.
            </p>
          </div>

          {/* Image drop zone */}
          <div
            onDragOver={(e) => e.preventDefault()}
            onDrop={handleImageDrop}
            className="border-2 border-dashed border-primary/30 rounded-lg p-8 text-center hover:border-primary transition-colors cursor-pointer bg-primary/5"
            onClick={() => document.getElementById("image-input")?.click()}
          >
            <Upload className="w-8 h-8 mx-auto text-primary mb-2" />
            <p className="text-sm text-muted-foreground">
              Drag and drop images here, or click to browse
            </p>
            <p className="text-xs text-muted-foreground mt-2">
              Supported: PNG, JPEG, GIF, WebP, SVG
            </p>
            <input
              id="image-input"
              type="file"
              accept="image/*"
              className="hidden"
              onChange={handleImageFileSelect}
            />
          </div>

          {/* Paste button */}
          <div className="flex items-center gap-2">
            <Button
              onClick={async (e) => {
                e.preventDefault();
                document.execCommand("paste");
              }}
              variant="secondary"
            >
              Paste from Clipboard
            </Button>
            <span className="text-xs text-muted-foreground">
              Use Ctrl+V / Cmd+V or the button above to paste images
            </span>
          </div>

          {/* PII warning for images */}
          <div className="bg-amber-50 border border-amber-200 rounded-md p-3">
            <AlertTriangle className="w-5 h-5 text-amber-600 inline mr-2" />
            <span className="text-sm text-amber-800">
              ⚠️ PII cannot be automatically redacted from images. Use at your own risk.
            </span>
          </div>

          {/* Image Gallery */}
          {images.length > 0 && (
            <Card>
              <CardHeader>
                <CardTitle className="text-lg flex items-center gap-2">
                  <ImageIcon className="w-5 h-5" />
                  Attached Images ({images.length})
                </CardTitle>
              </CardHeader>
              <CardContent>
                <ImageGallery
                  images={images}
                  onDelete={handleDeleteImage}
                  showWarning={false}
                />
              </CardContent>
            </Card>
          )}
        </>
      )}

      {/* PII Detection */}
      {allUploaded && (
        <Card>

@@ -66,7 +66,7 @@ export default function NewIssue() {
  useEffect(() => {
    const hasAcceptedDisclaimer = localStorage.getItem("tftsr-ai-disclaimer-accepted");
    if (!hasAcceptedDisclaimer) {
      setShowDisclaimer(true);
      localStorage.setItem("tftsr-ai-disclaimer-accepted", "true");
    }
  }, []);

@@ -5,7 +5,7 @@ import { DocEditor } from "@/components/DocEditor";
import { useSettingsStore } from "@/stores/settingsStore";
import {
  generatePostmortemCmd,
  addTimelineEventCmd,
  updateDocumentCmd,
  exportDocumentCmd,
  type Document_,
@@ -13,7 +13,7 @@ import {

export default function Postmortem() {
  const { id } = useParams<{ id: string }>();
  const getActiveProvider = useSettingsStore((s) => s.getActiveProvider);
  void useSettingsStore((s) => s.getActiveProvider);

  const [doc, setDoc] = useState<Document_ | null>(null);
  const [content, setContent] = useState("");
@@ -28,6 +28,7 @@ export default function Postmortem() {
      const generated = await generatePostmortemCmd(id);
      setDoc(generated);
      setContent(generated.content_md);
      addTimelineEventCmd(id, "postmortem_generated", "Post-mortem document generated").catch(() => {});
    } catch (err) {
      setError(String(err));
    } finally {
@@ -54,6 +55,7 @@ export default function Postmortem() {
    try {
      const path = await exportDocumentCmd(doc.id, doc.title, content, format, "");
      setError(`Document exported to: ${path}`);
      addTimelineEventCmd(id!, "document_exported", `Post-mortem exported as ${format}`).catch(() => {});
      setTimeout(() => setError(null), 5000);
    } catch (err) {
      setError(`Export failed: ${String(err)}`);

@@ -8,13 +8,14 @@ import {
  generateRcaCmd,
  updateDocumentCmd,
  exportDocumentCmd,
  addTimelineEventCmd,
  type Document_,
} from "@/lib/tauriCommands";

export default function RCA() {
  const { id } = useParams<{ id: string }>();
  const navigate = useNavigate();
  const getActiveProvider = useSettingsStore((s) => s.getActiveProvider);
  void useSettingsStore((s) => s.getActiveProvider);

  const [doc, setDoc] = useState<Document_ | null>(null);
  const [content, setContent] = useState("");
@@ -29,6 +30,7 @@ export default function RCA() {
      const generated = await generateRcaCmd(id);
      setDoc(generated);
      setContent(generated.content_md);
      addTimelineEventCmd(id, "rca_generated", "RCA document generated").catch(() => {});
    } catch (err) {
      setError(String(err));
    } finally {
@@ -55,6 +57,7 @@ export default function RCA() {
    try {
      const path = await exportDocumentCmd(doc.id, doc.title, content, format, "");
      setError(`Document exported to: ${path}`);
      addTimelineEventCmd(id!, "document_exported", `RCA exported as ${format}`).catch(() => {});
      setTimeout(() => setError(null), 5000);
    } catch (err) {
      setError(`Export failed: ${String(err)}`);
@@ -6,7 +6,6 @@ import {
  CardTitle,
  CardContent,
  Badge,
  Separator,
} from "@/components/ui";
import { getAuditLogCmd, type AuditEntry } from "@/lib/tauriCommands";
import { useSettingsStore } from "@/stores/settingsStore";

@@ -15,6 +15,7 @@ import {
  updateIssueCmd,
  addFiveWhyCmd,
} from "@/lib/tauriCommands";
import { getDomainPrompt } from "@/lib/domainPrompts";
import type { TriageMessage } from "@/lib/tauriCommands";

const CLOSE_PATTERNS = [

@@ -167,7 +168,8 @@ export default function Triage() {
    setPendingFiles([]);

    try {
      const response = await chatMessageCmd(id, aiMessage, provider);
      const systemPrompt = currentIssue ? getDomainPrompt(currentIssue.category) : undefined;
      const response = await chatMessageCmd(id, aiMessage, provider, systemPrompt);
      const assistantMsg: TriageMessage = {
        id: `asst-${Date.now()}`,
        issue_id: id,
@@ -1,4 +1,4 @@
import { waitForApp, clickByText } from "../helpers/app";
import { waitForApp } from "../helpers/app";

describe("Log Upload Flow", () => {
  before(async () => {

@@ -1,5 +1,5 @@
import { join } from "path";
import { spawn, spawnSync } from "child_process";
import { spawn } from "child_process";
import type { Options } from "@wdio/types";

// Path to the tauri-driver binary

@@ -1,5 +1,5 @@
import { describe, it, expect, beforeEach, vi } from "vitest";
import { render, screen, fireEvent } from "@testing-library/react";
import { render, screen } from "@testing-library/react";
import Security from "@/pages/Settings/Security";
import * as tauriCommands from "@/lib/tauriCommands";
@@ -42,11 +42,8 @@ describe("Audit Log", () => {
  it("displays audit entries", async () => {
    render(<Security />);

    // Wait for audit log to load
    await screen.findByText("Audit Log");

    // Check that the table has rows (header + data rows)
    const table = screen.getByRole("table");
    // Wait for table to appear after async audit data loads
    const table = await screen.findByRole("table");
    expect(table).toBeInTheDocument();

    const rows = screen.getAllByRole("row");

@@ -56,9 +53,7 @@ describe("Audit Log", () => {
  it("provides way to view transmitted data details", async () => {
    render(<Security />);

    await screen.findByText("Audit Log");

    // Should have View/Hide buttons for expanding details
    // Wait for async data to load and render the table
    const viewButtons = await screen.findAllByRole("button", { name: /View/i });
    expect(viewButtons.length).toBeGreaterThan(0);
  });

@@ -66,14 +61,13 @@ describe("Audit Log", () => {
  it("details column or button exists for viewing data", async () => {
    render(<Security />);

    await screen.findByText("Audit Log");
    // Wait for async data to load and render the table
    await screen.findByRole("table");

    // The audit log should have a Details column header
    const detailsHeader = screen.getByText("Details");
    expect(detailsHeader).toBeInTheDocument();

    // Should have view buttons
    const viewButtons = await screen.findAllByRole("button", { name: /View/i });
    const viewButtons = screen.getAllByRole("button", { name: /View/i });
    expect(viewButtons.length).toBe(2); // One for each mock entry
  });
});
@@ -129,8 +129,12 @@ describe("build-images.yml workflow", () => {
    expect(wf).toContain("trcaa-linux-arm64:rust1.88-node22");
  });

  it("uses docker:24-cli image for build jobs", () => {
    expect(wf).toContain("docker:24-cli");
  it("uses alpine:latest with docker-cli (not docker:24-cli which triggers duplicate socket mount in act_runner)", () => {
    // act_runner v0.3.1 special-cases docker:* images and adds the socket bind;
    // combined with its global socket bind this causes a 'Duplicate mount point' error.
    expect(wf).toContain("alpine:latest");
    expect(wf).toContain("docker-cli");
    expect(wf).not.toContain("docker:24-cli");
  });

  it("runs all three build jobs on linux-amd64 runner", () => {

@@ -1,5 +1,6 @@
import { describe, it, expect, beforeEach, vi } from "vitest";
import { render, screen } from "@testing-library/react";
import { render } from "@testing-library/react";
import { screen } from "@testing-library/react";
import { MemoryRouter } from "react-router-dom";
import Dashboard from "@/pages/Dashboard";
import { useHistoryStore } from "@/stores/historyStore";
tests/unit/domainPrompts.test.ts (new file, 63 lines)
@@ -0,0 +1,63 @@
import { describe, it, expect } from "vitest";
import { getDomainPrompt, DOMAINS, INCIDENT_RESPONSE_FRAMEWORK } from "@/lib/domainPrompts";

describe("Domain Prompts with Incident Response Framework", () => {
  it("exports INCIDENT_RESPONSE_FRAMEWORK constant", () => {
    expect(INCIDENT_RESPONSE_FRAMEWORK).toBeDefined();
    expect(typeof INCIDENT_RESPONSE_FRAMEWORK).toBe("string");
    expect(INCIDENT_RESPONSE_FRAMEWORK.length).toBeGreaterThan(100);
  });

  it("framework contains all 5 phases", () => {
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Phase 1: Detection & Evidence Gathering");
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Phase 2: Diagnosis & Hypothesis Testing");
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Phase 3: Root Cause Analysis with 5-Whys");
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Phase 4: Resolution & Prevention");
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Phase 5: Post-Incident Review");
  });

  it("framework contains the 3-Fix Rule", () => {
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("3-Fix Rule");
  });

  it("framework contains communication practices", () => {
    expect(INCIDENT_RESPONSE_FRAMEWORK).toContain("Communication Practices");
  });

  it("all defined domains include incident response methodology", () => {
    for (const domain of DOMAINS) {
      const prompt = getDomainPrompt(domain.id);
      if (prompt) {
        expect(prompt).toContain("INCIDENT RESPONSE METHODOLOGY");
        expect(prompt).toContain("Phase 1:");
        expect(prompt).toContain("Phase 5:");
      }
    }
  });

  it("returns empty string for unknown domain", () => {
    expect(getDomainPrompt("nonexistent_domain")).toBe("");
    expect(getDomainPrompt("")).toBe("");
  });

  it("preserves existing Linux domain content", () => {
    const prompt = getDomainPrompt("linux");
    expect(prompt).toContain("senior Linux systems engineer");
    expect(prompt).toContain("RHEL");
    expect(prompt).toContain("INCIDENT RESPONSE METHODOLOGY");
  });

  it("preserves existing Kubernetes domain content", () => {
    const prompt = getDomainPrompt("kubernetes");
    expect(prompt).toContain("Kubernetes platform engineer");
    expect(prompt).toContain("k3s");
    expect(prompt).toContain("INCIDENT RESPONSE METHODOLOGY");
  });

  it("preserves existing Network domain content", () => {
    const prompt = getDomainPrompt("network");
    expect(prompt).toContain("network engineer");
    expect(prompt).toContain("Fortigate");
    expect(prompt).toContain("INCIDENT RESPONSE METHODOLOGY");
  });
});
@@ -1,5 +1,5 @@
import { describe, it, expect, beforeEach, vi } from "vitest";
import { render, screen, fireEvent } from "@testing-library/react";
import { render, screen } from "@testing-library/react";
import { MemoryRouter } from "react-router-dom";
import History from "@/pages/History";
import { useHistoryStore } from "@/stores/historyStore";

@@ -44,11 +44,13 @@ describe("auto-tag release cross-platform artifact handling", () => {
    expect(workflow).toContain("UPLOAD_NAME=\"linux-arm64-$NAME\"");
  });

  it("uses Ubuntu 22.04 with ports mirror for arm64 cross-compile", () => {
  it("uses pre-baked Ubuntu 22.04 cross-compiler image for arm64", () => {
    const workflow = readFileSync(autoTagWorkflowPath, "utf-8");

    expect(workflow).toContain("ubuntu:22.04");
    expect(workflow).toContain("ports.ubuntu.com/ubuntu-ports");
    expect(workflow).toContain("jammy");
    // Multiarch ubuntu:22.04 + ports mirror setup moved to pre-baked image;
    // verify workflow references the correct image and cross-compile env vars.
    expect(workflow).toContain("trcaa-linux-arm64:rust1.88-node22");
    expect(workflow).toContain("CC_aarch64_unknown_linux_gnu: aarch64-linux-gnu-gcc");
    expect(workflow).toContain("aarch64-unknown-linux-gnu");
  });
});
@@ -1,5 +1,6 @@
import { describe, it, expect, beforeEach, vi } from "vitest";
import { render, screen } from "@testing-library/react";
import { render } from "@testing-library/react";
import { screen } from "@testing-library/react";
import { MemoryRouter, Route, Routes } from "react-router-dom";
import Resolution from "@/pages/Resolution";
import * as tauriCommands from "@/lib/tauriCommands";

@@ -21,6 +22,7 @@ const mockIssueDetail = {
    tags: "[]",
  },
  log_files: [],
  image_attachments: [],
  resolution_steps: [
    {
      id: "step-1",

@@ -33,6 +35,7 @@ const mockIssueDetail = {
    },
  ],
  conversations: [],
  timeline_events: [],
};

describe("Resolution Page", () => {

@@ -32,6 +32,7 @@ vi.mock("@tauri-apps/plugin-fs", () => ({
  exists: vi.fn(() => Promise.resolve(false)),
}));

// Mock console.error to suppress React warnings
const originalError = console.error;
beforeAll(() => {
  console.error = (...args: unknown[]) => {
tests/unit/timelineEvents.test.ts (new file, 54 lines)
@@ -0,0 +1,54 @@
import { describe, it, expect, vi, beforeEach } from "vitest";
import { invoke } from "@tauri-apps/api/core";

const mockInvoke = vi.mocked(invoke);

describe("Timeline Event Commands", () => {
  beforeEach(() => {
    mockInvoke.mockReset();
  });

  it("addTimelineEventCmd calls invoke with correct params", async () => {
    const mockEvent = {
      id: "te-1",
      issue_id: "issue-1",
      event_type: "triage_started",
      description: "Started",
      metadata: "{}",
      created_at: "2025-01-15 10:00:00 UTC",
    };
    mockInvoke.mockResolvedValueOnce(mockEvent as never);

    const { addTimelineEventCmd } = await import("@/lib/tauriCommands");
    const result = await addTimelineEventCmd("issue-1", "triage_started", "Started");
    expect(mockInvoke).toHaveBeenCalledWith("add_timeline_event", {
      issueId: "issue-1",
      eventType: "triage_started",
      description: "Started",
      metadata: null,
    });
    expect(result).toEqual(mockEvent);
  });

  it("addTimelineEventCmd passes metadata when provided", async () => {
    mockInvoke.mockResolvedValueOnce({} as never);

    const { addTimelineEventCmd } = await import("@/lib/tauriCommands");
    await addTimelineEventCmd("issue-1", "log_uploaded", "File uploaded", '{"file":"app.log"}');
    expect(mockInvoke).toHaveBeenCalledWith("add_timeline_event", {
      issueId: "issue-1",
      eventType: "log_uploaded",
      description: "File uploaded",
      metadata: '{"file":"app.log"}',
    });
  });

  it("getTimelineEventsCmd calls invoke with correct params", async () => {
    mockInvoke.mockResolvedValueOnce([] as never);

    const { getTimelineEventsCmd } = await import("@/lib/tauriCommands");
    const result = await getTimelineEventsCmd("issue-1");
    expect(mockInvoke).toHaveBeenCalledWith("get_timeline_events", { issueId: "issue-1" });
    expect(result).toEqual([]);
  });
});
@@ -1,56 +0,0 @@
# Fix: build-linux-arm64 — Switch to Ubuntu 22.04 with ports mirror

## Description

The `build-linux-arm64` CI job failed repeatedly with
`E: Unable to correct problems, you have held broken packages` during the
Install dependencies step. Root cause: `rust:1.88-slim` (Debian Bookworm) uses a single
mirror for all architectures. When both `[arch=amd64]` and `[arch=arm64]` entries point at
the same Debian repo, apt's dependency resolver hits unavoidable conflicts — the `binary-all`
package index is duplicated and certain `-dev` package pairs cannot be co-installed because
they lack `Multi-Arch: same`. This is a structural Debian single-mirror multiarch limitation
that cannot be fixed by tweaking `sources.list`.

Ubuntu 22.04 solves this by routing arm64 through a separate mirror:
`ports.ubuntu.com/ubuntu-ports`. amd64 and arm64 packages come from entirely different repos,
eliminating all cross-arch index overlaps and resolution conflicts.
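The mirror split described above can be sketched in a few shell commands run against a temporary copy of `sources.list`. The mirror URLs and the `jammy` suite come from this ticket; the file paths are placeholders, not the actual workflow step:

```shell
# Minimal sketch of the amd64/arm64 mirror split (run on a temp copy, not /etc/apt).
workdir=$(mktemp -d)
cat > "$workdir/sources.list" <<'EOF'
deb http://archive.ubuntu.com/ubuntu jammy main universe
deb http://archive.ubuntu.com/ubuntu jammy-updates main universe
EOF

# Pin every existing entry to amd64 so apt stops looking for arm64 indexes there.
sed -i 's/^deb /deb [arch=amd64] /' "$workdir/sources.list"

# Route arm64 through the separate ports mirror instead.
cat > "$workdir/arm64-ports.list" <<'EOF'
deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy main universe
deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy-updates main universe
EOF

cat "$workdir/sources.list"
```

In the real job this is followed by `dpkg --add-architecture arm64` and `apt-get update`, so each architecture resolves against its own index.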

## Acceptance Criteria

- `build-linux-arm64` Install dependencies step completes without apt errors
- `ubuntu:22.04` is the container image for the arm64 job
- Ubuntu's `ports.ubuntu.com/ubuntu-ports` is used for arm64 packages
- `libayatana-appindicator3-dev:arm64` is removed (no tray icon in this app)
- Rust is installed via `rustup` (not pre-installed in Ubuntu base)
- All 51 frontend tests pass
- YAML is syntactically valid

## Work Implemented

### `.gitea/workflows/auto-tag.yml`

- **Container**: `rust:1.88-slim` → `ubuntu:22.04` for `build-linux-arm64` job
- **Install dependencies step**: Full replacement
  - Step 1: Host tools + aarch64 cross-compiler (amd64 packages, installed before multiarch registration)
  - Step 2: Register arm64 architecture; `sed` existing `sources.list` entries to `[arch=amd64]`; add `arm64-ports.list` pointing at `ports.ubuntu.com/ubuntu-ports jammy`
  - Step 3: ARM64 dev libs (`libwebkit2gtk-4.1-dev`, `libssl-dev`, `libgtk-3-dev`, `librsvg2-dev`) — `libayatana-appindicator3-dev:arm64` removed
  - Step 4: Node.js via NodeSource
  - Step 5: Rust 1.88.0 via `rustup --no-modify-path`; `$HOME/.cargo/bin` appended to `$GITHUB_PATH`
- **Build step**: Added `source "$HOME/.cargo/env"` as first line (belt-and-suspenders for Rust PATH)

### `tests/unit/releaseWorkflowCrossPlatformArtifacts.test.ts`

- Added new test: `"uses Ubuntu 22.04 with ports mirror for arm64 cross-compile"` — asserts workflow contains `ubuntu:22.04`, `ports.ubuntu.com/ubuntu-ports`, and `jammy`
- All previously passing assertions continue to pass (build step env vars and upload paths unchanged)

### `docs/wiki/CICD-Pipeline.md`

- `build-linux-arm64` job entry now mentions Ubuntu 22.04 + ports mirror
- New Known Issue entry: **Debian Multiarch Breaks arm64 Cross-Compile** — documents the root cause and the Ubuntu 22.04 fix for future reference

## Testing Needed

- [x] YAML validation: `python3 -c "import yaml; yaml.safe_load(open('.gitea/workflows/auto-tag.yml'))" && echo OK` — **PASSED**
- [x] Frontend tests: `npm run test:run` — **51/51 PASSED** (50 existing + 1 new)
- [ ] CI integration: Push branch → merge PR → observe `build-linux-arm64` Install dependencies step completes without `held broken packages` error
- [ ] Verify arm64 `.deb`, `.rpm`, `.AppImage` artifacts are uploaded to the Gitea release
ticket-git-cliff-changelog.md (new file, 74 lines)
@@ -0,0 +1,74 @@
# feat: Automated Changelog via git-cliff

## Description

Introduces automated changelog generation using **git-cliff**, a tool that parses
conventional commits and produces formatted Markdown changelogs.

Previously, every Gitea release body contained only the static text `"Release vX.Y.Z"`.
With this change, releases display a categorised, human-readable list of all commits
since the previous version.

**Root cause / motivation:** No changelog tooling existed. The project follows
Conventional Commits throughout, but the information was never surfaced to end-users.
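With grouping in place, a generated release section might look like the following. The version, date, and entries here are invented purely to illustrate the shape of the output, not real project history:

```markdown
## [0.2.50] - 2025-01-15

### Features
- add image attachments to the log upload page

### Bug Fixes
- switch arm64 build to the pre-baked cross-compiler image
```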

**Files changed:**
- `cliff.toml` (new) — git-cliff configuration; defines commit parsers, ignored tags,
  output template, and which commit types appear in the changelog
- `CHANGELOG.md` (new) — bootstrapped from all existing tags; maintained by CI going forward
- `.gitea/workflows/auto-tag.yml` — new `changelog` job that runs after `autotag`
- `docs/wiki/CICD-Pipeline.md` — "Changelog Generation" section added

## Acceptance Criteria

- [ ] `cliff.toml` present at repo root with working Tera template
- [ ] `CHANGELOG.md` present at repo root, bootstrapped from all existing semver tags
- [ ] `changelog` job in `auto-tag.yml` runs after `autotag` (parallel with build jobs)
- [ ] Each Gitea release body shows grouped conventional-commit entries instead of
  static `"Release vX.Y.Z"`
- [ ] `CHANGELOG.md` committed to master on every release with `[skip ci]` suffix
  (no infinite re-trigger loop)
- [ ] `CHANGELOG.md` uploaded as a downloadable release asset
- [ ] CI/chore/build/test/style commits excluded from changelog output
- [ ] `docs/wiki/CICD-Pipeline.md` documents the changelog generation process
## Work Implemented

### `cliff.toml`
- Tera template with proper whitespace control (`-%}` / `{%- `) for clean output
- Included commit types: `feat`, `fix`, `perf`, `docs`, `refactor`
- Excluded commit types: `ci`, `chore`, `build`, `test`, `style`
- `ignore_tags = "rc|alpha|beta"` — pre-release tags excluded from version boundaries
- `filter_unconventional = true` — non-conventional commits dropped silently
- `sort_commits = "oldest"` — chronological order within each version

### `CHANGELOG.md`
- Bootstrapped locally using git-cliff v2.7.0 (aarch64 musl binary)
- Covers all tagged versions from `v0.1.0` through `v0.2.49` plus `[Unreleased]`
- 267 lines covering the full project history

### `.gitea/workflows/auto-tag.yml` — `changelog` job
- `needs: autotag` — waits for the new tag to exist before running
- Full history clone: `git fetch --tags --depth=2147483647` so git-cliff can resolve
  all version boundaries
- git-cliff v2.7.0 downloaded as a static x86_64 musl binary (~5 MB); no custom
  image required
- Generates full `CHANGELOG.md` and per-release notes (`--latest --strip all`)
- PATCHes the Gitea release body via API with JSON-safe escaping (`jq -Rs .`)
- Commits `CHANGELOG.md` to master with `[skip ci]` to prevent workflow re-trigger
- Deletes any existing `CHANGELOG.md` asset before re-uploading (rerun-safe)
- Runs in parallel with all build jobs — no added wall-clock latency
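The job bullets above can be sketched as a minimal Gitea Actions job. Step names, environment variable names, and the release API path are illustrative assumptions, not a copy of the actual workflow:

```yaml
changelog:
  needs: autotag
  runs-on: linux-amd64
  steps:
    - uses: actions/checkout@v4
    - name: Fetch full history so git-cliff sees every tag boundary
      run: git fetch --tags --depth=2147483647
    - name: Generate changelog and latest release notes
      run: |
        ./git-cliff -o CHANGELOG.md
        ./git-cliff --latest --strip all -o release-notes.md
    - name: Patch the release body (JSON-safe via jq -Rs)
      run: |
        BODY=$(jq -Rs . < release-notes.md)
        curl -sf -X PATCH "$GITEA_URL/api/v1/repos/$REPO/releases/$RELEASE_ID" \
          -H "Authorization: token $GITEA_TOKEN" \
          -H "Content-Type: application/json" \
          -d "{\"body\": $BODY}"
```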

### `docs/wiki/CICD-Pipeline.md`
- Added "Changelog Generation" section before "Known Issues & Fixes"
- Describes the five-step process, cliff.toml settings, and loop prevention mechanism

## Testing Needed

- [ ] Merge this PR to master; verify `changelog` CI job succeeds in Gitea Actions
- [ ] Check Gitea release body for the new version tag — should show grouped commit list
- [ ] Verify `CHANGELOG.md` was committed to master (check git log after CI runs)
- [ ] Verify `CHANGELOG.md` appears as a downloadable asset on the release page
- [ ] Push a subsequent commit to master; confirm the `[skip ci]` CHANGELOG commit does
  NOT trigger a second run of `auto-tag.yml`
- [ ] Confirm CI/chore commits are absent from the release body
tickets/ci-runner-speed-optimization.md (new file, 107 lines)
@@ -0,0 +1,107 @@
# CI Runner Speed Optimization via Pre-baked Images + Caching

## Description

Every CI run (both `test.yml` and `auto-tag.yml`) installed system packages from scratch
on each job invocation: `apt-get update`, Tauri system libs, Node.js via NodeSource, and in
the arm64 job — a full `rustup` install. This was the primary cause of slow builds.

The repository already contains pre-baked builder Docker images (`.docker/Dockerfile.*`) and a
`build-images.yml` workflow to push them to the local Gitea registry at `172.0.0.29:3000`.
These images were never referenced by the actual CI jobs — a critical gap. This work closes
that gap and adds `actions/cache@v3` for Cargo and npm.
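Referencing a pre-baked image from a job is a one-line container setting. The fragment below is a hypothetical sketch of the shape this ticket describes, not a copy of the real workflow:

```yaml
rust-tests:
  runs-on: linux-amd64
  container:
    # Pre-baked image already contains Rust 1.88, Node 22, git, and the Tauri system libs,
    # so the job skips apt-get entirely.
    image: 172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22
```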

## Acceptance Criteria

- [ ] `Dockerfile.linux-amd64` includes `rustfmt` and `clippy` components
- [ ] `Dockerfile.linux-arm64` includes `rustfmt` and `clippy` components
- [ ] `test.yml` Rust jobs use `172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22`
- [ ] `test.yml` Rust jobs have no inline `apt-get` or `rustup component add` steps
- [ ] `test.yml` Rust jobs include `actions/cache@v3` for `~/.cargo/registry`
- [ ] `test.yml` frontend jobs include `actions/cache@v3` for `~/.npm`
- [ ] `auto-tag.yml` `build-linux-amd64` uses pre-baked `trcaa-linux-amd64` image
- [ ] `auto-tag.yml` `build-windows-amd64` uses pre-baked `trcaa-windows-cross` image
- [ ] `auto-tag.yml` `build-linux-arm64` uses pre-baked `trcaa-linux-arm64` image
- [ ] All three build jobs have no `Install dependencies` step
- [ ] All three build jobs include `actions/cache@v3` for Cargo and npm
- [ ] `docs/wiki/CICD-Pipeline.md` documents pre-baked images, cache keys, and server prerequisites
- [ ] `build-images.yml` triggered manually before merging to ensure images exist in registry
## Work Implemented

### `.docker/Dockerfile.linux-amd64`
Added `RUN rustup component add rustfmt clippy` after the existing target add line.
The `rust-fmt-check` and `rust-clippy` CI jobs now rely on these being pre-installed
in the image rather than installing them at job runtime.

### `.docker/Dockerfile.linux-arm64`
Added `&& /root/.cargo/bin/rustup component add rustfmt clippy` appended to the
existing `rustup` installation RUN command (chained with `&&` to keep it one layer).

### `.gitea/workflows/test.yml`
- **rust-fmt-check**, **rust-clippy**, **rust-tests**: switched container image from
  `rust:1.88-slim` → `172.0.0.29:3000/sarman/trcaa-linux-amd64:rust1.88-node22`.
  Removed `apt-get install git` from Checkout steps (git is pre-installed in the image).
  Removed `apt-get install libwebkit2gtk-...` steps.
  Removed `rustup component add rustfmt` and `rustup component add clippy` steps.
  Added `actions/cache@v3` step for `~/.cargo/registry/index`, `~/.cargo/registry/cache`,
  `~/.cargo/git/db` keyed on `Cargo.lock` hash.
- **frontend-typecheck**, **frontend-tests**: kept `node:22-alpine` image (no change needed).
  Added `actions/cache@v3` step for `~/.npm` keyed on `package-lock.json` hash.
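The cache steps described above take roughly this shape in a workflow file. Step names and key prefixes are illustrative assumptions; the real workflow additionally appends per-platform suffixes for the cross-compile jobs:

```yaml
- name: Cache cargo registry
  uses: actions/cache@v3
  with:
    path: |
      ~/.cargo/registry/index
      ~/.cargo/registry/cache
      ~/.cargo/git/db
    key: cargo-${{ hashFiles('**/Cargo.lock') }}

- name: Cache npm downloads
  uses: actions/cache@v3
  with:
    path: ~/.npm
    key: npm-${{ hashFiles('package-lock.json') }}
```

Keying on the lockfile hash means the cache is invalidated exactly when dependencies change, which is why the second run after any given lockfile state sees cache hits.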

### `.gitea/workflows/auto-tag.yml`
- **build-linux-amd64**: image `rust:1.88-slim` → `trcaa-linux-amd64:rust1.88-node22`.
  Removed Checkout `apt-get install git`, removed entire Install dependencies step.
  Removed `rustup target add x86_64-unknown-linux-gnu` from Build step. Added cargo + npm cache.
- **build-windows-amd64**: image `rust:1.88-slim` → `trcaa-windows-cross:rust1.88-node22`.
  Removed Checkout `apt-get install git`, removed entire Install dependencies step.
  Removed `rustup target add x86_64-pc-windows-gnu` from Build step.
  Added cargo (with `-windows-` suffix key to avoid collision) + npm cache.
- **build-linux-arm64**: image `ubuntu:22.04` → `trcaa-linux-arm64:rust1.88-node22`.
  Removed Checkout `apt-get install git`, removed entire Install dependencies step (~40 lines).
  Removed `. "$HOME/.cargo/env"` (PATH already set via `ENV` in Dockerfile).
  Removed `rustup target add aarch64-unknown-linux-gnu` from Build step.
  Added cargo (with `-arm64-` suffix key) + npm cache.

### `docs/wiki/CICD-Pipeline.md`
Added two new sections before the Test Pipeline section:
- **Pre-baked Builder Images**: table of all three images and their contents, rebuild
  triggers, how-to-rebuild instructions, and the insecure-registries Docker daemon
  prerequisite for 172.0.0.29.
- **Cargo and npm Caching**: documents the `actions/cache@v3` key patterns in use,
  including the per-platform cache key suffixes for cross-compile jobs.

Updated the Test Pipeline section to reference the correct pre-baked image name.
Updated the Release Pipeline job table to show which image each build job uses.
## Testing Needed

1. **Pre-build images** (prerequisite): Trigger `build-images.yml` via `workflow_dispatch`
   on the Gitea Actions UI. Confirm all 3 images are pushed and visible in the registry.

2. **Server prerequisite**: Confirm `/etc/docker/daemon.json` on `172.0.0.29` contains
   `{"insecure-registries":["172.0.0.29:3000"]}` and Docker was restarted afterwards.

3. **PR test suite**: Open a PR with these changes. Verify:
   - All 5 test jobs pass (`rust-fmt-check`, `rust-clippy`, `rust-tests`,
     `frontend-typecheck`, `frontend-tests`)
   - Job logs show no `apt-get` or `rustup component add` output
   - Cache hit messages appear on the second run

4. **Release build**: Merge to master. Verify `auto-tag.yml` runs and:
   - All 3 Linux/Windows build jobs start without an Install dependencies step
   - Artifacts are produced and uploaded to the Gitea release
   - Total release time is significantly reduced (~7 min vs ~25 min before)

5. **Expected time savings after caching warms up**:

   | Job | Before | After |
   |-----|--------|-------|
   | rust-fmt-check | ~2 min | ~20 sec |
   | rust-clippy | ~4 min | ~45 sec |
   | rust-tests | ~5 min | ~1.5 min |
   | frontend-typecheck | ~2 min | ~30 sec |
   | frontend-tests | ~3 min | ~40 sec |
   | build-linux-amd64 | ~10 min | ~3 min |
   | build-windows-amd64 | ~12 min | ~4 min |
   | build-linux-arm64 | ~15 min | ~4 min |
   | PR test total (parallel) | ~5 min | ~1.5 min |
   | Release total | ~25 min | ~7 min |