Compare commits


75 Commits

Author SHA1 Message Date
9f730304cc Merge pull request 'fix(ci): remove explicit docker.sock mount — act_runner mounts it automatically' (#22) from fix/build-images-duplicate-socket into master
All checks were successful
Auto Tag / autotag (push) Successful in 1m39s
Auto Tag / build-macos-arm64 (push) Successful in 4m42s
Auto Tag / build-windows-amd64 (push) Successful in 16m15s
Auto Tag / build-linux-arm64 (push) Successful in 28m32s
Auto Tag / wiki-sync (push) Successful in 1m44s
Auto Tag / build-linux-amd64 (push) Successful in 26m57s
Reviewed-on: #22
2026-04-06 02:18:55 +00:00
Shaun Arman
f54d1aa6a8 fix(ci): remove explicit docker.sock mount — act_runner mounts it automatically 2026-04-05 21:18:11 -05:00
bff11dc847 Merge pull request 'feat(ci): add persistent pre-baked Docker builder images' (#21) from feat/persistent-ci-builders into master
Reviewed-on: #21
2026-04-06 02:15:36 +00:00
Shaun Arman
eb8a0531e6 feat(ci): add persistent pre-baked Docker builder images
Add three Dockerfiles under .docker/ and a build-images.yml workflow that
pushes them to the local Gitea container registry (172.0.0.29:3000).

Each image pre-installs all system deps, Node.js 22, and the Rust cross-
compilation target so release builds can skip apt-get entirely:

  trcaa-linux-amd64:rust1.88-node22   — webkit2gtk, gtk3, all Tauri deps
  trcaa-windows-cross:rust1.88-node22 — mingw-w64, nsis, Windows target
  trcaa-linux-arm64:rust1.88-node22   — arm64 multiarch dev libs, Rust 1.88

build-images.yml triggers automatically when .docker/ changes on master
and supports workflow_dispatch for manual/first-time builds.

auto-tag.yml is NOT changed in this commit — switch it to use the new
images in the follow-up PR (after images are pushed to the registry).

One-time server setup required before first use:
  echo '{"insecure-registries":["172.0.0.29:3000"]}' \
    | sudo tee /etc/docker/daemon.json && sudo systemctl restart docker
2026-04-05 21:07:17 -05:00
bf6e589b3c Merge pull request 'feat(ui): UI fixes, theme toggle, PII persistence, Ollama install instructions' (#20) from feat/ui-fixes-ollama-bundle-theme into master
Reviewed-on: #20
2026-04-06 01:54:36 +00:00
Shaun Arman
9175faf0b4 refactor(ollama): remove download/install buttons — show plain install instructions only 2026-04-05 20:53:57 -05:00
Shaun Arman
0796297e8c fix(ci): remove all Ollama bundle download steps — use UI download button instead 2026-04-05 20:53:57 -05:00
Shaun Arman
809c4041ea fix(ci): skip Ollama download on macOS build — runner has no access to GitHub binary assets 2026-04-05 20:53:57 -05:00
180ca74ec2 Merge pull request 'feat(ui): fix model dropdown, auth prefill, PII persistence, theme toggle, Ollama bundle' (#19) from feat/ui-fixes-ollama-bundle-theme into master
Reviewed-on: #19
2026-04-06 01:12:34 +00:00
Shaun Arman
2d02cfa9e8 style: apply cargo fmt to install_ollama_from_bundle 2026-04-05 19:41:59 -05:00
Shaun Arman
dffd26a6fd fix(security): add path canonicalization and actionable permission error in install_ollama_from_bundle 2026-04-05 19:34:47 -05:00
Shaun Arman
fc50fe3102 test(store): add PII pattern persistence tests for settingsStore 2026-04-05 19:33:23 -05:00
Shaun Arman
215c0ae218 feat(ui): fix model dropdown, auth prefill, PII persistence, theme toggle, and Ollama bundle
- AIProviders: hide top model row when custom_rest active (dropdown lower in form handles it);
  clear auth header prefill on format switch; rename User ID / CORE ID → Email Address
- Dashboard + Ollama: add border-border/bg-card classes to Refresh buttons for dark-bg contrast
- Security + settingsStore: wire PII toggle state to persisted Zustand store so pattern
  selections survive app restarts
- App: add Sun/Moon theme toggle button to sidebar footer (always visible when collapsed)
- system.rs: add install_ollama_from_bundle command (copies bundled binary to /usr/local/bin)
- auto-tag.yml: add Download Ollama step to all 4 platform build jobs with SHA256 verification
- tauri.conf.json: add resources/ollama/* to bundle resources
- docs: add install_ollama_from_bundle to IPC-Commands wiki

Security: CI download steps verify SHA256 against Ollama's published sha256sums.txt before bundling.
2026-04-05 19:30:41 -05:00
a40bc2304f Merge pull request 'feat(rebrand): rename binary to trcaa and auto-generate DB key' (#18) from feat/rebrand-binary-trcaa into master
Reviewed-on: #18
2026-04-05 23:17:05 +00:00
Shaun Arman
d87b01b154 feat(rebrand): rename binary to trcaa and auto-generate DB key
- Rename Cargo package from 'tftsr' to 'trcaa' — installed command
  becomes 'trcaa' instead of 'tftsr'
- Update app data directories to ~/.local/share/trcaa (Linux),
  ~/Library/Application Support/trcaa (macOS), %APPDATA%/trcaa (Windows)
- Update bundle identifier to com.trcaa.app
- Auto-generate per-installation DB encryption key on first launch and
  persist to <data_dir>/.dbkey (mode 0600 on Unix) — removes the hard
  requirement for TFTSR_DB_KEY to be set before the app will start
2026-04-05 17:50:16 -05:00
b734991932 Merge pull request 'fix(ci): restrict arm64 bundles to deb,rpm — skip AppImage' (#17) from fix/arm64-skip-appimage into master
Reviewed-on: #17
2026-04-05 22:04:51 +00:00
Shaun Arman
73a4c71196 fix(ci): restrict arm64 bundles to deb,rpm — skip AppImage
linuxdeploy-aarch64.AppImage cannot be reliably executed in a cross-
compile context (amd64 host, aarch64 target) even with QEMU binfmt
and APPIMAGE_EXTRACT_AND_RUN. The .deb and .rpm cover all major arm64
Linux distros. An arm64 AppImage can be added later via a native
arm64 build job if required.
2026-04-05 17:02:20 -05:00
a3c9a5a710 Merge pull request 'fix(ci): set APPIMAGE_EXTRACT_AND_RUN=1 for arm64 AppImage bundling' (#16) from fix/arm64-appimage-fuse into master
Reviewed-on: #16
2026-04-05 20:57:02 +00:00
Shaun Arman
acccab4235 fix(ci): set APPIMAGE_EXTRACT_AND_RUN=1 for arm64 AppImage bundling
linuxdeploy and its plugins are themselves AppImages. Inside a Docker
container FUSE is unavailable, so they cannot self-mount. Setting
APPIMAGE_EXTRACT_AND_RUN=1 causes them to extract to a temp directory
and run directly, bypassing the FUSE requirement.
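A minimal sketch of the workaround as it would appear in the bundling step (the linuxdeploy file name and invocation are illustrative, not taken from the workflow):

```shell
# Exported before invoking any AppImage-packaged tool so the AppImage
# runtime unpacks itself to a temp dir instead of FUSE-mounting:
export APPIMAGE_EXTRACT_AND_RUN=1
# e.g. ./linuxdeploy-aarch64.AppImage --appdir AppDir   (illustrative)
```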
2026-04-05 15:56:09 -05:00
d6701cb51a Merge pull request 'fix(ci): add make to arm64 host tools for OpenSSL vendored build' (#15) from fix/arm64-missing-make into master
Reviewed-on: #15
2026-04-05 20:10:50 +00:00
Shaun Arman
7ecf66a8cd fix(ci): add make to arm64 host tools for OpenSSL vendored build
openssl-src compiles OpenSSL from source and requires make.
The old Debian image had it; it was not carried over to the
Ubuntu 22.04 host tools list.
2026-04-05 15:09:22 -05:00
fdbcee9fbd Merge pull request 'fix(ci): use POSIX dot instead of source in arm64 build step' (#14) from fix/arm64-source-sh into master
Reviewed-on: #14
2026-04-05 19:42:49 +00:00
Shaun Arman
5546f9f615 fix(ci): use POSIX dot instead of source in arm64 build step
The act runner executes run: blocks with sh (dash), not bash.
'source' is a bash built-in; POSIX sh uses '.' instead.
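The difference can be demonstrated with a throwaway env file (file path and variable name are illustrative):

```shell
#!/bin/sh
# POSIX sh (dash) has no 'source' built-in; '.' reads a file into the
# current shell instead.
echo 'DEMO_VAR=hello; export DEMO_VAR' > /tmp/env-demo.sh
. /tmp/env-demo.sh          # works in dash and bash alike
echo "$DEMO_VAR"            # prints: hello
```

Under bash, `source /tmp/env-demo.sh` also works, which is why the mistake only surfaced once the runner executed the step with dash.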

Co-Authored-By: fix/arm64-source-sh <noreply@local>
2026-04-05 14:41:18 -05:00
3f76818a47 Merge pull request 'fix(ci): remove GITHUB_PATH append that was breaking arm64 install step' (#13) from fix/arm64-github-path into master
Reviewed-on: #13
2026-04-05 19:06:01 +00:00
Shaun Arman
eb4cf59192 fix(ci): remove GITHUB_PATH append that was breaking arm64 install step
$GITHUB_PATH is unset in this Gitea Actions environment, causing the
echo redirect to fail with a non-zero exit, which killed the Install
dependencies step before the Build step could run.

The append was unnecessary — the Build step already sources
$HOME/.cargo/env as its first line, which puts Cargo's bin dir in PATH.
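The failure mode is easy to reproduce in plain sh (the variable is intentionally unset here):

```shell
#!/bin/sh
# When GITHUB_PATH is unset, the redirect target expands to an empty
# string; the shell cannot open it, so the command exits non-zero.
unset GITHUB_PATH
if echo "$HOME/.cargo/bin" >> "$GITHUB_PATH" 2>/dev/null; then
  echo "append ok"
else
  echo "append failed"    # this branch is taken
fi
```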

Co-Authored-By: fix/yaml-heredoc-indent <noreply@local>
2026-04-05 14:04:32 -05:00
e6d5a7178b Merge pull request 'fix(ci): switch build-linux-arm64 to Ubuntu 22.04 with ports mirror' (#12) from fix/yaml-heredoc-indent into master
Reviewed-on: #12
2026-04-05 18:15:16 +00:00
Shaun Arman
81442be1bd docs: update CI pipeline wiki and add ticket summary for arm64 fix
Documents the Ubuntu 22.04 + ports.ubuntu.com approach for arm64
cross-compilation and adds a Known Issues entry explaining the Debian
single-mirror multiarch root cause that was replaced.

Co-Authored-By: fix/yaml-heredoc-indent <noreply@local>
2026-04-05 12:51:30 -05:00
Shaun Arman
9188a63305 fix(ci): switch build-linux-arm64 to Ubuntu 22.04 with ports mirror
The Debian single-mirror multiarch approach causes irreconcilable
apt dependency conflicts when both amd64 and arm64 point at the same
repo: the binary-all index is duplicated and certain -dev package pairs
lack Multi-Arch: same. This produces "held broken packages" regardless
of sources.list tweaks.

Ubuntu 22.04 routes arm64 through ports.ubuntu.com/ubuntu-ports, a
separate mirror from archive.ubuntu.com (amd64). This eliminates all
cross-arch index overlaps. Rust is installed via rustup since it is not
pre-installed in the Ubuntu base image. libayatana-appindicator3-dev
is dropped — no tray icon is used by this application.
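A sketch of the per-arch entries this split-mirror approach produces (suite and component names assumed; written to a scratch path purely for illustration):

```shell
#!/bin/sh
# amd64 stays on archive.ubuntu.com while arm64 is pinned to
# ports.ubuntu.com, so apt never sees overlapping indexes for the
# two architectures.
cat <<'EOF' > /tmp/sources.list.demo
deb [arch=amd64] http://archive.ubuntu.com/ubuntu jammy main universe
deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports jammy main universe
EOF
grep -c 'arch=' /tmp/sources.list.demo    # prints: 2
```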

Co-Authored-By: fix/yaml-heredoc-indent <noreply@local>
2026-04-05 12:51:19 -05:00
bc9c7d5cd1 Merge pull request 'fix(ci): replace heredoc with printf in arm64 install step' (#11) from fix/yaml-heredoc-indent into master
Reviewed-on: #11
2026-04-05 17:12:11 +00:00
Shaun Arman
5ab00a3759 fix(ci): replace heredoc with printf in arm64 install step
YAML block scalars end when a line is found with less indentation than
the scalar's own indent level. The heredoc body was at column 0 while
the rest of the run: block was at column 10, causing Gitea's YAML parser
to reject the entire workflow file with:

  yaml: line 412: could not find expected ':'

This silently invalidated auto-tag.yml on every push to master since the
apt-sources commit was merged, which is why PR#9 and PR#10 merges produced
no action runs.

Fix: replace the heredoc with a printf that stays within the block scalar's
indentation so the YAML remains valid.
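A minimal contrast of the two forms (content illustrative): every printf argument can sit at the `run:` block's own indentation level, whereas the heredoc body had drifted to column 0 and terminated the YAML block scalar.

```shell
#!/bin/sh
# printf keeps each line inside the block scalar's indentation,
# unlike the column-0 heredoc body it replaced.
printf '%s\n' \
  'line one' \
  'line two' > /tmp/printf-demo.txt
cat /tmp/printf-demo.txt    # prints the two lines
```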
2026-04-05 12:11:12 -05:00
d676372487 Merge pull request 'fix(ci): add workflow_dispatch and concurrency guard to auto-tag' (#10) from fix/auto-tag-dispatch into master
Reviewed-on: #10
2026-04-05 17:06:09 +00:00
Shaun Arman
a04ba02424 fix(ci): add workflow_dispatch and concurrency guard to auto-tag
Gitea 1.22 silently drops a push event for a workflow when a run for that
same workflow+branch is already in progress. This caused the PR#9 merge to
master to produce no auto-tag run.

- workflow_dispatch: allows manual triggering via API when an event is dropped
- concurrency group (cancel-in-progress: false): causes Gitea to queue a second
  run rather than discard it when one is already active
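A sketch of what the two additions could look like in auto-tag.yml (the concurrency group name is assumed; the exact expression may differ):

```yaml
on:
  push:
    branches: [master]
  workflow_dispatch: {}       # manual trigger via API when a push event is dropped

concurrency:
  group: auto-tag-${{ github.ref }}
  cancel-in-progress: false   # queue a second run instead of discarding it
```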
2026-04-05 11:41:21 -05:00
2bc4cf60a0 Merge pull request 'fix(ci): rebuild apt sources with per-arch entries before arm64 cross-compile' (#9) from bug/build-failure into master
Reviewed-on: #9
2026-04-05 16:32:20 +00:00
Shaun Arman
15b69e2350 fix(ci): rebuild apt sources with per-arch entries before arm64 cross-compile install
rust:1.88-slim (Debian Bookworm) uses DEB822-format sources which have no arch
restriction. After dpkg --add-architecture arm64, apt tries to resolve deps for
both amd64 and arm64 simultaneously and hits 'held broken packages' conflicts on
shared -dev packages.

Fix: remove debian.sources and write a clean sources.list that pins amd64 repos
to [arch=amd64] and arm64 repos to [arch=arm64]. This gives apt a clear,
non-conflicting view of each architecture's package set.
2026-04-05 11:05:46 -05:00
1b26bf5214 Merge pull request 'security/audit' (#8) from security/audit into master
Reviewed-on: #8
2026-04-05 15:56:26 +00:00
Shaun Arman
cde4a85cc7 fix(ci): fix arm64 cross-compile, drop cargo install tauri-cli, move wiki-sync
build-linux-arm64: switch from QEMU-emulated linux-arm64 runner to cross-compile
on linux-amd64 using aarch64-linux-gnu toolchain. Removes the uname -m arch guard
that was causing the job to exit immediately (QEMU reports x86_64 as kernel arch),
and fixes the artifact path to the explicit target directory.

All build jobs: replace `cargo install tauri-cli --locked` with `npx tauri build`,
using the pre-compiled @tauri-apps/cli binary from devDependencies. Eliminates the
20-30 min Tauri CLI recompilation on every run.

wiki-sync: move from test.yml to auto-tag.yml. test.yml only fires on pull_request
events so the `if: github.ref == 'refs/heads/master'` guard was never true and the
wiki was never updated. auto-tag.yml triggers on push to master, so wiki sync now
runs on every merge.

Update releaseWorkflowCrossPlatformArtifacts.test.ts to match the new workflow.
2026-04-05 10:33:53 -05:00
3831ac0262 Merge branch 'master' into security/audit 2026-04-05 15:10:21 +00:00
Shaun Arman
abab5c3153 fix(security): enforce PII redaction before AI log transmission
analyze_logs() was reading the original log file from disk and sending its
full contents to external AI providers, completely bypassing the redaction
pipeline. The redacted flag in log_files and the .redacted file on disk were
written by apply_redactions() but never consulted on the read path.

Fix: query the redacted column alongside file_path. If the file has not been
redacted, return an error to the caller before any AI provider call is made.
When redacted, read from {path}.redacted instead of the original.

Adds redacted_path_for() helper and two unit tests covering the rejection
and happy-path cases.
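The read-path guard reduces to logic like this toy sh sketch (file names and the flag are illustrative; the real code queries the redacted column in SQLite):

```shell
#!/bin/sh
path=/tmp/demo.log
printf 'user@example.com\n' > "$path"
printf '[REDACTED]\n' > "$path.redacted"
redacted=1    # stands in for the redacted column in log_files

if [ "$redacted" -eq 1 ]; then
  cat "$path.redacted"              # read the redacted copy, never the original
else
  echo 'refusing: file has not been redacted' >&2
  exit 1
fi
```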
2026-04-05 10:08:16 -05:00
Shaun Arman
0a25ca7692 fix(pii): remove lookahead from hostname regex, fix fmt in analysis test
Rust's `regex` crate does not support lookaround assertions. The hostname
pattern `(?=.{1,253}\b)` caused a panic on every `PiiDetector::new()` call,
failing all four PII detector tests in CI (rust-fmt-check, rust-clippy,
rust-tests). Removed the lookahead; the remaining pattern correctly matches
valid FQDNs without the RFC 1035 length pre-check.

Also reformatted analysis.rs:253 to satisfy `rustfmt` (line break after `=`).

All 127 Rust tests pass and `cargo fmt --check` and `cargo clippy -- -D
warnings` are clean.
2026-04-05 09:59:19 -05:00
Shaun Arman
281e676ad1 fix(security): harden secret handling and audit integrity
Remove high-risk defaults and tighten data handling across auth, storage, IPC, provider calls, and capabilities so sensitive data is better protected by default. Also update README/wiki security guidance and add targeted tests for the new hardening behaviors.

Made-with: Cursor
2026-04-04 23:37:05 -05:00
Shaun Arman
10cccdc653 fix(ci): unblock release jobs and namespace linux artifacts by arch
Drop fragile job-condition gates that were blocking release jobs, and upload linux artifacts with arch-prefixed release asset names so amd64 and arm64 outputs can coexist even when bundle filenames are identical.

Made-with: Cursor
2026-04-04 23:19:40 -05:00
Shaun Arman
b1d794765f fix(ci): unblock release jobs and namespace linux artifacts by arch
Drop fragile job-condition gates that were blocking release jobs, and upload linux artifacts with arch-prefixed release asset names so amd64 and arm64 outputs can coexist even when bundle filenames are identical.

Made-with: Cursor
2026-04-04 23:17:12 -05:00
Shaun Arman
7b5f2daaa4 fix(ci): run linux arm release natively and enforce arm artifacts
Avoid cross-compiling GTK/glib on the arm release job by building natively on ARM64 hosts, add an explicit architecture guard, and restrict uploads to arm64/aarch64 artifact filenames so amd64 outputs cannot be published as arm releases.

Made-with: Cursor
2026-04-04 22:46:23 -05:00
Shaun Arman
aaa48d65a2 fix(ci): force explicit linux arm64 target for release artifacts
Build linux arm64 bundles with --target aarch64-unknown-linux-gnu and upload from the target-specific bundle path so arm64 releases cannot accidentally publish amd64 artifacts.

Made-with: Cursor
2026-04-04 22:15:02 -05:00
Shaun Arman
e20228da6f refactor(ci): remove standalone release workflow
Delete .gitea/workflows/release.yml and keep release orchestration in auto-tag.yml only, then update related workflow tests and docs to reference the unified pipeline.

Made-with: Cursor
2026-04-04 21:34:15 -05:00
Shaun Arman
2d2c62e4f5 fix(ci): repair auto-tag workflow yaml so jobs trigger
Replace heredoc-based Python error logging with single-line python invocations to keep YAML block indentation valid, restoring Gitea's ability to parse and trigger auto-tag plus downstream release build jobs.

Made-with: Cursor
2026-04-04 21:28:52 -05:00
Shaun Arman
b69c132a0a fix(ci): run post-tag release builds without job-output gating
Remove auto-tag job output dependencies and conditional gates so release build jobs always run after autotag completes, resolving skipped fan-out caused by output/if evaluation issues in Gitea Actions.

Made-with: Cursor
2026-04-04 21:24:24 -05:00
Shaun Arman
a6b4ed789c fix(ci): use stable auto-tag job outputs for release fanout
Rename the auto-tag job id to a non-hyphenated identifier and update needs/output references so dependent release jobs evaluate conditions correctly and reliably run after tagging.

Made-with: Cursor
2026-04-04 21:21:35 -05:00
Shaun Arman
93ead1362f fix(ci): guarantee release jobs run after auto-tag
Run linux/windows/macos/arm release build and upload jobs in the auto-tag workflow with needs:auto-tag outputs so release execution no longer depends on a second tag-triggered workflow dispatch path.

Made-with: Cursor
2026-04-04 21:19:13 -05:00
Shaun Arman
48041acc8c fix(ci): trigger release workflow from auto-tag pushes
Switch auto-tag to create and push tags via git instead of the tag API so Gitea emits a real tag push event that reliably starts release builds. Document the trigger behavior and add a workflow regression test.

Made-with: Cursor
2026-04-04 21:14:41 -05:00
42120cb140 Merge pull request 'fix(ci): harden release asset uploads for reruns' (#7) from fix/release-upload-rerun-hardening into master
Reviewed-on: #7
2026-04-05 02:10:54 +00:00
Shaun Arman
2d35e2a2c1 fix(ci): harden release asset uploads for reruns
Make all release upload steps fail fast when expected artifacts are missing, replace existing same-name assets before uploading, and print HTTP/body details on upload failures so Linux/Windows publishing issues are diagnosable and reruns remain deterministic.

Made-with: Cursor
2026-04-04 21:09:03 -05:00
b22d508f25 Merge pull request 'fix(ci): stabilize release artifacts for windows and linux' (#6) from fix/release-windows-openssl-linux-assets into master
Some checks failed
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Reviewed-on: #6
2026-04-05 01:21:31 +00:00
Shaun Arman
c3fd83f330 fix(ci): make release artifacts reliable across platforms
Override OpenSSL vendoring for the windows-gnu release build so cross-compiles no longer fail on pkg-config lookup, and fail fast when Linux release jobs produce no artifacts so incomplete releases are detected immediately.

Made-with: Cursor
2026-04-04 19:53:40 -05:00
4606fdd104 Merge pull request 'ci: run test workflow only on pull requests' (#5) from fix/pr4-clean-replacement into master
Some checks failed
Release / build-linux-arm64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Reviewed-on: #5
2026-04-05 00:14:07 +00:00
Shaun Arman
4e7a5b64ba ci: run test workflow only on pull requests
Avoid duplicate Test workflow executions by removing push triggers and keeping pull_request validation as the single gate. Also fix remaining clippy format string violations in integration modules to keep rust-clippy passing.

Made-with: Cursor
2026-04-04 18:52:13 -05:00
82c18871af Merge pull request 'fix/skip-master-test-workflow' (#3) from fix/skip-master-test-workflow into master
Some checks failed
Release / build-windows-amd64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Reviewed-on: #3
2026-04-04 21:48:47 +00:00
Shaun Arman
8e7356e62d ci: skip test workflow pushes on master
Avoid rerunning the full test workflow on direct master pushes while keeping pull request validation intact. Update the CI/CD wiki page to reflect the new trigger behavior.

Made-with: Cursor
2026-04-04 16:45:55 -05:00
Shaun Arman
b426f56149 fix: resolve macOS bundle path after app rename
Find the generated .app bundle dynamically in release CI so macOS packaging no longer depends on the legacy TFTSR.app name. Add a unit test to prevent regressions by asserting the old hardcoded path is not reintroduced.

Made-with: Cursor
2026-04-04 16:28:01 -05:00
f2531eb922 Merge pull request 'fix: resolve clippy uninlined_format_args (CI run 178)' (#2) from fix/clippy-uninlined-format-args into master
Some checks failed
Release / build-linux-arm64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Reviewed-on: #2
2026-04-04 21:08:52 +00:00
Shaun Arman
c4ea32e660 feat: add custom_rest provider mode and rebrand application name
Rename the custom API format identifier to custom_rest with backward compatibility for the previous name, add guided model selection with custom entry in provider settings, and rebrand app naming to Troubleshooting and RCA Assistant across UI, metadata, and docs.

Made-with: Cursor
2026-04-04 15:35:58 -05:00
Shaun Arman
0bc20f09f6 style: apply rustfmt output for clippy-related edits
Apply canonical rustfmt formatting in files touched by the clippy format-args cleanup so cargo fmt --check passes consistently in CI.

Made-with: Cursor
2026-04-04 15:10:17 -05:00
Shaun Arman
85a8d0a4c0 fix: resolve clippy format-args failures and OpenSSL vendoring issue
Inline format arguments across Rust modules to satisfy clippy -D warnings, and configure Cargo to prefer system OpenSSL so clippy builds do not fail on missing vendored Perl modules.

Made-with: Cursor
2026-04-04 15:05:13 -05:00
Shaun Arman
bdb63f3aee fix: resolve clippy uninlined_format_args in integrations and related modules
Replace format!("msg: {}", var) with format!("msg: {var}") across 8 files
to satisfy the uninlined_format_args lint (-D warnings) in CI run 178.

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
2026-04-04 12:27:26 -05:00
Shaun Arman
64492c743b fix: ARM64 build uses native target instead of cross-compile
Some checks failed
Release / build-macos-arm64 (push) Successful in 5m14s
Release / build-linux-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
The ARM64 build was failing because explicitly specifying
--target aarch64-unknown-linux-gnu on an ARM64 runner was
triggering cross-compilation logic.

Changes:
- Remove rustup target add (not needed for native build)
- Remove --target flag from cargo tauri build
- Update artifact path: target/aarch64-unknown-linux-gnu/release/bundle
  → target/release/bundle

This allows the native ARM64 toolchain to build without
attempting cross-compilation and avoids the pkg-config
cross-compilation configuration requirement.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-04-04 09:59:56 -05:00
Shaun Arman
a7903db904 fix: persist integration settings and implement persistent browser windows
Some checks failed
Release / build-macos-arm64 (push) Successful in 4m52s
Release / build-linux-amd64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
## Integration Settings Persistence
- Add database commands to save/load integration configs (base_url, username, project_name, space_key)
- Frontend now loads configs from DB on mount and saves changes automatically
- Fixes issue where settings were lost on app restart

## Persistent Browser Window Architecture
- Integration browser windows now stay open for user browsing and authentication
- Extract fresh cookies before each API call to handle token rotation
- Track open windows in app state (integration_webviews HashMap)
- Windows titled as "{Service} Browser (TFTSR)" for clarity
- Support easy navigation between app and browser windows (Cmd+Tab/Alt+Tab)
- Gracefully handle closed windows with automatic cleanup

## Bug Fixes
- Fix Rust formatting issues across 8 files
- Fix clippy warnings:
  - Use is_some_and() instead of map_or() in openai.rs
  - Use .to_string() instead of format!() in integrations.rs
- Add missing OptionalExtension import for .optional() method

## Tests
- Add test_integration_config_serialization
- Add test_webview_tracking
- Add test_token_auth_request_serialization
- All 6 integration tests passing

## Files Modified
- src-tauri/src/state.rs: Add integration_webviews tracking
- src-tauri/src/lib.rs: Register 3 new commands, initialize webviews HashMap
- src-tauri/src/commands/integrations.rs: Config persistence, fresh cookie extraction (+151 lines)
- src-tauri/src/integrations/webview_auth.rs: Persistent window behavior
- src/lib/tauriCommands.ts: TypeScript wrappers for new commands
- src/pages/Settings/Integrations.tsx: Load/save configs from DB

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-04-04 09:57:22 -05:00
Shaun Arman
fbce897608 feat: complete webview cookie extraction implementation
Some checks failed
Release / build-macos-arm64 (push) Successful in 5m4s
Release / build-windows-amd64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Implement working cookie extraction using Tauri's IPC event system:

**How it works:**
1. Opens embedded browser window for user to login
2. User completes authentication (including SSO)
3. User clicks "Complete Login" button in UI
4. JavaScript injected into webview extracts `document.cookie`
5. Parsed cookies emitted via Tauri event: `tftsr-cookies-extracted`
6. Rust listens for event and receives cookie data
7. Cookies encrypted and stored in database

**Technical implementation:**
- Uses `window.__TAURI__.event.emit()` from injected JavaScript
- Rust listens via `app_handle.listen()` with Listener trait
- 10-second timeout with clear error messages
- Handles empty cookies and JavaScript errors gracefully
- Cross-platform compatible (no platform-specific APIs)

**Cookie limitations:**
- `document.cookie` only exposes non-HttpOnly cookies
- HttpOnly session cookies won't be captured via JavaScript
- For HttpOnly cookies, services must provide API tokens as fallback

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-04-03 17:31:48 -05:00
Shaun Arman
32d83df3cf feat: add multi-mode authentication for integrations (v0.2.10)
Some checks failed
Release / build-windows-amd64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Implement three authentication methods for Confluence, ServiceNow, and Azure DevOps:

1. **OAuth2** - Traditional OAuth flow for enterprise SSO environments
2. **Embedded Browser** - Webview-based login that captures session cookies/tokens
   - Solves VPN constraints: users authenticate off-VPN via web UI
   - Extracted credentials work on-VPN for API calls
   - Based on confluence-publisher agent pattern
3. **Manual Token** - Direct API token/PAT input as fallback

**Changes:**
- Add webview_auth.rs module for embedded browser authentication
- Implement authenticate_with_webview and extract_cookies_from_webview commands
- Implement save_manual_token command with validation
- Add AuthMethod enum to support all three modes
- Add RadioGroup UI component for mode selection
- Complete rewrite of Integrations settings page with mode-specific UI
- Add secondary button variant for UI consistency

**VPN-friendly design:**
Users can authenticate via webview when off-VPN (web UI accessible), then use extracted cookies for API calls when on-VPN (API requires VPN). Addresses enterprise SSO limitations where OAuth app registration is blocked.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-04-03 17:26:09 -05:00
Shaun Arman
2c5e04a6ce feat: add temperature and max_tokens support for Custom REST providers (v0.2.9)
Some checks failed
Release / build-linux-amd64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
- Added max_tokens and temperature fields to ProviderConfig
- Custom REST providers now send modelConfig with temperature and max_tokens
- OpenAI-compatible providers now use configured max_tokens/temperature
- Both formats fall back to defaults if not specified
- Bumped version to 0.2.9

This allows users to configure response length and randomness for all
AI providers, including Custom REST providers which require modelConfig format.
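A hypothetical request body in the modelConfig shape described above (field names taken from the commit; the model and message values are invented):

```shell
#!/bin/sh
# Write and show an illustrative custom_rest request payload.
cat <<'EOF' > /tmp/modelconfig-demo.json
{
  "model": "example-model",
  "modelConfig": { "temperature": 0.2, "max_tokens": 1024 },
  "messages": [ { "role": "user", "content": "why did the service crash?" } ]
}
EOF
grep -o '"modelConfig"' /tmp/modelconfig-demo.json
```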
2026-04-03 17:08:34 -05:00
Shaun Arman
1d40dfb15b fix: use Wiki secret for authenticated wiki sync (v0.2.8)
Some checks failed
Release / build-macos-arm64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
- Updated wiki-sync job to use secrets.Wiki for authentication
- Simplified clone/push logic with token-based auth
- Wiki push will now succeed with proper credentials
- Bumped version to 0.2.8

The workflow now uses the 'Wiki' secret created in Gitea Actions
to authenticate wiki repository pushes. This fixes the authentication
issue that was preventing automatic wiki synchronization.
2026-04-03 16:47:32 -05:00
Shaun Arman
94b486b801 feat: add automatic wiki sync to CI workflow (v0.2.7)
- Added wiki-sync job to .gitea/workflows/test.yml
- Runs only on pushes to master branch
- Automatically copies docs/wiki/*.md to Gogs wiki repository
- Supports token-based authentication via secrets.GITHUB_TOKEN
- Handles wiki initialization if repository doesn't exist
- Bumped version to 0.2.7

Wiki sync will now automatically update the Gogs wiki at
https://gogs.tftsr.com/sarman/tftsr-devops_investigation/wiki
whenever docs/wiki/ files are modified on master.
2026-04-03 16:42:37 -05:00
Shaun Arman
5f9798a4fd docs: update wiki for v0.2.6 - integrations and Custom REST provider
Updated 5 wiki pages:

Home.md:
- Updated version to v0.2.6
- Added Custom REST provider and custom provider support to features
- Updated integration status from stubs to complete
- Updated release table with v0.2.3 and v0.2.6 highlights

Integrations.md:
- Complete rewrite: Changed from 'v0.2 stubs' to fully implemented
- Added detailed docs for Confluence REST API client (6 tests)
- Added detailed docs for ServiceNow REST API client (7 tests)
- Added detailed docs for Azure DevOps REST API client (6 tests)
- Documented OAuth2 PKCE flow implementation
- Added database schema for credentials and integration_config tables
- Added troubleshooting section with common OAuth/API errors

AI-Providers.md:
- Added section for Custom Provider (Custom REST provider)
- Documented Custom REST provider API format differences from OpenAI
- Added request/response format examples
- Added configuration instructions and troubleshooting
- Documented custom provider fields (api_format, custom_endpoint_path, etc)
- Added available Custom REST provider models list

IPC-Commands.md:
- Replaced 'v0.2 stubs' section with full implementation details
- Added OAuth2 commands (initiate_oauth, handle_oauth_callback)
- Added Confluence commands (5 functions)
- Added ServiceNow commands (5 functions)
- Added Azure DevOps commands (5 functions)
- Documented authentication storage with AES-256-GCM encryption
- Added common types (ConnectionResult, PublishResult, TicketResult)

Database.md:
- Updated migration count from 10 to 11
- Added migration 011: credentials and integration_config tables
- Documented AES-256-GCM encryption for OAuth tokens
- Added usage notes for OAuth2 vs basic auth storage
2026-04-03 16:39:49 -05:00
Shaun Arman
a42745b791 fix: add user_id support and OAuth shell permission (v0.2.6)
Some checks failed
Release / build-linux-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-macos-arm64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
Fixes:
- Added shell:allow-open permission to fix OAuth integration flows
- Added user_id field to ProviderConfig for Custom REST provider CORE ID
- Added UI field for user_id when api_format is custom_rest
- Made userId optional in Custom REST provider requests (only sent if provided)
- Added X-msi-genai-client header to Custom REST provider requests
- Updated CSP to include Custom REST provider domains
- Bumped version to 0.2.6

This fixes:
- OAuth error: 'Command plugin:shell|open not allowed by ACL'
- Missing User ID field in Custom REST provider configuration UI
2026-04-03 16:34:00 -05:00
Shaun Arman
dd06566375 docs: add Custom REST provider documentation
Some checks failed
Release / build-macos-arm64 (push) Has been cancelled
Release / build-linux-amd64 (push) Has been cancelled
Release / build-linux-arm64 (push) Has been cancelled
Release / build-windows-amd64 (push) Has been cancelled
- Added GenAI API User Guide.md with complete API specification
- Added HANDOFF-MSI-GENAI.md documenting custom provider implementation
- Includes API endpoints, request/response formats, available models, and rate limits
2026-04-03 15:45:52 -05:00
Shaun Arman
190084888c feat: add Custom REST provider support
- Extended ProviderConfig with optional custom fields for non-OpenAI APIs
- Added custom_endpoint_path, custom_auth_header, custom_auth_prefix fields
- Added api_format field to distinguish between OpenAI and Custom REST provider formats
- Added session_id field for stateful conversation APIs
- Implemented chat_custom_rest() method in OpenAI provider
- Custom REST provider uses different request format (prompt+sessionId) and response (msg field)
- Updated TypeScript types to match Rust schema
- Added UI controls in Settings/AIProviders for custom provider configuration
- API format selector auto-populates appropriate defaults (OpenAI vs Custom REST provider)
- Backward compatible: existing providers default to OpenAI format
2026-04-03 15:45:42 -05:00
35 changed files with 428 additions and 4854 deletions


@ -1,175 +0,0 @@
# Integration Authentication Guide
## Overview
The TRCAA application supports three integration authentication methods, with automatic fallback between them:
1. **API Tokens** (Manual) - Recommended ✅
2. **OAuth 2.0** - Fully automated (when configured)
3. **Browser Cookies** - Partially working ⚠️
## Authentication Priority
When you ask an AI question, the system attempts authentication in this order:
```
1. Extract cookies from persistent browser window
↓ (if fails)
2. Use stored API token from database
↓ (if fails)
3. Skip that integration and log guidance
```
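The fallback order above can be sketched as a small selection function. This is illustrative only — the enum and function names are not the actual TRCAA types, and the real code resolves cookies and tokens asynchronously from the database:

```rust
// Hypothetical types mirroring the three-step fallback described above.
#[derive(Debug, PartialEq)]
enum AuthMethod {
    BrowserCookies(String),
    ApiToken(String),
    None, // integration is skipped and guidance is logged
}

fn resolve_auth(cookies: Option<String>, stored_token: Option<String>) -> AuthMethod {
    // 1. Prefer cookies extracted from the persistent browser window.
    if let Some(c) = cookies {
        return AuthMethod::BrowserCookies(c);
    }
    // 2. Fall back to an API token stored in the database.
    if let Some(t) = stored_token {
        return AuthMethod::ApiToken(t);
    }
    // 3. Otherwise skip this integration.
    AuthMethod::None
}
```

Note that a stored token is only consulted when cookie extraction fails, which is why HttpOnly cookies (below) push users toward tokens in practice.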
## HttpOnly Cookie Limitation
**Problem**: Confluence, ServiceNow, and Azure DevOps use **HttpOnly cookies** for security. These cookies:
- ✅ Exist in the persistent browser window
- ✅ Are sent automatically by the browser
- ❌ **Cannot be extracted by JavaScript** (security feature)
- ❌ **Cannot be used in separate HTTP requests**
**Impact**: Cookie extraction via the persistent browser window **fails** for HttpOnly cookies, even though you're logged in.
## Recommended Solution: Use API Tokens
### Confluence Personal Access Token
1. Log into Confluence
2. Go to **Profile → Settings → Personal Access Tokens**
3. Click **Create token**
4. Copy the generated token
5. In TRCAA app:
- Go to **Settings → Integrations**
- Find your Confluence integration
- Click **"Save Manual Token"**
- Paste the token
- Token Type: `Bearer`
### ServiceNow API Key
1. Log into ServiceNow
2. Go to **System Security → Application Registry**
3. Click **New → OAuth API endpoint for external clients**
4. Configure and generate API key
5. In TRCAA app:
- Go to **Settings → Integrations**
- Find your ServiceNow integration
- Click **"Save Manual Token"**
- Paste the API key
### Azure DevOps Personal Access Token (PAT)
1. Log into Azure DevOps
2. Click **User Settings (top right) → Personal Access Tokens**
3. Click **New Token**
4. Scopes: Select **Read** for:
- Code (for wiki)
- Work Items (for work item search)
5. Click **Create** and copy the token
6. In TRCAA app:
- Go to **Settings → Integrations**
- Find your Azure DevOps integration
- Click **"Save Manual Token"**
- Paste the token
- Token Type: `Bearer`
## Verification
After adding API tokens, test the integration:
1. Open or create an issue
2. Go to Triage page
3. Ask a question like: "How do I upgrade Vesta NXT to 1.0.12"
4. Check the logs for:
```
INFO Using stored cookies for confluence (count: 1)
INFO Found X integration sources for AI context
```
If successful, the AI response should include:
- Content from internal documentation
- Source citations with URLs
- Links to Confluence/ServiceNow/Azure DevOps pages
## Troubleshooting
### No search results found
**Symptom**: AI gives generic answers instead of internal documentation
**Check logs for**:
```
WARN Unable to search confluence - no authentication available
```
**Solution**: Add an API token (see above)
### Cookie extraction timeout
**Symptom**: Logs show:
```
WARN Failed to extract cookies from confluence: Timeout extracting cookies
```
**Why**: HttpOnly cookies cannot be extracted via JavaScript
**Solution**: Use API tokens instead
### Integration not configured
**Symptom**: No integration searches at all
**Check**: Settings → Integrations - ensure integration is added with:
- Base URL configured
- Either browser window open OR API token saved
## Future Enhancements
### Native Cookie Extraction (Planned)
We plan to implement platform-specific native cookie extraction that can access HttpOnly cookies directly from the webview's cookie store:
- **macOS**: Use WKWebView's HTTPCookieStore (requires `cocoa`/`objc` crates)
- **Windows**: Use WebView2's cookie manager (requires `windows` crate)
- **Linux**: Use WebKitGTK cookie manager (requires `webkit2gtk` binding)
This will make the persistent browser approach fully automatic, even with HttpOnly cookies.
### Webview-Based Search (Experimental)
Another approach is to make search requests FROM within the authenticated webview using JavaScript fetch, which automatically includes HttpOnly cookies. This requires reliable IPC communication between JavaScript and Rust.
## Security Notes
### Token Storage
API tokens are:
- ✅ **Encrypted** using AES-256-GCM before storage
- ✅ **Hashed** (SHA-256) for audit logging
- ✅ Stored in encrypted SQLite database
- ✅ Never exposed to frontend JavaScript
### Cookie Storage (when working)
Extracted cookies are:
- ✅ Encrypted before database storage
- ✅ Only retrieved when making API requests
- ✅ Transmitted only over HTTPS
### Audit Trail
All integration authentication attempts are logged:
- Cookie extraction attempts
- Token usage
- Search requests
- Authentication failures
Check **Settings → Security → Audit Log** to review activity.
## Summary
**For reliable integration search NOW**: Use API tokens (Option 1)
**For automatic integration search LATER**: Native cookie extraction will be implemented in a future update
**Current workaround**: API tokens provide full functionality without browser dependency


@ -1,254 +0,0 @@
# Ticket Summary - Persistent Browser Windows for Integration Authentication
## Description
Implement persistent browser window sessions for integration authentication (Confluence, Azure DevOps, ServiceNow). Browser windows now persist across application restarts, eliminating the need to extract HttpOnly cookies via JavaScript (which fails due to browser security restrictions).
This follows a Playwright-style "piggyback" authentication approach where the browser window maintains its own internal cookie store, allowing the user to log in once and have the session persist indefinitely until they manually close the window.
## Acceptance Criteria
- [x] Integration browser windows persist to database when created
- [x] Browser windows are automatically restored on app startup
- [x] Cookies are maintained automatically by the browser's internal store (no JavaScript extraction of HttpOnly cookies)
- [x] Windows can be manually closed by the user, which removes them from persistence
- [x] Database migration creates `persistent_webviews` table
- [x] Window close events are handled to update database and in-memory tracking
## Work Implemented
### 1. Database Migration for Persistent Webviews
**Files Modified:**
- `src-tauri/src/db/migrations.rs:154-167`
**Changes:**
- Added migration `013_create_persistent_webviews` to create the `persistent_webviews` table
- Table schema includes:
- `id` (TEXT PRIMARY KEY)
- `service` (TEXT with CHECK constraint for 'confluence', 'servicenow', 'azuredevops')
- `webview_label` (TEXT - the Tauri window identifier)
- `base_url` (TEXT - the integration base URL)
- `last_active` (TEXT timestamp, defaults to now)
- `window_x`, `window_y`, `window_width`, `window_height` (INTEGER - for future window position persistence)
- UNIQUE constraint on `service` (one browser window per integration)
### 2. Webview Persistence on Creation
**Files Modified:**
- `src-tauri/src/commands/integrations.rs:531-591`
**Changes:**
- Modified `authenticate_with_webview` command to persist webview state to database after creation
- Stores service name, webview label, and base URL
- Logs persistence operation for debugging
- Sets up window close event handler to remove webview from tracking and database
- Event handler properly clones Arc fields for `'static` lifetime requirement
- Updated success message to inform user that window persists across restarts
### 3. Webview Restoration on App Startup
**Files Modified:**
- `src-tauri/src/commands/integrations.rs:793-865` - Added `restore_persistent_webviews` function
- `src-tauri/src/lib.rs:60-84` - Added `.setup()` hook to call restoration
**Changes:**
- Added `restore_persistent_webviews` async function that:
- Queries `persistent_webviews` table for all saved webviews
- Recreates each webview window by calling `authenticate_with_webview`
- Updates in-memory tracking map
- Removes from database if restoration fails
- Logs all operations for debugging
- Updated `lib.rs` to call restoration in `.setup()` hook:
- Clones Arc fields from `AppState` for `'static` lifetime
- Spawns async task to restore webviews
- Logs warnings if restoration fails
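The restore pass above reduces to a simple loop: try to recreate each saved window, track successes in memory, and collect failures for deletion from the table. The sketch below is pure logic under assumed types — the real `restore_persistent_webviews` is async, reads SQLite, and calls `authenticate_with_webview` (represented here by the `recreate` closure):

```rust
// Hypothetical row type; the real table also stores a webview label and geometry.
struct SavedWebview {
    service: String,
    base_url: String,
}

/// Returns (services restored into the in-memory map, services whose DB rows
/// should be removed because recreation failed).
fn restore_all(
    saved: Vec<SavedWebview>,
    mut recreate: impl FnMut(&str, &str) -> Result<(), String>,
) -> (Vec<String>, Vec<String>) {
    let mut restored = Vec::new();
    let mut failed = Vec::new();
    for w in saved {
        match recreate(&w.service, &w.base_url) {
            Ok(()) => restored.push(w.service),  // update in-memory tracking
            Err(_) => failed.push(w.service),    // stale row: delete from DB
        }
    }
    (restored, failed)
}
```

Separating "restored" from "failed" keeps the database consistent even when only some windows come back (e.g. after a base URL becomes unreachable).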
### 4. Window Close Event Handling
**Files Modified:**
- `src-tauri/src/commands/integrations.rs:559-591`
**Changes:**
- Added `on_window_event` listener to detect window close events
- On `CloseRequested` event:
- Spawns async task to clean up
- Removes service from in-memory `integration_webviews` map
- Deletes entry from `persistent_webviews` database table
- Logs all cleanup operations
- Properly handles Arc cloning to avoid lifetime issues in spawned task
### 5. Removed Auto-Close Behavior
**Files Modified:**
- `src-tauri/src/commands/integrations.rs:606-618`
**Changes:**
- Removed automatic window closing in `extract_cookies_from_webview`
- Windows now stay open after cookie extraction
- Updated success message to inform user that window persists for future use
### 6. Frontend UI Update - Removed "Complete Login" Button
**Files Modified:**
- `src/pages/Settings/Integrations.tsx:371-409` - Updated webview authentication UI
- `src/pages/Settings/Integrations.tsx:140-165` - Simplified `handleConnectWebview`
- `src/pages/Settings/Integrations.tsx:167-200` - Removed `handleCompleteWebviewLogin` function
- `src/pages/Settings/Integrations.tsx:16-26` - Removed unused `extractCookiesFromWebviewCmd` import
- `src/pages/Settings/Integrations.tsx:670-677` - Updated authentication method comparison text
**Changes:**
- Removed "Complete Login" button that tried to extract cookies via JavaScript
- Updated UI to show success message when browser opens, explaining persistence
- Removed confusing two-step flow (open browser → complete login)
- New flow: click "Open Browser" → log in → leave window open (that's it!)
- Updated description text to explain persistent window behavior
- Mark integration as "connected" immediately when browser opens
- Removed unused function and import for cookie extraction
### 7. Unused Import Cleanup
**Files Modified:**
- `src-tauri/src/integrations/webview_auth.rs:2`
- `src-tauri/src/lib.rs:13` - Added `use tauri::Manager;`
**Changes:**
- Removed unused `Listener` import from webview_auth.rs
- Added `Manager` trait import to lib.rs for `.state()` method
## Testing Needed
### Manual Testing
1. **Initial Browser Window Creation**
- [ ] Navigate to Settings > Integrations
- [ ] Configure a Confluence integration with base URL
- [ ] Click "Open Browser" button
- [ ] Verify browser window opens with Confluence login page
- [ ] Complete login in the browser window
- [ ] Verify window stays open after login
2. **Window Persistence Across Restarts**
- [ ] With Confluence browser window open, close the main application
- [ ] Relaunch the application
- [ ] Verify Confluence browser window is automatically restored
- [ ] Verify you are still logged in (cookies maintained)
- [ ] Navigate to different pages in Confluence to verify session works
3. **Manual Window Close**
- [ ] With browser window open, manually close it (X button)
- [ ] Restart the application
- [ ] Verify browser window does NOT reopen (removed from persistence)
4. **Database Verification**
- [ ] Open database: `sqlite3 ~/Library/Application\ Support/trcaa/data.db`
- [ ] Run: `SELECT * FROM persistent_webviews;`
- [ ] Verify entry exists when window is open
- [ ] Close window and verify entry is removed
5. **Multiple Integration Windows**
- [ ] Open browser window for Confluence
- [ ] Open browser window for Azure DevOps
- [ ] Restart application
- [ ] Verify both windows are restored
- [ ] Close one window
- [ ] Verify only one is removed from database
- [ ] Restart and verify remaining window still restores
6. **Cookie Persistence (No HttpOnly Extraction Needed)**
- [ ] Log into Confluence browser window
- [ ] Close main application
- [ ] Relaunch application
- [ ] Navigate to a Confluence page that requires authentication
- [ ] Verify you are still logged in (cookies maintained by browser)
### Automated Testing
```bash
# Type checking
npx tsc --noEmit
# Rust compilation
cargo check --manifest-path src-tauri/Cargo.toml
# Rust tests
cargo test --manifest-path src-tauri/Cargo.toml
# Rust linting
cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings
```
### Edge Cases to Test
- Application crash while browser window is open (verify restoration on next launch)
- Database corruption (verify graceful handling of restore failures)
- Window already exists when trying to create duplicate (verify existing window is focused)
- Network connectivity lost during window restoration (verify error handling)
- Multiple rapid window open/close cycles (verify database consistency)
## Architecture Notes
### Design Decision: Persistent Windows vs Cookie Extraction
**Problem:** HttpOnly cookies cannot be accessed via JavaScript (`document.cookie`), which broke the original cookie extraction approach for Confluence and other services.
**Solution:** Instead of extracting cookies, keep the browser window alive across app restarts:
- Browser maintains its own internal cookie store (includes HttpOnly cookies)
- Cookies are automatically sent with all HTTP requests from the browser
- No need for JavaScript extraction or manual token management
- Matches Playwright's approach of persistent browser contexts
### Lifecycle Flow
1. **Window Creation:** User clicks "Open Browser" → `authenticate_with_webview` creates window → State saved to database
2. **App Running:** Window stays open, user can browse freely, cookies maintained by browser
3. **Window Close:** User closes window → Event handler removes from database and memory
4. **App Restart:** `restore_persistent_webviews` queries database → Recreates all windows → Windows resume with original cookies
### Database Schema
```sql
CREATE TABLE persistent_webviews (
    id TEXT PRIMARY KEY,
    service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
    webview_label TEXT NOT NULL,
    base_url TEXT NOT NULL,
    last_active TEXT NOT NULL DEFAULT (datetime('now')),
    window_x INTEGER,
    window_y INTEGER,
    window_width INTEGER,
    window_height INTEGER,
    UNIQUE(service)
);
```
### Future Enhancements
- [ ] Save and restore window position/size (columns already exist in schema)
- [ ] Add "last_active" timestamp updates on window focus events
- [ ] Implement "Close All Windows" command for cleanup
- [ ] Add visual indicator in main UI showing which integrations have active browser windows
- [ ] Implement session timeout logic (close windows after X days of inactivity)
## Related Files
- `src-tauri/src/db/migrations.rs` - Database schema migration
- `src-tauri/src/commands/integrations.rs` - Webview persistence and restoration logic
- `src-tauri/src/integrations/webview_auth.rs` - Browser window creation
- `src-tauri/src/lib.rs` - App startup hook for restoration
- `src-tauri/src/state.rs` - AppState structure with `integration_webviews` map
## Security Considerations
- Cookie storage remains in the browser's internal secure store (not extracted to database)
- Database only stores window metadata (service, label, URL)
- No credential information persisted beyond what the browser already maintains
- Audit log still tracks all integration API calls separately
## Migration Path
Users upgrading to this version will:
1. See new database migration `013_create_persistent_webviews` applied automatically
2. Existing integrations continue to work (migration is additive only)
3. First time opening a browser window will persist it for future sessions
4. No manual action required from users


@ -1,536 +1,134 @@
# Ticket Summary - Integration Search + AI Tool-Calling Implementation
# Ticket Summary - UI Fixes and Audit Log Enhancement
## Description
This ticket implements Confluence, ServiceNow, and Azure DevOps as primary data sources for AI queries. When users ask questions in the AI chat, the system now searches these internal documentation sources first and injects the results as context before sending the query to the AI provider. This ensures the AI prioritizes internal company documentation over general knowledge.
This ticket addresses multiple UI and functionality issues reported in the tftsr-devops_investigation application:
**User Requirement:** "using Confluence as the initial data source was a key requirement. The same for ServiceNow and ADO"
**Example Use Case:** When asking "How do I upgrade Vesta NXT to 1.0.12", the AI should return the Confluence documentation link or content from internal wiki pages, rather than generic upgrade instructions.
### AI Tool-Calling Implementation
This ticket also implements AI function calling (tool calling) to allow AI to automatically execute actions like adding comments to Azure DevOps tickets. When the AI determines it should perform an action (rather than just respond with text), it can call defined tools/functions and the system will execute them, returning results to the AI for further processing.
**User Requirement:** "using the AI integration, I wanted to be able to ask it to put a comment in an ADO ticket and have it pull the data from the integration search and then post a comment in the ticket"
**Example Use Case:** When asking "Add a comment to ADO ticket 758421 with the test results", the AI should automatically call the `add_ado_comment` tool with the appropriate parameters, execute the action, and confirm completion.
1. **Download Icons Visibility**: Download icons (PDF, DOCX) in RCA and Post-Mortem pages were not visible in dark theme
2. **Export File System Error**: "Read-only file system (os error 30)" error when attempting to export documents
3. **History Search Button**: Search button not visible in the History page
4. **Domain Filtering**: Domain-only filtering not working in History page
5. **Audit Log Enhancement**: Audit log showed only internal IDs, lacking actual transmitted data for security auditing
## Acceptance Criteria
- [x] Confluence search integration retrieves wiki pages matching user queries
- [x] ServiceNow search integration retrieves knowledge base articles and related incidents
- [x] Azure DevOps search integration retrieves wiki pages and work items
- [x] Integration searches execute in parallel for performance
- [x] Search results are injected as system context before AI queries
- [x] AI responses include source citations with URLs from internal documentation
- [x] System uses persistent browser cookies from authenticated sessions
- [x] Graceful fallback when integration sources are unavailable
- [x] All searches complete successfully without compilation errors
- [x] AI tool-calling architecture implemented with Provider trait support
- [x] Tool definitions created for available actions (add_ado_comment)
- [x] Tool execution loop implemented in chat_message command
- [x] OpenAI-compatible providers support tool-calling
- [x] MSI GenAI custom REST provider supports tool-calling
- [ ] Tool-calling tested with MSI GenAI provider (pending user testing)
- [ ] AI successfully executes add_ado_comment when requested
- [ ] Download icons are visible in both light and dark themes on RCA and Post-Mortem pages
- [ ] Documents can be exported successfully to Downloads directory without filesystem errors
- [ ] Search button is visible with proper styling in History page
- [ ] Domain filter works independently without requiring a search query
- [ ] Audit log displays full transmitted data including:
- AI chat messages with provider details, user message, and response preview
- Document generation with content preview and metadata
- All entries show properly formatted JSON with details
## Work Implemented
### 1. Confluence Search Module
**Files Created:**
- `src-tauri/src/integrations/confluence_search.rs` (173 lines)
**Implementation:**
```rust
pub async fn search_confluence(
    base_url: &str,
    query: &str,
    cookies: &[Cookie],
) -> Result<Vec<SearchResult>, String>
```
**Features:**
- Uses Confluence CQL (Confluence Query Language) search API
- Searches text content across all wiki pages
- Fetches full page content via `/rest/api/content/{id}?expand=body.storage`
- Strips HTML tags from content for clean AI context
- Returns top 3 most relevant results
- Truncates content to 3000 characters for AI context window
- Includes title, URL, excerpt, and full content in results
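The two content-cleaning steps listed above (strip HTML, cap length) can be sketched as below. This is an assumption about the module's internals, not its actual code; `3000` matches the limit stated in this summary:

```rust
/// Remove HTML tags by skipping everything between '<' and '>'.
/// (Simplified: ignores entities, comments, and script bodies.)
fn strip_html(input: &str) -> String {
    let mut out = String::new();
    let mut in_tag = false;
    for ch in input.chars() {
        match ch {
            '<' => in_tag = true,
            '>' => in_tag = false,
            c if !in_tag => out.push(c),
            _ => {}
        }
    }
    out
}

/// Cap cleaned text to the AI context budget (3000 chars per result here).
fn truncate_for_context(text: &str, max_chars: usize) -> String {
    text.chars().take(max_chars).collect()
}
```

Truncating by `chars()` rather than bytes avoids panicking on multi-byte UTF-8 in wiki content.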
### 2. ServiceNow Search Module
**Files Created:**
- `src-tauri/src/integrations/servicenow_search.rs` (181 lines)
**Implementation:**
```rust
pub async fn search_servicenow(
    instance_url: &str,
    query: &str,
    cookies: &[Cookie],
) -> Result<Vec<SearchResult>, String>

pub async fn search_incidents(
    instance_url: &str,
    query: &str,
    cookies: &[Cookie],
) -> Result<Vec<SearchResult>, String>
```
**Features:**
- Searches Knowledge Base articles via `/api/now/table/kb_knowledge`
- Searches incidents via `/api/now/table/incident`
- Uses ServiceNow query language with `LIKE` operators
- Returns article text and incident descriptions/resolutions
- Includes incident numbers and states in results
- Top 3 knowledge base articles + top 3 incidents
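The `LIKE`-based queries mentioned above would be passed as ServiceNow encoded queries (`sysparm_query`) to the Table API. A hedged sketch of the query strings — the field choices here are an assumption; the actual module may search different columns:

```rust
// Encoded query for /api/now/table/kb_knowledge: match the term in either
// the short description or the article text (^OR joins alternatives).
fn kb_query(term: &str) -> String {
    format!("short_descriptionLIKE{t}^ORtextLIKE{t}", t = term)
}

// Encoded query for /api/now/table/incident.
fn incident_query(term: &str) -> String {
    format!("short_descriptionLIKE{t}^ORdescriptionLIKE{t}", t = term)
}
```

In a real request the term would also need URL-encoding before being placed in the `sysparm_query` parameter.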
### 3. Azure DevOps Search Module
**Files Created:**
- `src-tauri/src/integrations/azuredevops_search.rs` (274 lines)
**Implementation:**
```rust
pub async fn search_wiki(
    org_url: &str,
    project: &str,
    query: &str,
    cookies: &[Cookie],
) -> Result<Vec<SearchResult>, String>

pub async fn search_work_items(
    org_url: &str,
    project: &str,
    query: &str,
    cookies: &[Cookie],
) -> Result<Vec<SearchResult>, String>
```
**Features:**
- Uses Azure DevOps Search API for wiki search
- Uses WIQL (Work Item Query Language) for work item search
- Fetches full wiki page content via `/api/wiki/wikis/{id}/pages`
- Retrieves work item details including descriptions and states
- Project-scoped searches for better relevance
- Returns top 3 wiki pages + top 3 work items
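A WIQL query for the project-scoped work item search described above might look like the following. This is a sketch of the general shape, not the module's actual query; quoting and escaping of the inputs are simplified:

```rust
// Build a project-scoped WIQL query matching the term in work item titles.
// The trailing backslashes join the literal into one line at compile time.
fn work_item_wiql(project: &str, term: &str) -> String {
    format!(
        "SELECT [System.Id], [System.Title], [System.State] \
         FROM WorkItems \
         WHERE [System.TeamProject] = '{}' \
         AND [System.Title] CONTAINS '{}' \
         ORDER BY [System.ChangedDate] DESC",
        project, term
    )
}
```

The string is POSTed to the `wiql` endpoint, and the returned IDs are then expanded into full work item details.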
### 4. AI Command Integration
### 1. Download Icons Visibility Fix
**Files Modified:**
- `src-tauri/src/commands/ai.rs:377-511` (Added `search_integration_sources` function)
**Implementation:**
```rust
async fn search_integration_sources(
    query: &str,
    app_handle: &tauri::AppHandle,
    state: &State<'_, AppState>,
) -> String
```
**Features:**
- Queries database for all configured integrations
- Retrieves persistent browser cookies for each integration
- Spawns parallel tokio tasks for each integration search
- Aggregates results from all sources
- Formats results as AI context with source metadata
- Returns formatted context string for injection into AI prompts
**Context Injection:**
```rust
if !integration_context.is_empty() {
    let context_message = Message {
        role: "system".into(),
        content: format!(
            "INTERNAL DOCUMENTATION SOURCES:\n\n{}\n\n\
             Instructions: The above content is from internal company \
             documentation systems (Confluence, ServiceNow, Azure DevOps). \
             You MUST prioritize this information when answering. Include \
             source citations with URLs in your response. Only use general \
             knowledge if the internal documentation doesn't cover the question.",
            integration_context
        ),
    };
    messages.push(context_message);
}
```
### 5. AI Tool-Calling Architecture
**Files Created/Modified:**
- `src-tauri/src/ai/tools.rs` (43 lines) - NEW FILE
- `src-tauri/src/ai/mod.rs:34-68` (Added tool-calling data structures)
- `src-tauri/src/ai/provider.rs:16` (Added tools parameter to Provider trait)
- `src-tauri/src/ai/openai.rs:89-113, 137-157, 257-376` (Tool-calling for OpenAI and MSI GenAI)
- `src-tauri/src/commands/ai.rs:60-98, 126-167` (Tool execution and chat loop)
- `src-tauri/src/commands/integrations.rs:85-121` (add_ado_comment command)
**Implementation:**
**Tool Definitions (`src-tauri/src/ai/tools.rs`):**
```rust
pub fn get_available_tools() -> Vec<Tool> {
    vec![get_add_ado_comment_tool()]
}

fn get_add_ado_comment_tool() -> Tool {
    Tool {
        name: "add_ado_comment".to_string(),
        description: "Add a comment to an Azure DevOps work item".to_string(),
        parameters: ToolParameters {
            param_type: "object".to_string(),
            // Abbreviated here: JSON-schema properties declaring
            // `work_item_id` (integer) and `comment_text` (string).
            properties: { "work_item_id": integer, "comment_text": string },
            required: vec!["work_item_id", "comment_text"],
        },
    }
}
```
**Data Structures (`src-tauri/src/ai/mod.rs`):**
```rust
pub struct ToolCall {
    pub id: String,
    pub name: String,
    pub arguments: String, // JSON string
}

pub struct Message {
    pub role: String,
    pub content: String,
    pub tool_call_id: Option<String>,
    pub tool_calls: Option<Vec<ToolCall>>,
}

pub struct ChatResponse {
    pub content: String,
    pub model: String,
    pub usage: Option<TokenUsage>,
    pub tool_calls: Option<Vec<ToolCall>>,
}
```
**OpenAI Provider (`src-tauri/src/ai/openai.rs`):**
- Sends tools in OpenAI format: `{"type": "function", "function": {...}}`
- Parses `tool_calls` array from response
- Sets `tool_choice: "auto"` to enable automatic tool selection
- Works with OpenAI, Azure OpenAI, and compatible APIs
**MSI GenAI Provider (`src-tauri/src/ai/openai.rs::chat_custom_rest`):**
- Sends tools in OpenAI-compatible format (MSI GenAI standard)
- Adds `tools` and `tool_choice` fields to request body
- Parses multiple response formats:
- OpenAI format: `tool_calls[].function.name/arguments`
- Simpler format: `tool_calls[].name/arguments`
- Alternative field names: `toolCalls`, `function_calls`
- Enhanced logging for debugging tool call responses
- Generates tool call IDs if not provided by API
**Tool Executor (`src-tauri/src/commands/ai.rs`):**
```rust
async fn execute_tool_call(
    tool_call: &crate::ai::ToolCall,
    app_handle: &tauri::AppHandle,
    app_state: &State<'_, AppState>,
) -> Result<String, String> {
    match tool_call.name.as_str() {
        "add_ado_comment" => {
            let args: serde_json::Value = serde_json::from_str(&tool_call.arguments)
                .map_err(|e| format!("Invalid tool arguments: {}", e))?;
            let work_item_id = args
                .get("work_item_id")
                .and_then(|v| v.as_i64())
                .ok_or("Missing work_item_id")?;
            let comment_text = args
                .get("comment_text")
                .and_then(|v| v.as_str())
                .ok_or("Missing comment_text")?;
            crate::commands::integrations::add_ado_comment(
                work_item_id,
                comment_text.to_string(),
                app_handle.clone(),
                app_state.clone(),
            )
            .await
        }
        _ => Err(format!("Unknown tool: {}", tool_call.name)),
    }
}
```
**Chat Loop with Tool-Calling (`src-tauri/src/commands/ai.rs::chat_message`):**
```rust
let tools = Some(crate::ai::tools::get_available_tools());
let max_iterations = 10;
let mut iteration = 0;

loop {
    iteration += 1;
    if iteration > max_iterations {
        return Err("Tool-calling loop exceeded maximum iterations".to_string());
    }

    let response = provider.chat(messages.clone(), &provider_config, tools.clone()).await?;

    // Check if AI wants to call any tools
    if let Some(tool_calls) = &response.tool_calls {
        for tool_call in tool_calls {
            // Execute the tool
            let tool_result = execute_tool_call(tool_call, &app_handle, &state).await;
            let result_content = match tool_result {
                Ok(result) => result,
                Err(e) => format!("Error executing tool: {}", e),
            };

            // Add tool result to conversation
            messages.push(Message {
                role: "tool".into(),
                content: result_content,
                tool_call_id: Some(tool_call.id.clone()),
                tool_calls: None,
            });
        }
        continue; // Loop back to get AI's next response
    }

    // No more tool calls - return final response
    final_response = response;
    break;
}
```
**Features:**
- Iterative tool-calling loop (up to 10 iterations)
- AI can call multiple tools in sequence
- Tool results injected back into conversation
- Error handling for invalid tool calls
- Support for both OpenAI and MSI GenAI providers
- Extensible architecture for adding new tools
**Provider Compatibility:**
All AI providers updated to support tools parameter:
- `src-tauri/src/ai/anthropic.rs` - Added `_tools` parameter (not yet implemented)
- `src-tauri/src/ai/gemini.rs` - Added `_tools` parameter (not yet implemented)
- `src-tauri/src/ai/mistral.rs` - Added `_tools` parameter (not yet implemented)
- `src-tauri/src/ai/ollama.rs` - Added `_tools` parameter (not yet implemented)
- `src-tauri/src/ai/openai.rs` - **Fully implemented** for OpenAI and MSI GenAI
Note: the other providers are prepared for future tool-calling support but currently ignore the tools parameter. Only OpenAI-compatible providers and MSI GenAI have an active tool-calling implementation.
### 7. Module Integration
**Files Modified:**
- `src-tauri/src/integrations/mod.rs:1-10` (Added search module exports)
- `src-tauri/src/ai/mod.rs:10` (Added tools export)
- `src/components/DocEditor.tsx:60-67`
**Changes:**
```rust
// integrations/mod.rs
pub mod confluence_search;
pub mod servicenow_search;
pub mod azuredevops_search;

// ai/mod.rs
pub use tools::*;
```
- Added `text-foreground` class to Download icons for PDF and DOCX buttons
- Ensures icons inherit the current theme's foreground color for visibility
### 8. Test Fixes
### 2. Export File System Error Fix
**Files Modified:**
- `src-tauri/src/integrations/confluence_search.rs:178-185` (Fixed test assertions)
- `src-tauri/src/integrations/azuredevops_search.rs:1` (Removed unused imports)
- `src-tauri/src/integrations/servicenow_search.rs:1` (Removed unused imports)
- `src-tauri/Cargo.toml:38` - Added `dirs = "5"` dependency
- `src-tauri/src/commands/docs.rs:127-170` - Rewrote `export_document` function
- `src/pages/RCA/index.tsx:53-60` - Updated error handling and user feedback
- `src/pages/Postmortem/index.tsx:52-59` - Updated error handling and user feedback
## Architecture
**Changes:**
- Modified `export_document` to use Downloads directory by default instead of "."
- Falls back to `app_data_dir/exports` if Downloads directory unavailable
- Added proper directory creation with error handling
- Updated frontend to show success message with file path
- Empty `output_dir` parameter now triggers default behavior
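The fallback order described above can be sketched with the standard library alone (the real implementation resolves the Downloads folder via the `dirs` crate; `resolve_export_dir` and the fallback path below are illustrative names, not the project's code):

```rust
use std::fs;
use std::path::PathBuf;

// Pick the export directory: an explicit output_dir wins, an empty one
// triggers the default/fallback location; create it if missing.
fn resolve_export_dir(output_dir: &str, fallback: PathBuf) -> std::io::Result<PathBuf> {
    let dir = if output_dir.is_empty() {
        fallback // e.g. Downloads, or app_data_dir/exports if unavailable
    } else {
        PathBuf::from(output_dir)
    };
    fs::create_dir_all(&dir)?; // proper directory creation with error handling
    Ok(dir)
}

fn main() -> std::io::Result<()> {
    let fallback = std::env::temp_dir().join("trcaa-exports");
    let dir = resolve_export_dir("", fallback)?;
    assert!(dir.is_dir());
    println!("exporting to {}", dir.display());
    Ok(())
}
```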
### Search Flow
### 3. Search Button Visibility Fix
**Files Modified:**
- `src/pages/History/index.tsx:124-127`
```
User asks question in AI chat
chat_message() command called
search_integration_sources() executed
Query database for integration configs
Get fresh cookies from persistent browsers
Spawn parallel search tasks:
- Confluence CQL search
- ServiceNow KB + incident search
- Azure DevOps wiki + work item search
Wait for all tasks to complete
Format results with source citations
Inject as system message in AI context
Send to AI provider with context
AI responds with source-aware answer
```
**Changes:**
- Changed button from `variant="outline"` to default variant
- Added Search icon to button for better visibility
- Button now has proper contrast in both themes
### Tool-Calling Flow
### 4. Domain-Only Filtering Fix
**Files Modified:**
- `src-tauri/src/commands/db.rs:305-312`
```
User asks AI to perform action (e.g., "Add comment to ticket 758421")
chat_message() command called
Get available tools (add_ado_comment)
Send message + tools to AI provider
AI decides to call tool → returns ToolCall in response
execute_tool_call() dispatches to appropriate handler
add_ado_comment() retrieves ADO config from DB
Gets fresh cookies from persistent ADO browser
Calls webview_fetch to POST comment via ADO API
Tool result returned as Message with role="tool"
Send updated conversation back to AI
AI processes result and responds to user
User sees confirmation: "I've successfully added the comment"
```
**Changes:**
- Added missing `filter.domain` handling in `list_issues` function
- Domain filter now properly filters by `i.category` field
- The filter now works independently of the search query
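The fix can be illustrated with a small query-building sketch where the domain clause is appended independently of the search text (function and table names are illustrative; only `i.category` comes from the actual change):

```rust
// Build the WHERE clause with each filter applied independently.
fn build_issue_query(search: Option<&str>, domain: Option<&str>) -> (String, Vec<String>) {
    let mut sql = String::from("SELECT i.id, i.title FROM issues i WHERE 1=1");
    let mut params = Vec::new();
    if let Some(q) = search {
        sql.push_str(" AND i.title LIKE ?");
        params.push(format!("%{q}%"));
    }
    // The bug was effectively this branch depending on a search query being
    // present; the domain filter must be appended unconditionally.
    if let Some(d) = domain {
        sql.push_str(" AND i.category = ?");
        params.push(d.to_string());
    }
    (sql, params)
}

fn main() {
    // Domain-only filtering: no search text, category clause still present
    let (sql, params) = build_issue_query(None, Some("Linux"));
    assert!(sql.contains("i.category = ?"));
    assert_eq!(params, vec!["Linux".to_string()]);
    println!("{sql}");
}
```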
**Multi-Tool Support:**
- AI can call multiple tools in sequence
- Each tool result is added to conversation history
- Loop continues until AI provides final text response
- Maximum 10 iterations to prevent infinite loops
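The bounded loop above can be sketched with a mock provider standing in for the real AI call (all names here are illustrative, not the project's actual API):

```rust
struct Reply {
    content: String,
    tool_call: Option<String>, // tool the AI wants to run, if any
}

// Pretend the AI requests a tool on the first two turns, then answers.
fn mock_provider(turn: usize) -> Reply {
    if turn < 2 {
        Reply { content: String::new(), tool_call: Some("add_ado_comment".into()) }
    } else {
        Reply { content: "Comment added.".into(), tool_call: None }
    }
}

const MAX_ITERATIONS: usize = 10; // prevents infinite tool loops

fn run_tool_loop() -> (String, usize) {
    let mut conversation = vec!["user: add a comment".to_string()];
    let mut final_answer = String::new();
    for turn in 0..MAX_ITERATIONS {
        let reply = mock_provider(turn);
        match reply.tool_call {
            // Execute the tool and feed the result back into the conversation
            Some(tool) => conversation.push(format!("tool: result of {tool}")),
            None => {
                final_answer = reply.content;
                break; // AI produced a final text response
            }
        }
    }
    (final_answer, conversation.len())
}

fn main() {
    let (answer, len) = run_tool_loop();
    assert_eq!(answer, "Comment added.");
    assert_eq!(len, 3); // user turn + two tool results
    println!("{answer}");
}
```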
### 5. Audit Log Enhancement
**Files Modified:**
- `src-tauri/src/commands/ai.rs:242-266` - Enhanced AI chat audit logging
- `src-tauri/src/commands/docs.rs:44-73` - Enhanced RCA generation audit logging
- `src-tauri/src/commands/docs.rs:90-119` - Enhanced postmortem generation audit logging
- `src/pages/Settings/Security.tsx:191-206` - Enhanced audit log display
**Error Handling:**
- Invalid tool calls return error message to AI
- AI can retry with corrected parameters
- Missing arguments caught and reported
- Unknown tool names return error
### Database Query
Integration configurations are queried from the `integration_config` table:
```sql
SELECT service, base_url, username, project_name, space_key
FROM integration_config
```
This provides:
- `service`: "confluence", "servicenow", or "azuredevops"
- `base_url`: Integration instance URL
- `project_name`: For Azure DevOps project scoping
- `space_key`: For future Confluence space scoping
### Cookie Management
Persistent browser windows maintain authenticated sessions. The `get_fresh_cookies_from_webview()` function retrieves current cookies from the browser window, ensuring authentication remains valid across sessions.
### Parallel Execution
All integration searches execute in parallel using `tokio::spawn()`:
```rust
for config in configs {
    let cookies_result = get_fresh_cookies_from_webview(&config.service, ...).await;
    if let Ok(Some(cookies)) = cookies_result {
        match config.service.as_str() {
            "confluence" => {
                search_tasks.push(tokio::spawn(async move {
                    confluence_search::search_confluence(...).await
                        .unwrap_or_default()
                }));
            }
            // ... other integrations
        }
    }
}

// Wait for all searches
for task in search_tasks {
    if let Ok(results) = task.await {
        all_results.extend(results);
    }
}
```
### Error Handling
- Database lock failures return empty context (non-blocking)
- SQL query errors return empty context (non-blocking)
- Missing cookies skip that integration (non-blocking)
- Failed search requests return empty results (non-blocking)
- All errors are logged via `tracing::warn!`
- AI query proceeds with whatever context is available
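The non-blocking policy amounts to per-source `Result` handling, where a failure is logged and skipped rather than propagated (a sketch with illustrative names):

```rust
// Simulated per-source search; "servicenow" fails to stand in for a
// missing browser session.
fn search_source(name: &str) -> Result<Vec<String>, String> {
    if name == "servicenow" {
        Err("no cookies available".into())
    } else {
        Ok(vec![format!("{name}: result")])
    }
}

fn build_context() -> Vec<String> {
    let mut context = Vec::new();
    for source in ["confluence", "servicenow", "azuredevops"] {
        match search_source(source) {
            Ok(results) => context.extend(results),
            // The real code logs via tracing::warn! and continues
            Err(e) => eprintln!("warn: {source} skipped: {e}"),
        }
    }
    context
}

fn main() {
    let context = build_context();
    // The AI query proceeds with whatever context is available
    assert_eq!(context.len(), 2);
    println!("{context:?}");
}
```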
**Changes:**
- AI chat audit now captures:
- Provider name, model, and API URL
- Full user message
- Response preview (first 200 chars)
- Token count
- Document generation audit now captures:
- Issue ID and title
- Document type and title
- Content length and preview (first 300 chars)
- Security page now displays:
- Pretty-printed JSON with proper formatting
- Entry ID and entity type below the data
- Better layout with whitespace handling
## Testing Needed
### Manual Testing
1. **Confluence Integration**
- [ ] Configure Confluence integration with valid base URL
- [ ] Open persistent browser and log into Confluence
- [ ] Create a test issue and ask: "How do I upgrade Vesta NXT to 1.0.12"
- [ ] Verify AI response includes Confluence wiki content
- [ ] Verify response includes source URL
- [ ] Check logs for "Found X integration sources for AI context"
1. **Download Icons Visibility**
- [ ] Open RCA page in light theme
- [ ] Verify PDF and DOCX download icons are visible
- [ ] Switch to dark theme
- [ ] Verify PDF and DOCX download icons are still visible
2. **ServiceNow Integration**
- [ ] Configure ServiceNow integration with valid instance URL
- [ ] Open persistent browser and log into ServiceNow
- [ ] Ask question related to known KB article
- [ ] Verify AI response includes ServiceNow KB content
- [ ] Ask about known incident patterns
- [ ] Verify AI response includes incident information
2. **Export Functionality**
- [ ] Generate an RCA document
- [ ] Click "PDF" export button
- [ ] Verify file is created in Downloads directory
- [ ] Verify success message displays with file path
- [ ] Check file opens correctly
- [ ] Repeat for "MD" and "DOCX" formats
- [ ] Test on Post-Mortem page as well
3. **Azure DevOps Integration**
- [ ] Configure Azure DevOps integration with org URL and project
- [ ] Open persistent browser and log into Azure DevOps
- [ ] Ask question about documented features in ADO wiki
- [ ] Verify AI response includes ADO wiki content
- [ ] Ask about known work items
- [ ] Verify AI response includes work item details
3. **History Search Button**
- [ ] Navigate to History page
- [ ] Verify Search button is visible
- [ ] Verify button has search icon
- [ ] Test button in both light and dark themes
4. **Parallel Search Performance**
- [ ] Configure all three integrations
- [ ] Authenticate all three browsers
- [ ] Ask a question that matches content in all sources
- [ ] Verify results from multiple sources appear
- [ ] Check logs to confirm parallel execution
- [ ] Measure response time (should be <5s for all searches)
4. **Domain Filtering**
- [ ] Navigate to History page
- [ ] Select a domain from dropdown (e.g., "Linux")
- [ ] Do NOT enter any search text
- [ ] Verify issues are filtered by selected domain
- [ ] Change domain selection
- [ ] Verify filtering updates correctly
5. **Graceful Degradation**
- [ ] Test with only Confluence configured
- [ ] Verify AI still works with single source
- [ ] Test with no integrations configured
- [ ] Verify AI still works with general knowledge
- [ ] Test with integration browser closed
- [ ] Verify AI continues with available sources
6. **AI Tool-Calling with MSI GenAI**
- [ ] Configure MSI GenAI as active AI provider
- [ ] Configure Azure DevOps integration and authenticate
- [ ] Create test issue and start triage conversation
- [ ] Ask: "Add a comment to ADO ticket 758421 saying 'This is a test'"
- [ ] Verify AI calls add_ado_comment tool (check logs for "MSI GenAI: Parsed tool call")
- [ ] Verify comment appears in ADO ticket 758421
- [ ] Verify AI confirms action was completed
- [ ] Test with invalid ticket number (e.g., 99999999)
- [ ] Verify AI reports error gracefully
7. **AI Tool-Calling with OpenAI**
- [ ] Configure OpenAI or Azure OpenAI as active provider
- [ ] Repeat tool-calling tests from section 6
- [ ] Verify tool-calling works with OpenAI-compatible providers
- [ ] Test multi-tool scenario: "Add comment to 758421 and then another to 758422"
- [ ] Verify AI calls tool multiple times in sequence
8. **Tool-Calling Error Handling**
- [ ] Test with ADO browser closed (no cookies available)
- [ ] Verify AI reports authentication error
- [ ] Test with invalid work item ID format (non-integer)
- [ ] Verify error caught in tool executor
- [ ] Test with missing ADO configuration
- [ ] Verify graceful error message to user
5. **Audit Log**
- [ ] Perform an AI chat interaction
- [ ] Navigate to Settings > Security > Audit Log
- [ ] Click "View" on a recent entry
- [ ] Verify transmitted data shows:
- Provider details
- User message
- Response preview
- [ ] Generate an RCA or Post-Mortem
- [ ] Check audit log for document generation entry
- [ ] Verify content preview and metadata are visible
### Automated Testing
@@ -538,183 +136,20 @@ for task in search_tasks {
# Type checking
npx tsc --noEmit
# Rust compilation check
cargo check --manifest-path src-tauri/Cargo.toml
# Run all tests
cargo test --manifest-path src-tauri/Cargo.toml
# Build debug version
cargo tauri build --debug
# Run linter
cargo clippy --manifest-path src-tauri/Cargo.toml -- -D warnings

# Frontend tests (if applicable)
npm run test:run
```
### Test Results
All tests passing:
```
test result: ok. 130 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
```
### Edge Cases to Test
- [ ] Query with no matching content in any source
- [ ] Query matching content in all three sources (verify aggregation)
- [ ] Very long query strings (>1000 characters)
- [ ] Special characters in queries (quotes, brackets, etc.)
- [ ] Integration returns >3 results (verify truncation)
- [ ] Integration returns very large content (verify 3000 char limit)
- [ ] Multiple persistent browsers for same integration
- [ ] Cookie expiration during search
- [ ] Network timeout during search
- [ ] Integration API version changes
- [ ] HTML content with complex nested tags
- [ ] Unicode content in search results
- [ ] AI calling same tool multiple times in one response
- [ ] Tool returning very large result (>10k characters)
- [ ] Tool execution timeout (slow API response)
- [ ] AI calling non-existent tool name
- [ ] Tool call with malformed JSON arguments
- [ ] Reaching max iteration limit (10 tool calls in sequence)
## Performance Considerations
### Content Truncation
- Wiki pages truncated to 3000 characters
- Knowledge base articles truncated to 3000 characters
- Excerpts limited to 200-300 characters
- Top 3 results per source type
These limits ensure:
- AI context window remains reasonable (~10k chars max)
- Response times stay under 5 seconds
- Costs remain manageable for AI providers
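When enforcing these limits, truncation should happen on character boundaries rather than byte offsets, since slicing a Rust `String` at a byte index inside a multi-byte UTF-8 character panics. A small illustrative helper (not the project's code):

```rust
// Truncate to at most max_chars characters, appending an ellipsis when cut.
fn truncate_chars(s: &str, max_chars: usize) -> String {
    if s.chars().count() <= max_chars {
        s.to_string()
    } else {
        let cut: String = s.chars().take(max_chars).collect();
        format!("{cut}…")
    }
}

fn main() {
    assert_eq!(truncate_chars("short", 3000), "short");
    // Multi-byte characters are handled without panicking
    assert_eq!(truncate_chars("héllo wörld", 5), "héllo…");
    println!("{}", truncate_chars("héllo wörld", 5));
}
```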
### Parallel Execution
- All integrations searched simultaneously
- No blocking between different sources
- Failed searches don't block successful ones
- Total time = slowest individual search, not sum
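The timing claim can be demonstrated with plain threads standing in for the `tokio::spawn` tasks (a sketch with arbitrary latencies, not the project's code):

```rust
use std::thread;
use std::time::{Duration, Instant};

// Run one simulated search per latency in its own thread and return
// (number of results, total wall time).
fn run_parallel(latencies_ms: &[u64]) -> (usize, Duration) {
    let start = Instant::now();
    let handles: Vec<_> = latencies_ms
        .iter()
        .copied()
        .map(|ms| {
            thread::spawn(move || {
                thread::sleep(Duration::from_millis(ms)); // simulated search latency
                format!("results after {ms}ms")
            })
        })
        .collect();
    let results: Vec<String> = handles.into_iter().map(|h| h.join().unwrap()).collect();
    (results.len(), start.elapsed())
}

fn main() {
    let (count, elapsed) = run_parallel(&[50, 100, 150]);
    assert_eq!(count, 3);
    // ~150ms (the slowest search), well under the 300ms sum
    assert!(elapsed < Duration::from_millis(290));
    println!("{count} sources in {elapsed:?}");
}
```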
### Caching Strategy (Future Enhancement)
- Could cache search results for 5-10 minutes
- Would reduce API calls for repeated queries
- Needs invalidation strategy for updated content
## Security Considerations
1. **Cookie Security**
- Cookies stored in encrypted database
- Retrieved only when needed for API calls
- Never exposed to frontend
- Transmitted only over HTTPS
2. **Content Sanitization**
- HTML tags stripped from content
- No script injection possible
- Content truncated to prevent overflow
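A naive version of the tag stripping described above, for illustration only (a production sanitizer should use a real HTML parser rather than character scanning):

```rust
// Drop everything between '<' and '>', keeping only text content.
fn strip_tags(html: &str) -> String {
    let mut out = String::new();
    let mut in_tag = false;
    for c in html.chars() {
        match c {
            '<' => in_tag = true,
            '>' => in_tag = false,
            _ if !in_tag => out.push(c),
            _ => {}
        }
    }
    out
}

fn main() {
    assert_eq!(strip_tags("<p>No <b>script</b> injection</p>"), "No script injection");
    println!("{}", strip_tags("<p>No <b>script</b> injection</p>"));
}
```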
3. **Audit Trail**
- Integration searches not currently audited (future enhancement)
- AI chat with context is audited
- Could add audit entries for each integration query
4. **Access Control**
- Uses user's authenticated session
- Respects integration platform permissions
- No privilege escalation
## Known Issues / Future Enhancements
1. **Tool-Calling Format Unknown for MSI GenAI**
- The implementation uses the OpenAI-compatible format as the standard
- The MSI GenAI response format for tool_calls is undocumented
- The code parses multiple possible response formats as a fallback
- Requires real-world testing with MSI GenAI to verify
- May need format adjustments based on actual API responses
- Enhanced logging added to debug actual response structure
2. **ADO Browser Window Blank Page Issue**
- Azure DevOps browser opens as a blank white page
- Requires closing and relaunching to get a functional page
- Multiple fix attempts (delayed show, immediate show, enhanced logging) have not resolved it
- Root cause not yet identified
- Workaround: Close and reopen ADO browser connection
- Needs diagnostic logging to identify root cause
3. **Limited Tool Support**
- Currently only one tool implemented: add_ado_comment
- Could add more tools: create_work_item, update_ticket_state, search_tickets
- Could add Confluence tools: create_page, update_page
- Could add ServiceNow tools: create_incident, assign_ticket
- Extensible architecture makes adding new tools straightforward
4. **No Search Result Caching**
- Every query searches all integrations
- Could cache results for repeated queries
- Would improve response time for common questions
5. **No Relevance Scoring**
- Returns top 3 results from each source
- No cross-platform relevance ranking
- Could implement scoring algorithm in future
6. **No Integration Search Audit**
- Integration queries not logged to audit table
- Only final AI interaction is audited
- Could add audit entries for transparency
7. **No Confluence Space Filtering**
- Searches all spaces
- `space_key` field in config not yet used
- Could restrict to specific spaces in future
8. **No ServiceNow Table Filtering**
- Searches all KB articles
- Could filter by category or state
- Could add configurable table names
9. **No Azure DevOps Area Path Filtering**
- Searches entire project
- Could filter by area path or iteration
- Could add configurable WIQL filters
## Dependencies
No new external dependencies added. Uses existing:
- `tokio` for async/parallel execution
- `reqwest` for HTTP requests
- `rusqlite` for database queries
- `urlencoding` for query encoding
- `serde_json` for API responses
## Documentation
This implementation is documented in:
- Code comments in all search modules
- Architecture section above
- CLAUDE.md project instructions
- Function-level documentation strings
## Rollback Plan
If issues are discovered:
1. **Disable Integration Search**
```rust
// In chat_message() function, comment out:
// let integration_context = search_integration_sources(...).await;
```
2. **Revert to Previous Behavior**
- AI will use only general knowledge
- No breaking changes to existing functionality
- All other features remain functional
3. **Clean Revert**
```bash
git revert <commit-hash>
cargo tauri build --debug
```
- [ ] Export when Downloads directory doesn't exist
- [ ] Export with very long document titles (special character handling)
- [ ] Domain filter with empty result set
- [ ] Audit log with very large payloads (>1000 chars)
- [ ] Audit log JSON parsing errors (malformed data)

src-tauri/Cargo.lock generated

@@ -263,12 +263,6 @@ dependencies = [
"constant_time_eq 0.4.2",
]
[[package]]
name = "block"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d8c1fef690941d3e7788d328517591fecc684c084084702d6ff1641e993699a"
[[package]]
name = "block-buffer"
version = "0.10.4"
@@ -526,36 +520,6 @@ dependencies = [
"zeroize",
]
[[package]]
name = "cocoa"
version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6140449f97a6e97f9511815c5632d84c8aacf8ac271ad77c559218161a1373c"
dependencies = [
"bitflags 1.3.2",
"block",
"cocoa-foundation",
"core-foundation 0.9.4",
"core-graphics 0.23.2",
"foreign-types 0.5.0",
"libc",
"objc",
]
[[package]]
name = "cocoa-foundation"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c6234cbb2e4c785b456c0644748b1ac416dd045799740356f8363dfe00c93f7"
dependencies = [
"bitflags 1.3.2",
"block",
"core-foundation 0.9.4",
"core-graphics-types 0.1.3",
"libc",
"objc",
]
[[package]]
name = "color_quant"
version = "1.1.0"
@@ -684,19 +648,6 @@ version = "0.8.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "773648b94d0e5d620f64f280777445740e61fe701025087ec8b57f45c791888b"
[[package]]
name = "core-graphics"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c07782be35f9e1140080c6b96f0d44b739e2278479f64e02fdab4e32dfd8b081"
dependencies = [
"bitflags 1.3.2",
"core-foundation 0.9.4",
"core-graphics-types 0.1.3",
"foreign-types 0.5.0",
"libc",
]
[[package]]
name = "core-graphics"
version = "0.25.0"
@@ -705,22 +656,11 @@ checksum = "064badf302c3194842cf2c5d61f56cc88e54a759313879cdf03abdd27d0c3b97"
dependencies = [
"bitflags 2.11.0",
"core-foundation 0.10.1",
"core-graphics-types 0.2.0",
"core-graphics-types",
"foreign-types 0.5.0",
"libc",
]
[[package]]
name = "core-graphics-types"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "45390e6114f68f718cc7a830514a96f903cccd70d02a8f6d9f643ac4ba45afaf"
dependencies = [
"bitflags 1.3.2",
"core-foundation 0.9.4",
"libc",
]
[[package]]
name = "core-graphics-types"
version = "0.2.0"
@@ -2892,15 +2832,6 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c41e0c4fef86961ac6d6f8a82609f55f31b05e4fce149ac5710e439df7619ba4"
[[package]]
name = "malloc_buf"
version = "0.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62bb907fe88d54d8d9ce32a3cceab4218ed2f6b7d35617cafe9adf84e43919cb"
dependencies = [
"libc",
]
[[package]]
name = "markup5ever"
version = "0.14.1"
@@ -3216,15 +3147,6 @@ dependencies = [
"syn 2.0.117",
]
[[package]]
name = "objc"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "915b1b472bc21c53464d6c8461c9d3af805ba1ef837e1cac254428f4a77177b1"
dependencies = [
"malloc_buf",
]
[[package]]
name = "objc2"
version = "0.6.4"
@@ -5330,7 +5252,7 @@ dependencies = [
"bitflags 2.11.0",
"block2",
"core-foundation 0.10.1",
"core-graphics 0.25.0",
"core-graphics",
"crossbeam-channel",
"dispatch2",
"dlopen2",
@@ -5748,6 +5670,47 @@ dependencies = [
"utf-8",
]
[[package]]
name = "tftsr"
version = "0.1.0"
dependencies = [
"aes-gcm",
"aho-corasick",
"anyhow",
"async-trait",
"base64 0.22.1",
"chrono",
"dirs 5.0.1",
"docx-rs",
"futures",
"hex",
"lazy_static",
"mockito",
"printpdf",
"rand 0.8.5",
"regex",
"reqwest 0.12.28",
"rusqlite",
"serde",
"serde_json",
"sha2",
"tauri",
"tauri-build",
"tauri-plugin-dialog",
"tauri-plugin-fs",
"tauri-plugin-http",
"tauri-plugin-shell",
"tauri-plugin-stronghold",
"thiserror 1.0.69",
"tokio",
"tokio-test",
"tracing",
"tracing-subscriber",
"urlencoding",
"uuid",
"warp",
]
[[package]]
name = "thiserror"
version = "1.0.69"
@@ -6205,49 +6168,6 @@ dependencies = [
"windows-sys 0.60.2",
]
[[package]]
name = "trcaa"
version = "0.1.0"
dependencies = [
"aes-gcm",
"aho-corasick",
"anyhow",
"async-trait",
"base64 0.22.1",
"chrono",
"cocoa",
"dirs 5.0.1",
"docx-rs",
"futures",
"hex",
"lazy_static",
"mockito",
"objc",
"printpdf",
"rand 0.8.5",
"regex",
"reqwest 0.12.28",
"rusqlite",
"serde",
"serde_json",
"sha2",
"tauri",
"tauri-build",
"tauri-plugin-dialog",
"tauri-plugin-fs",
"tauri-plugin-http",
"tauri-plugin-shell",
"tauri-plugin-stronghold",
"thiserror 1.0.69",
"tokio",
"tokio-test",
"tracing",
"tracing-subscriber",
"urlencoding",
"uuid",
"warp",
]
[[package]]
name = "try-lock"
version = "0.2.5"


@@ -44,11 +44,6 @@ lazy_static = "1.4"
warp = "0.3"
urlencoding = "2"
# Platform-specific dependencies for native cookie extraction
[target.'cfg(target_os = "macos")'.dependencies]
cocoa = "0.25"
objc = "0.2"
[dev-dependencies]
tokio-test = "0.4"
mockito = "1.2"


@@ -29,7 +29,6 @@ impl Provider for AnthropicProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
_tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(60))
@@ -116,7 +115,6 @@ impl Provider for AnthropicProvider {
content,
model,
usage,
tool_calls: None,
})
}
}


@@ -30,7 +30,6 @@ impl Provider for GeminiProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
_tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(60))
@@ -119,7 +118,6 @@ impl Provider for GeminiProvider {
content,
model: config.model.clone(),
usage,
tool_calls: None,
})
}
}

View File

@@ -30,7 +30,6 @@ impl Provider for MistralProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
_tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
// Mistral uses OpenAI-compatible format
let client = reqwest::Client::builder()
@@ -84,7 +83,6 @@ impl Provider for MistralProvider {
content,
model: config.model.clone(),
usage,
tool_calls: None,
})
}
}


@@ -4,22 +4,15 @@ pub mod mistral;
pub mod ollama;
pub mod openai;
pub mod provider;
pub mod tools;
pub use provider::*;
pub use tools::*;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Message {
pub role: String,
pub content: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub tool_call_id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub tool_calls: Option<Vec<ToolCall>>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
@@ -27,44 +20,6 @@ pub struct ChatResponse {
pub content: String,
pub model: String,
pub usage: Option<TokenUsage>,
#[serde(skip_serializing_if = "Option::is_none")]
pub tool_calls: Option<Vec<ToolCall>>,
}
/// Represents a tool call made by the AI
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ToolCall {
pub id: String,
pub name: String,
pub arguments: String, // JSON string
}
/// Tool definition that describes available functions to the AI
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Tool {
pub name: String,
pub description: String,
pub parameters: ToolParameters,
}
/// JSON Schema-style parameter definition for tools
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ToolParameters {
#[serde(rename = "type")]
pub param_type: String, // Usually "object"
pub properties: HashMap<String, ParameterProperty>,
pub required: Vec<String>,
}
/// Individual parameter property definition
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ParameterProperty {
#[serde(rename = "type")]
pub prop_type: String, // "string", "number", "integer", "boolean"
pub description: String,
#[serde(skip_serializing_if = "Option::is_none")]
#[serde(rename = "enum")]
pub enum_values: Option<Vec<String>>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]


@@ -31,7 +31,6 @@ impl Provider for OllamaProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
_tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(60))
@@ -100,7 +99,6 @@ impl Provider for OllamaProvider {
content,
model: config.model.clone(),
usage,
tool_calls: None,
})
}
}


@@ -33,16 +33,15 @@ impl Provider for OpenAiProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
// Check if using custom REST format
let api_format = config.api_format.as_deref().unwrap_or("openai");
// Backward compatibility: accept legacy msi_genai identifier
if is_custom_rest_format(Some(api_format)) {
self.chat_custom_rest(messages, config, tools).await
self.chat_custom_rest(messages, config).await
} else {
self.chat_openai(messages, config, tools).await
self.chat_openai(messages, config).await
}
}
}
@@ -74,7 +73,6 @@ impl OpenAiProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(60))
@@ -101,25 +99,6 @@ impl OpenAiProvider {
body["temperature"] = serde_json::Value::from(temp);
}
// Add tools if provided (OpenAI function calling format)
if let Some(tools_list) = tools {
let formatted_tools: Vec<serde_json::Value> = tools_list
.iter()
.map(|tool| {
serde_json::json!({
"type": "function",
"function": {
"name": tool.name,
"description": tool.description,
"parameters": tool.parameters
}
})
})
.collect();
body["tools"] = serde_json::Value::from(formatted_tools);
body["tool_choice"] = serde_json::Value::from("auto");
}
// Use custom auth header and prefix if provided
let auth_header = config
.custom_auth_header
@@ -143,32 +122,10 @@ impl OpenAiProvider {
}
let json: serde_json::Value = resp.json().await?;
let message = &json["choices"][0]["message"];
let content = message["content"].as_str().unwrap_or("").to_string();
// Parse tool_calls if present
let tool_calls = message.get("tool_calls").and_then(|tc| {
if let Some(arr) = tc.as_array() {
let calls: Vec<crate::ai::ToolCall> = arr
.iter()
.filter_map(|call| {
Some(crate::ai::ToolCall {
id: call["id"].as_str()?.to_string(),
name: call["function"]["name"].as_str()?.to_string(),
arguments: call["function"]["arguments"].as_str()?.to_string(),
})
})
.collect();
if calls.is_empty() {
None
} else {
Some(calls)
}
} else {
None
}
});
let content = json["choices"][0]["message"]["content"]
.as_str()
.ok_or_else(|| anyhow::anyhow!("No content in response"))?
.to_string();
let usage = json.get("usage").and_then(|u| {
Some(TokenUsage {
@@ -182,7 +139,6 @@ impl OpenAiProvider {
content,
model: config.model.clone(),
usage,
tool_calls,
})
}
@@ -191,7 +147,6 @@ impl OpenAiProvider {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
tools: Option<Vec<crate::ai::Tool>>,
) -> anyhow::Result<ChatResponse> {
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(60))
@@ -249,33 +204,11 @@ impl OpenAiProvider {
body["modelConfig"] = model_config;
}
// Add tools if provided (OpenAI-style format, most common standard)
if let Some(tools_list) = tools {
let formatted_tools: Vec<serde_json::Value> = tools_list
.iter()
.map(|tool| {
serde_json::json!({
"type": "function",
"function": {
"name": tool.name,
"description": tool.description,
"parameters": tool.parameters
}
})
})
.collect();
let tool_count = formatted_tools.len();
body["tools"] = serde_json::Value::from(formatted_tools);
body["tool_choice"] = serde_json::Value::from("auto");
tracing::info!("MSI GenAI: Sending {} tools in request", tool_count);
}
// Use custom auth header and prefix (no default prefix for custom REST)
// Use custom auth header and prefix (no prefix for this custom REST contract)
let auth_header = config
.custom_auth_header
.as_deref()
.unwrap_or("Authorization");
.unwrap_or("x-msi-genai-api-key");
let auth_prefix = config.custom_auth_prefix.as_deref().unwrap_or("");
let auth_value = format!("{auth_prefix}{api_key}", api_key = config.api_key);
@@ -283,6 +216,7 @@ impl OpenAiProvider {
.post(&url)
.header(auth_header, auth_value)
.header("Content-Type", "application/json")
.header("X-msi-genai-client", "troubleshooting-rca-assistant")
.json(&body)
.send()
.await?;
@@ -295,84 +229,12 @@ impl OpenAiProvider {
let json: serde_json::Value = resp.json().await?;
tracing::debug!(
"MSI GenAI response: {}",
serde_json::to_string_pretty(&json).unwrap_or_else(|_| "invalid JSON".to_string())
);
// Extract response content from "msg" field
let content = json["msg"]
.as_str()
.ok_or_else(|| anyhow::anyhow!("No 'msg' field in response"))?
.to_string();
// Parse tool_calls if present (check multiple possible field names)
let tool_calls = json
.get("tool_calls")
.or_else(|| json.get("toolCalls"))
.or_else(|| json.get("function_calls"))
.and_then(|tc| {
if let Some(arr) = tc.as_array() {
let calls: Vec<crate::ai::ToolCall> = arr
.iter()
.filter_map(|call| {
// Try OpenAI format first
if let (Some(id), Some(name), Some(args)) = (
call.get("id").and_then(|v| v.as_str()),
call.get("function")
.and_then(|f| f.get("name"))
.and_then(|n| n.as_str())
.or_else(|| call.get("name").and_then(|n| n.as_str())),
call.get("function")
.and_then(|f| f.get("arguments"))
.and_then(|a| a.as_str())
.or_else(|| call.get("arguments").and_then(|a| a.as_str())),
) {
tracing::info!("MSI GenAI: Parsed tool call: {} ({})", name, id);
return Some(crate::ai::ToolCall {
id: id.to_string(),
name: name.to_string(),
arguments: args.to_string(),
});
}
// Try simpler format
if let (Some(name), Some(args)) = (
call.get("name").and_then(|n| n.as_str()),
call.get("arguments").and_then(|a| a.as_str()),
) {
let id = call
.get("id")
.and_then(|v| v.as_str())
.unwrap_or("tool_call_0")
.to_string();
tracing::info!(
"MSI GenAI: Parsed tool call (simple format): {} ({})",
name,
id
);
return Some(crate::ai::ToolCall {
id,
name: name.to_string(),
arguments: args.to_string(),
});
}
tracing::warn!("MSI GenAI: Failed to parse tool call: {:?}", call);
None
})
.collect();
if calls.is_empty() {
None
} else {
tracing::info!("MSI GenAI: Found {} tool calls", calls.len());
Some(calls)
}
} else {
None
}
});
// Note: sessionId from response should be stored back to config.session_id
// This would require making config mutable or returning it as part of ChatResponse
// For now, the caller can extract it from the response if needed
@@ -382,7 +244,6 @@ impl OpenAiProvider {
content,
model: config.model.clone(),
usage: None, // This custom REST contract doesn't provide token usage in response
tool_calls,
})
}
}


@@ -1,6 +1,6 @@
use async_trait::async_trait;
use crate::ai::{ChatResponse, Message, ProviderInfo, Tool};
use crate::ai::{ChatResponse, Message, ProviderInfo};
use crate::state::ProviderConfig;
#[async_trait]
@@ -11,7 +11,6 @@ pub trait Provider: Send + Sync {
&self,
messages: Vec<Message>,
config: &ProviderConfig,
tools: Option<Vec<Tool>>,
) -> anyhow::Result<ChatResponse>;
}


@@ -1,41 +0,0 @@
use crate::ai::{ParameterProperty, Tool, ToolParameters};
use std::collections::HashMap;
/// Get all available tools for AI function calling
pub fn get_available_tools() -> Vec<Tool> {
vec![get_add_ado_comment_tool()]
}
/// Tool definition for adding comments to Azure DevOps work items
fn get_add_ado_comment_tool() -> Tool {
let mut properties = HashMap::new();
properties.insert(
"work_item_id".to_string(),
ParameterProperty {
prop_type: "integer".to_string(),
description: "The Azure DevOps work item ID (ticket number) to add the comment to"
.to_string(),
enum_values: None,
},
);
properties.insert(
"comment_text".to_string(),
ParameterProperty {
prop_type: "string".to_string(),
description: "The text content of the comment to add to the work item".to_string(),
enum_values: None,
},
);
Tool {
name: "add_ado_comment".to_string(),
description: "Add a comment to an Azure DevOps work item (ticket). Use this when the user asks you to add a comment, update a ticket, or provide information to a ticket.".to_string(),
parameters: ToolParameters {
param_type: "object".to_string(),
properties,
required: vec!["work_item_id".to_string(), "comment_text".to_string()],
},
}
}


@@ -1,5 +1,4 @@
use rusqlite::OptionalExtension;
use tauri::{Manager, State};
use tauri::State;
use tracing::warn;
use crate::ai::provider::create_provider;
@@ -52,19 +51,15 @@ pub async fn analyze_logs(
FIRST_WHY: (initial why question for 5-whys analysis), \
SEVERITY: (critical/high/medium/low)"
.into(),
tool_call_id: None,
tool_calls: None,
},
Message {
role: "user".into(),
content: format!("Analyze logs for issue {issue_id}:\n\n{log_contents}"),
tool_call_id: None,
tool_calls: None,
},
];
let response = provider
.chat(messages, &provider_config, None)
.chat(messages, &provider_config)
.await
.map_err(|e| {
warn!(error = %e, "ai analyze_logs provider request failed");
@@ -165,7 +160,6 @@ pub async fn chat_message(
issue_id: String,
message: String,
provider_config: ProviderConfig,
app_handle: tauri::AppHandle,
state: State<'_, AppState>,
) -> Result<ChatResponse, String> {
// Find or create a conversation for this issue + provider
@@ -218,105 +212,25 @@ pub async fn chat_message(
.unwrap_or_default();
drop(db);
raw.into_iter()
.map(|(role, content)| Message {
role,
content,
tool_call_id: None,
tool_calls: None,
})
.map(|(role, content)| Message { role, content })
.collect()
};
let provider = create_provider(&provider_config);
// Search integration sources for relevant context
let integration_context = search_integration_sources(&message, &app_handle, &state).await;
let mut messages = history;
// If we found integration content, add it to the conversation context
if !integration_context.is_empty() {
let context_message = Message {
role: "system".into(),
content: format!(
"INTERNAL DOCUMENTATION SOURCES:\n\n{integration_context}\n\n\
Instructions: The above content is from internal company documentation systems \
(Confluence, ServiceNow, Azure DevOps). \
\n\n**IMPORTANT**: First determine if this documentation is RELEVANT to the user's question:\
\n- If the documentation directly addresses the question → Use it and cite sources with URLs\
\n- If the documentation is tangentially related but doesn't answer the question → Briefly mention what internal docs exist, then provide a complete answer using general knowledge\
\n- If the documentation is completely unrelated → Ignore it and answer using general knowledge\
\n\nDo NOT force irrelevant internal documentation into your answer. The user needs accurate information, not forced citations."
),
tool_call_id: None,
tool_calls: None,
};
messages.push(context_message);
}
messages.push(Message {
role: "user".into(),
content: message.clone(),
tool_call_id: None,
tool_calls: None,
});
// Get available tools
let tools = Some(crate::ai::tools::get_available_tools());
// Tool-calling loop: keep calling until AI gives final answer
let final_response;
let max_iterations = 10; // Prevent infinite loops
let mut iteration = 0;
loop {
iteration += 1;
if iteration > max_iterations {
return Err("Tool-calling loop exceeded maximum iterations".to_string());
}
let response = provider
.chat(messages.clone(), &provider_config, tools.clone())
.await
.map_err(|e| {
let error_msg = format!("AI provider request failed: {e}");
warn!("{}", error_msg);
error_msg
})?;
// Check if AI wants to call tools
if let Some(tool_calls) = &response.tool_calls {
tracing::info!("AI requested {} tool call(s)", tool_calls.len());
// Execute each tool call
for tool_call in tool_calls {
tracing::info!("Executing tool: {}", tool_call.name);
let tool_result = execute_tool_call(tool_call, &app_handle, &state).await;
// Format result
let result_content = match tool_result {
Ok(result) => result,
Err(e) => format!("Error executing tool: {e}"),
};
// Add tool result as a message
messages.push(Message {
role: "tool".into(),
content: result_content,
tool_call_id: Some(tool_call.id.clone()),
tool_calls: None,
});
}
// Continue loop to get AI's next response
continue;
}
// No tool calls - this is the final answer
final_response = response;
break;
}
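The removed tool-calling loop above caps the conversation at `max_iterations` rounds so a model that keeps requesting tools cannot spin forever. The same bounded-loop shape, sketched in isolation with Rust's `break`-with-value idiom and a stand-in for `provider.chat`:

```rust
/// Stand-in for the tool-calling loop. `chat` returns None while the model
/// keeps requesting tools, and Some(answer) once it gives a final reply.
fn run_chat_loop(chat: impl Fn(u32) -> Option<&'static str>) -> Result<&'static str, String> {
    let max_iterations = 10; // prevent infinite loops, as in the removed code
    let mut iteration = 0;
    loop {
        iteration += 1;
        if iteration > max_iterations {
            break Err("Tool-calling loop exceeded maximum iterations".to_string());
        }
        match chat(iteration) {
            None => continue,                 // "tool calls": execute them, ask again
            Some(answer) => break Ok(answer), // no tool calls: this is the final answer
        }
    }
}

fn main() {
    // Model "answers" on the third round.
    let ok = run_chat_loop(|i| if i < 3 { None } else { Some("final answer") });
    assert_eq!(ok, Ok("final answer"));
    // Model never answers: the cap trips instead of looping forever.
    assert!(run_chat_loop(|_| None).is_err());
}
```

Using `break` with a value avoids the uninitialized `final_response` binding the removed code carried.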
let response = provider
.chat(messages, &provider_config)
.await
.map_err(|e| {
warn!(error = %e, "ai chat provider request failed");
"AI provider request failed".to_string()
})?;
// Save both user message and response to DB
{
@@ -325,7 +239,7 @@ pub async fn chat_message(
let asst_msg = AiMessage::new(
conversation_id,
"assistant".to_string(),
final_response.content.clone(),
response.content.clone(),
);
db.execute(
@@ -354,10 +268,10 @@ pub async fn chat_message(
"model": provider_config.model,
"api_url": provider_config.api_url,
"user_message": user_msg.content,
"response_preview": if final_response.content.len() > 200 {
format!("{preview}...", preview = &final_response.content[..200])
"response_preview": if response.content.len() > 200 {
format!("{preview}...", preview = &response.content[..200])
} else {
final_response.content.clone()
response.content.clone()
},
"token_count": user_msg.token_count,
});
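One caveat in the preview logic above: `&content[..200]` slices at a byte offset, which panics if byte 200 falls inside a multi-byte UTF-8 character. A boundary-safe helper (hypothetical `preview` function, not part of this diff) would step back to the nearest character boundary:

```rust
/// Returns at most `max_bytes` of `s`, cut back to the nearest char boundary.
fn preview(s: &str, max_bytes: usize) -> &str {
    if s.len() <= max_bytes {
        return s;
    }
    let mut end = max_bytes;
    while !s.is_char_boundary(end) {
        end -= 1; // step back until the cut lands between characters
    }
    &s[..end]
}

fn main() {
    assert_eq!(preview("short", 200), "short");
    // 'é' is 2 bytes; cutting at byte 2 would split it, so we fall back to byte 1.
    assert_eq!(preview("héllo", 2), "h");
}
```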
@@ -378,7 +292,7 @@ pub async fn chat_message(
}
}
Ok(final_response)
Ok(response)
}
#[tauri::command]
@@ -391,11 +305,9 @@ pub async fn test_provider_connection(
content:
"Reply with exactly: Troubleshooting and RCA Assistant connection test successful."
.into(),
tool_call_id: None,
tool_calls: None,
}];
provider
.chat(messages, &provider_config, None)
.chat(messages, &provider_config)
.await
.map_err(|e| {
warn!(error = %e, "ai test_provider_connection failed");
@@ -440,417 +352,6 @@ pub async fn list_providers() -> Result<Vec<ProviderInfo>, String> {
])
}
/// Search integration sources (Confluence, ServiceNow, Azure DevOps) for relevant context
async fn search_integration_sources(
query: &str,
app_handle: &tauri::AppHandle,
state: &State<'_, AppState>,
) -> String {
let mut all_results = Vec::new();
// Try to get integration configurations
let configs: Vec<crate::commands::integrations::IntegrationConfig> = {
let db = match state.db.lock() {
Ok(db) => db,
Err(e) => {
tracing::warn!("Failed to lock database: {}", e);
return String::new();
}
};
let mut stmt = match db.prepare(
"SELECT service, base_url, username, project_name, space_key FROM integration_config",
) {
Ok(stmt) => stmt,
Err(e) => {
tracing::warn!("Failed to prepare statement: {}", e);
return String::new();
}
};
let rows = match stmt.query_map([], |row| {
Ok(crate::commands::integrations::IntegrationConfig {
service: row.get(0)?,
base_url: row.get(1)?,
username: row.get(2)?,
project_name: row.get(3)?,
space_key: row.get(4)?,
})
}) {
Ok(rows) => rows,
Err(e) => {
tracing::warn!("Failed to query integration configs: {}", e);
return String::new();
}
};
rows.filter_map(|r| r.ok()).collect()
};
// Search each available integration in parallel
let mut search_tasks = Vec::new();
for config in configs {
// Authentication priority:
// 1. Try cookies from persistent browser (may fail for HttpOnly)
// 2. Try stored credentials from database
// 3. Fall back to webview-based search (uses browser's session directly)
let cookies_opt = match crate::commands::integrations::get_fresh_cookies_from_webview(
&config.service,
app_handle,
state,
)
.await
{
Ok(Some(cookies)) => {
tracing::info!("Using extracted cookies for {}", config.service);
Some(cookies)
}
_ => {
// Fallback: check for stored credentials in database
tracing::info!(
"Cookie extraction failed for {}, checking stored credentials",
config.service
);
let encrypted_token: Option<String> = {
let db = match state.db.lock() {
Ok(db) => db,
Err(_) => continue,
};
db.query_row(
"SELECT encrypted_token FROM credentials WHERE service = ?1",
[&config.service],
|row| row.get::<_, String>(0),
)
.optional()
.ok()
.flatten()
};
if let Some(token) = encrypted_token {
if let Ok(decrypted) = crate::integrations::auth::decrypt_token(&token) {
// Try to parse as cookies JSON
if let Ok(cookie_list) = serde_json::from_str::<
Vec<crate::integrations::webview_auth::Cookie>,
>(&decrypted)
{
tracing::info!(
"Using stored cookies for {} (count: {})",
config.service,
cookie_list.len()
);
Some(cookie_list)
} else {
tracing::warn!(
"Stored credentials for {} not in cookie format",
config.service
);
None
}
} else {
None
}
} else {
None
}
}
};
// If we have cookies (from extraction or database), use standard API search
if let Some(cookies) = cookies_opt {
match config.service.as_str() {
"confluence" => {
let base_url = config.base_url.clone();
let query = query.to_string();
let cookies_clone = cookies.clone();
search_tasks.push(tokio::spawn(async move {
crate::integrations::confluence_search::search_confluence(
&base_url,
&query,
&cookies_clone,
)
.await
.unwrap_or_default()
}));
}
"servicenow" => {
let instance_url = config.base_url.clone();
let query = query.to_string();
let cookies_clone = cookies.clone();
search_tasks.push(tokio::spawn(async move {
let mut results = Vec::new();
// Search knowledge base
if let Ok(kb_results) =
crate::integrations::servicenow_search::search_servicenow(
&instance_url,
&query,
&cookies_clone,
)
.await
{
results.extend(kb_results);
}
// Search incidents
if let Ok(incident_results) =
crate::integrations::servicenow_search::search_incidents(
&instance_url,
&query,
&cookies_clone,
)
.await
{
results.extend(incident_results);
}
results
}));
}
"azuredevops" => {
let org_url = config.base_url.clone();
let project = config.project_name.unwrap_or_default();
let query = query.to_string();
let cookies_clone = cookies.clone();
search_tasks.push(tokio::spawn(async move {
let mut results = Vec::new();
// Search wiki
if let Ok(wiki_results) =
crate::integrations::azuredevops_search::search_wiki(
&org_url,
&project,
&query,
&cookies_clone,
)
.await
{
results.extend(wiki_results);
}
// Search work items
if let Ok(wi_results) =
crate::integrations::azuredevops_search::search_work_items(
&org_url,
&project,
&query,
&cookies_clone,
)
.await
{
results.extend(wi_results);
}
results
}));
}
_ => {}
}
} else {
// Final fallback: try webview-based fetch (includes HttpOnly cookies automatically)
// This makes HTTP requests FROM the authenticated webview, which includes all cookies
tracing::info!(
"No extracted cookies for {}, trying webview-based fetch",
config.service
);
// Check if webview exists for this service
let webview_label = {
let webviews = match state.integration_webviews.lock() {
Ok(w) => w,
Err(_) => continue,
};
webviews.get(&config.service).cloned()
};
if let Some(label) = webview_label {
// Get window handle
if let Some(webview_window) = app_handle.get_webview_window(&label) {
let base_url = config.base_url.clone();
let service = config.service.clone();
let query_str = query.to_string();
match service.as_str() {
"confluence" => {
search_tasks.push(tokio::spawn(async move {
tracing::info!("Executing Confluence search via webview fetch");
match crate::integrations::webview_fetch::search_confluence_webview(
&webview_window,
&base_url,
&query_str,
)
.await
{
Ok(results) => {
tracing::info!(
"Webview fetch for Confluence returned {} results",
results.len()
);
results
}
Err(e) => {
tracing::warn!(
"Webview fetch failed for Confluence: {}",
e
);
Vec::new()
}
}
}));
}
"servicenow" => {
search_tasks.push(tokio::spawn(async move {
tracing::info!("Executing ServiceNow search via webview fetch");
match crate::integrations::webview_fetch::search_servicenow_webview(
&webview_window,
&base_url,
&query_str,
)
.await
{
Ok(results) => {
tracing::info!(
"Webview fetch for ServiceNow returned {} results",
results.len()
);
results
}
Err(e) => {
tracing::warn!(
"Webview fetch failed for ServiceNow: {}",
e
);
Vec::new()
}
}
}));
}
"azuredevops" => {
let project = config.project_name.unwrap_or_default();
search_tasks.push(tokio::spawn(async move {
tracing::info!("Executing Azure DevOps search via webview fetch");
let mut results = Vec::new();
// Search wiki
match crate::integrations::webview_fetch::search_azuredevops_wiki_webview(
&webview_window,
&base_url,
&project,
&query_str
).await {
Ok(wiki_results) => {
tracing::info!("Webview fetch for ADO wiki returned {} results", wiki_results.len());
results.extend(wiki_results);
}
Err(e) => {
tracing::warn!("Webview fetch failed for ADO wiki: {}", e);
}
}
// Search work items
match crate::integrations::webview_fetch::search_azuredevops_workitems_webview(
&webview_window,
&base_url,
&project,
&query_str
).await {
Ok(wi_results) => {
tracing::info!("Webview fetch for ADO work items returned {} results", wi_results.len());
results.extend(wi_results);
}
Err(e) => {
tracing::warn!("Webview fetch failed for ADO work items: {}", e);
}
}
results
}));
}
_ => {}
}
} else {
tracing::warn!("Webview window not found for {}", config.service);
}
} else {
tracing::warn!(
"No webview open for {} - cannot search. Please open browser window in Settings → Integrations",
config.service
);
}
}
}
// Wait for all searches to complete
for task in search_tasks {
if let Ok(results) = task.await {
all_results.extend(results);
}
}
// Format results for AI context
if all_results.is_empty() {
return String::new();
}
let mut context = String::new();
for (idx, result) in all_results.iter().enumerate() {
context.push_str(&format!("--- SOURCE {} ({}) ---\n", idx + 1, result.source));
context.push_str(&format!("Title: {}\n", result.title));
context.push_str(&format!("URL: {}\n", result.url));
if let Some(content) = &result.content {
context.push_str(&format!("Content:\n{content}\n\n"));
} else {
context.push_str(&format!("Excerpt: {}\n\n", result.excerpt));
}
}
tracing::info!(
"Found {} integration sources for AI context",
all_results.len()
);
context
}
/// Execute a tool call made by the AI
async fn execute_tool_call(
tool_call: &crate::ai::ToolCall,
app_handle: &tauri::AppHandle,
app_state: &State<'_, AppState>,
) -> Result<String, String> {
match tool_call.name.as_str() {
"add_ado_comment" => {
// Parse arguments
let args: serde_json::Value = serde_json::from_str(&tool_call.arguments)
.map_err(|e| format!("Failed to parse tool arguments: {e}"))?;
let work_item_id = args
.get("work_item_id")
.and_then(|v| v.as_i64())
.ok_or_else(|| "Missing or invalid work_item_id parameter".to_string())?;
let comment_text = args
.get("comment_text")
.and_then(|v| v.as_str())
.ok_or_else(|| "Missing or invalid comment_text parameter".to_string())?;
// Execute the add_ado_comment command
tracing::info!(
"AI executing tool: add_ado_comment({}, \"{}\")",
work_item_id,
comment_text
);
crate::commands::integrations::add_ado_comment(
work_item_id,
comment_text.to_string(),
app_handle.clone(),
app_state.clone(),
)
.await
}
_ => {
let error = format!("Unknown tool: {}", tool_call.name);
tracing::warn!("{}", error);
Err(error)
}
}
}
#[cfg(test)]
mod tests {
use super::*;

View File

@@ -19,105 +19,10 @@ lazy_static::lazy_static! {
#[tauri::command]
pub async fn test_confluence_connection(
base_url: String,
_base_url: String,
_credentials: serde_json::Value,
app_handle: tauri::AppHandle,
app_state: State<'_, AppState>,
) -> Result<ConnectionResult, String> {
// Try to get fresh cookies from persistent webview
let cookies = get_fresh_cookies_from_webview("confluence", &app_handle, &app_state).await?;
if let Some(cookie_list) = cookies {
// Use cookies for authentication
let cookie_header = crate::integrations::webview_auth::cookies_to_header(&cookie_list);
let client = reqwest::Client::new();
let url = format!("{}/rest/api/user/current", base_url.trim_end_matches('/'));
let resp = client
.get(&url)
.header("Cookie", cookie_header)
.send()
.await
.map_err(|e| format!("Connection failed: {e}"))?;
if resp.status().is_success() {
Ok(ConnectionResult {
success: true,
message: "Successfully connected to Confluence using browser session".to_string(),
})
} else {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
Ok(ConnectionResult {
success: false,
message: format!("Connection failed with status {status}: {text}"),
})
}
} else {
// No webview open, check if we have stored credentials
let encrypted_token: Option<String> = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.query_row(
"SELECT encrypted_token FROM credentials WHERE service = ?1",
["confluence"],
|row| row.get(0),
)
.optional()
.map_err(|e| format!("Failed to query credentials: {e}"))?
};
if let Some(token) = encrypted_token {
let decrypted = crate::integrations::auth::decrypt_token(&token)?;
// Try to parse as cookies JSON first
if let Ok(cookie_list) =
serde_json::from_str::<Vec<crate::integrations::webview_auth::Cookie>>(&decrypted)
{
let cookie_header =
crate::integrations::webview_auth::cookies_to_header(&cookie_list);
let client = reqwest::Client::new();
let url = format!("{}/rest/api/user/current", base_url.trim_end_matches('/'));
let resp = client
.get(&url)
.header("Cookie", cookie_header)
.send()
.await
.map_err(|e| format!("Connection failed: {e}"))?;
if resp.status().is_success() {
Ok(ConnectionResult {
success: true,
message: "Successfully connected to Confluence using stored session"
.to_string(),
})
} else {
let status = resp.status();
Ok(ConnectionResult {
success: false,
message: format!(
"Connection failed with status {status}. Session may have expired - try reopening the browser window."
),
})
}
} else {
// Treat as bearer token
let config = crate::integrations::confluence::ConfluenceConfig {
base_url: base_url.clone(),
access_token: decrypted,
};
crate::integrations::confluence::test_connection(&config).await
}
} else {
Err("Not authenticated. Please open the browser window and log in, or provide a manual token.".to_string())
}
}
Err("Integrations available in v0.2. Please update to the latest version.".to_string())
}
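`webview_auth::cookies_to_header` is called throughout these hunks but its body is not shown in the diff. A plausible standalone sketch, assuming the `Cookie` type carries at least a name and value (the real struct likely has more fields), joining pairs in the format the HTTP `Cookie` request header expects:

```rust
// Minimal stand-in for crate::integrations::webview_auth::Cookie (assumed shape).
struct Cookie {
    name: String,
    value: String,
}

// Join cookies as "name=value; name=value", the Cookie request-header format.
fn cookies_to_header(cookies: &[Cookie]) -> String {
    cookies
        .iter()
        .map(|c| format!("{}={}", c.name, c.value))
        .collect::<Vec<_>>()
        .join("; ")
}

fn main() {
    let cookies = vec![
        Cookie { name: "JSESSIONID".into(), value: "abc123".into() },
        Cookie { name: "atl.xsrf".into(), value: "tok".into() },
    ];
    assert_eq!(cookies_to_header(&cookies), "JSESSIONID=abc123; atl.xsrf=tok");
}
```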
#[tauri::command]
@@ -131,71 +36,10 @@ pub async fn publish_to_confluence(
#[tauri::command]
pub async fn test_servicenow_connection(
instance_url: String,
_instance_url: String,
_credentials: serde_json::Value,
app_handle: tauri::AppHandle,
app_state: State<'_, AppState>,
) -> Result<ConnectionResult, String> {
// Try to get fresh cookies from persistent webview
let cookies = get_fresh_cookies_from_webview("servicenow", &app_handle, &app_state).await?;
if let Some(cookie_list) = cookies {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(&cookie_list);
let client = reqwest::Client::new();
let url = format!(
"{}/api/now/table/sys_user?sysparm_limit=1",
instance_url.trim_end_matches('/')
);
let resp = client
.get(&url)
.header("Cookie", cookie_header)
.send()
.await
.map_err(|e| format!("Connection failed: {e}"))?;
if resp.status().is_success() {
Ok(ConnectionResult {
success: true,
message: "Successfully connected to ServiceNow using browser session".to_string(),
})
} else {
let status = resp.status();
Ok(ConnectionResult {
success: false,
message: format!("Connection failed with status {status}"),
})
}
} else {
// Check stored credentials
let encrypted_token: Option<String> = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.query_row(
"SELECT encrypted_token FROM credentials WHERE service = ?1",
["servicenow"],
|row| row.get(0),
)
.optional()
.map_err(|e| format!("Failed to query credentials: {e}"))?
};
if let Some(token) = encrypted_token {
let password = crate::integrations::auth::decrypt_token(&token)?;
let config = crate::integrations::servicenow::ServiceNowConfig {
instance_url: instance_url.clone(),
username: "".to_string(),
password,
};
crate::integrations::servicenow::test_connection(&config).await
} else {
Err("Not authenticated. Please open the browser window and log in, or provide a manual token.".to_string())
}
}
Err("Integrations available in v0.2. Please update to the latest version.".to_string())
}
#[tauri::command]
@@ -208,71 +52,10 @@ pub async fn create_servicenow_incident(
#[tauri::command]
pub async fn test_azuredevops_connection(
org_url: String,
_org_url: String,
_credentials: serde_json::Value,
app_handle: tauri::AppHandle,
app_state: State<'_, AppState>,
) -> Result<ConnectionResult, String> {
// Try to get fresh cookies from persistent webview
let cookies = get_fresh_cookies_from_webview("azuredevops", &app_handle, &app_state).await?;
if let Some(cookie_list) = cookies {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(&cookie_list);
let client = reqwest::Client::new();
let url = format!(
"{}/_apis/projects?api-version=6.0",
org_url.trim_end_matches('/')
);
let resp = client
.get(&url)
.header("Cookie", cookie_header)
.send()
.await
.map_err(|e| format!("Connection failed: {e}"))?;
if resp.status().is_success() {
Ok(ConnectionResult {
success: true,
message: "Successfully connected to Azure DevOps using browser session".to_string(),
})
} else {
let status = resp.status();
Ok(ConnectionResult {
success: false,
message: format!("Connection failed with status {status}"),
})
}
} else {
// Check stored credentials
let encrypted_token: Option<String> = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.query_row(
"SELECT encrypted_token FROM credentials WHERE service = ?1",
["azuredevops"],
|row| row.get(0),
)
.optional()
.map_err(|e| format!("Failed to query credentials: {e}"))?
};
if let Some(token) = encrypted_token {
let access_token = crate::integrations::auth::decrypt_token(&token)?;
let config = crate::integrations::azuredevops::AzureDevOpsConfig {
organization_url: org_url.clone(),
access_token,
project: "".to_string(),
};
crate::integrations::azuredevops::test_connection(&config).await
} else {
Err("Not authenticated. Please open the browser window and log in, or provide a manual token.".to_string())
}
}
Err("Integrations available in v0.2. Please update to the latest version.".to_string())
}
#[tauri::command]
@@ -722,7 +505,6 @@ pub struct WebviewAuthResponse {
pub async fn authenticate_with_webview(
service: String,
base_url: String,
project_name: Option<String>,
app_handle: tauri::AppHandle,
app_state: State<'_, AppState>,
) -> Result<WebviewAuthResponse, String> {
@@ -748,81 +530,21 @@ pub async fn authenticate_with_webview(
// Open persistent browser window
let _credentials = crate::integrations::webview_auth::authenticate_with_webview(
app_handle.clone(),
&service,
&base_url,
project_name.as_deref(),
app_handle, &service, &base_url,
)
.await?;
// Store window reference in memory
// Store window reference
app_state
.integration_webviews
.lock()
.map_err(|e| format!("Failed to lock webviews: {e}"))?
.insert(service.clone(), webview_id.clone());
// Persist to database for restoration on app restart
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.execute(
"INSERT OR REPLACE INTO persistent_webviews
(id, service, webview_label, base_url, last_active)
VALUES (?1, ?2, ?3, ?4, datetime('now'))",
rusqlite::params![
uuid::Uuid::now_v7().to_string(),
service.clone(),
webview_id.clone(),
base_url.clone(),
],
)
.map_err(|e| format!("Failed to persist webview: {e}"))?;
tracing::info!("Persisted webview {} for service {}", webview_id, service);
// Set up window close handler to remove from tracking and database
if let Some(webview_window) = app_handle.get_webview_window(&webview_id) {
let service_clone = service.clone();
let db_arc = app_state.db.clone();
let webviews_arc = app_state.integration_webviews.clone();
webview_window.on_window_event(move |event| {
if let tauri::WindowEvent::CloseRequested { .. } = event {
let service = service_clone.clone();
let db = db_arc.clone();
let webviews = webviews_arc.clone();
// Spawn async task to clean up
tauri::async_runtime::spawn(async move {
// Remove from in-memory tracking
if let Ok(mut webviews_lock) = webviews.lock() {
webviews_lock.remove(&service);
tracing::info!("Removed {} from webview tracking", service);
}
// Remove from database
if let Ok(db_lock) = db.lock() {
if let Err(e) = db_lock.execute(
"DELETE FROM persistent_webviews WHERE service = ?1",
rusqlite::params![service],
) {
tracing::warn!("Failed to remove persistent webview from DB: {}", e);
} else {
tracing::info!("Removed {} from persistent webviews database", service);
}
}
});
}
});
}
Ok(WebviewAuthResponse {
success: true,
message: format!(
"{service} browser window opened. This window will stay open across app restarts - use it to browse and authenticate. Cookies are maintained automatically."
"{service} browser window opened. This window will stay open - use it to browse and authenticate. Cookies will be extracted automatically for API calls."
),
webview_id,
})
@@ -883,11 +605,16 @@ pub async fn extract_cookies_from_webview(
)
.map_err(|e| format!("Failed to store cookies: {e}"))?;
// NOTE: Window stays open for persistent browsing - no longer closing after cookie extraction
// Close the webview window
if let Some(webview) = app_handle.get_webview_window(&webview_id) {
webview
.close()
.map_err(|e| format!("Failed to close webview: {e}"))?;
}
Ok(ConnectionResult {
success: true,
message: format!("{service} authentication saved successfully. The browser window will stay open for future use."),
message: format!("{service} authentication saved successfully"),
})
}
@@ -1059,122 +786,6 @@ pub async fn get_fresh_cookies_from_webview(
}
}
// ============================================================================
// Persistent Webview Restoration
// ============================================================================
/// Restore persistent browser windows from database on app startup.
/// This recreates integration browser windows that were open when the app last closed.
pub async fn restore_persistent_webviews(
app_handle: &tauri::AppHandle,
app_state: &AppState,
) -> Result<(), String> {
let webviews_to_restore: Vec<(String, String, String)> = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
let mut stmt = db
.prepare("SELECT service, webview_label, base_url FROM persistent_webviews")
.map_err(|e| format!("Failed to prepare query: {e}"))?;
let rows: Vec<(String, String, String)> = stmt
.query_map([], |row| {
Ok((
row.get::<_, String>(0)?, // service
row.get::<_, String>(1)?, // webview_label
row.get::<_, String>(2)?, // base_url
))
})
.map_err(|e| format!("Failed to query persistent webviews: {e}"))?
.collect::<Result<Vec<_>, _>>()
.map_err(|e| format!("Failed to collect webviews: {e}"))?;
rows
};
for (service, webview_label, base_url) in webviews_to_restore {
tracing::info!(
"Restoring persistent webview {} for service {} at {}",
webview_label,
service,
base_url
);
// Get project name from integration config if available
let project_name: Option<String> = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.query_row(
"SELECT project_name FROM integration_config WHERE service = ?1",
[&service],
|row| row.get(0),
)
.ok()
};
// Recreate the webview window
match crate::integrations::webview_auth::authenticate_with_webview(
app_handle.clone(),
&service,
&base_url,
project_name.as_deref(),
)
.await
{
Ok(_) => {
// Store in memory tracking
app_state
.integration_webviews
.lock()
.map_err(|e| format!("Failed to lock webviews: {e}"))?
.insert(service.clone(), webview_label.clone());
tracing::info!("Successfully restored webview for {}", service);
}
Err(e) => {
tracing::warn!("Failed to restore webview for {}: {}", service, e);
// Remove from database if restoration failed
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.execute(
"DELETE FROM persistent_webviews WHERE service = ?1",
rusqlite::params![service],
)
.map_err(|e| format!("Failed to remove failed webview: {e}"))?;
}
}
}
Ok(())
}
/// Remove persistent webview from database (called when window is closed).
pub async fn remove_persistent_webview(
service: &str,
app_state: &State<'_, AppState>,
) -> Result<(), String> {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
db.execute(
"DELETE FROM persistent_webviews WHERE service = ?1",
rusqlite::params![service],
)
.map_err(|e| format!("Failed to remove persistent webview: {e}"))?;
tracing::info!("Removed persistent webview for service: {}", service);
Ok(())
}
// ============================================================================
// Integration Configuration Persistence
// ============================================================================
@@ -1280,51 +891,3 @@ pub async fn get_all_integration_configs(
Ok(configs)
}
/// Add a comment to an Azure DevOps work item
#[tauri::command]
pub async fn add_ado_comment(
work_item_id: i64,
comment_text: String,
app_handle: tauri::AppHandle,
app_state: State<'_, AppState>,
) -> Result<String, String> {
// Get ADO configuration
let (org_url, _project_name) = {
let db = app_state
.db
.lock()
.map_err(|e| format!("Failed to lock database: {e}"))?;
let mut stmt = db.prepare(
"SELECT base_url, project_name FROM integration_config WHERE service = 'azuredevops'"
).map_err(|e| format!("Failed to prepare query: {e}"))?;
stmt.query_row([], |row| {
Ok((row.get::<_, String>(0)?, row.get::<_, Option<String>>(1)?))
})
.map_err(|e| format!("Azure DevOps not configured: {e}"))?
};
// Get webview window
let webview_label = {
let webviews = app_state
.integration_webviews
.lock()
.map_err(|e| format!("Failed to lock webviews: {e}"))?;
webviews.get("azuredevops").cloned()
.ok_or_else(|| "Azure DevOps browser window not open. Please open it in Settings → Integrations first.".to_string())?
};
let webview_window = app_handle
.get_webview_window(&webview_label)
.ok_or_else(|| "Azure DevOps browser window not found".to_string())?;
// Add the comment
crate::integrations::webview_fetch::add_azuredevops_comment_webview(
&webview_window,
&org_url,
work_item_id,
&comment_text,
)
.await
}

View File

@@ -3,7 +3,7 @@ use crate::ollama::{
hardware, installer, manager, recommender, InstallGuide, ModelRecommendation, OllamaModel,
OllamaStatus,
};
use crate::state::{AppSettings, AppState, ProviderConfig};
use crate::state::{AppSettings, AppState};
// --- Ollama commands ---
@@ -141,133 +141,3 @@ pub async fn get_audit_log(
Ok(rows)
}
// --- AI Provider persistence commands ---
/// Save an AI provider configuration to encrypted database
#[tauri::command]
pub async fn save_ai_provider(
provider: ProviderConfig,
state: tauri::State<'_, AppState>,
) -> Result<(), String> {
// Encrypt the API key
let encrypted_key = crate::integrations::auth::encrypt_token(&provider.api_key)?;
let db = state.db.lock().map_err(|e| e.to_string())?;
db.execute(
"INSERT OR REPLACE INTO ai_providers
(id, name, provider_type, api_url, encrypted_api_key, model, max_tokens, temperature,
custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id, updated_at)
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12, ?13, datetime('now'))",
rusqlite::params![
uuid::Uuid::now_v7().to_string(),
provider.name,
provider.provider_type,
provider.api_url,
encrypted_key,
provider.model,
provider.max_tokens,
provider.temperature,
provider.custom_endpoint_path,
provider.custom_auth_header,
provider.custom_auth_prefix,
provider.api_format,
provider.user_id,
],
)
.map_err(|e| format!("Failed to save AI provider: {e}"))?;
Ok(())
}
/// Load all AI provider configurations from database
#[tauri::command]
pub async fn load_ai_providers(
state: tauri::State<'_, AppState>,
) -> Result<Vec<ProviderConfig>, String> {
let db = state.db.lock().map_err(|e| e.to_string())?;
let mut stmt = db
.prepare(
"SELECT name, provider_type, api_url, encrypted_api_key, model, max_tokens, temperature,
custom_endpoint_path, custom_auth_header, custom_auth_prefix, api_format, user_id
FROM ai_providers
ORDER BY name",
)
.map_err(|e| e.to_string())?;
let providers = stmt
.query_map([], |row| {
let encrypted_key: String = row.get(3)?;
Ok((
row.get::<_, String>(0)?, // name
row.get::<_, String>(1)?, // provider_type
row.get::<_, String>(2)?, // api_url
encrypted_key, // encrypted_api_key
row.get::<_, String>(4)?, // model
row.get::<_, Option<u32>>(5)?, // max_tokens
row.get::<_, Option<f64>>(6)?, // temperature
row.get::<_, Option<String>>(7)?, // custom_endpoint_path
row.get::<_, Option<String>>(8)?, // custom_auth_header
row.get::<_, Option<String>>(9)?, // custom_auth_prefix
row.get::<_, Option<String>>(10)?, // api_format
row.get::<_, Option<String>>(11)?, // user_id
))
})
.map_err(|e| e.to_string())?
.filter_map(|r| r.ok())
.filter_map(
|(
name,
provider_type,
api_url,
encrypted_key,
model,
max_tokens,
temperature,
custom_endpoint_path,
custom_auth_header,
custom_auth_prefix,
api_format,
user_id,
)| {
// Decrypt the API key
let api_key = crate::integrations::auth::decrypt_token(&encrypted_key).ok()?;
Some(ProviderConfig {
name,
provider_type,
api_url,
api_key,
model,
max_tokens,
temperature,
custom_endpoint_path,
custom_auth_header,
custom_auth_prefix,
api_format,
session_id: None, // Session IDs are not persisted
user_id,
})
},
)
.collect();
Ok(providers)
}
/// Delete an AI provider configuration
#[tauri::command]
pub async fn delete_ai_provider(
name: String,
state: tauri::State<'_, AppState>,
) -> Result<(), String> {
let db = state.db.lock().map_err(|e| e.to_string())?;
db.execute("DELETE FROM ai_providers WHERE name = ?1", [&name])
.map_err(|e| format!("Failed to delete AI provider: {e}"))?;
Ok(())
}

View File

@@ -81,82 +81,16 @@ pub fn open_dev_db(path: &Path) -> anyhow::Result<Connection> {
Ok(conn)
}
/// Migrates a plain SQLite database to an encrypted SQLCipher database.
/// Creates a backup of the original file before migration.
fn migrate_plain_to_encrypted(db_path: &Path, key: &str) -> anyhow::Result<Connection> {
tracing::warn!("Detected plain SQLite database in release build - migrating to encrypted");
// Create backup of plain database
let backup_path = db_path.with_extension("db.plain-backup");
std::fs::copy(db_path, &backup_path)?;
tracing::info!("Backed up plain database to {:?}", backup_path);
// Open the plain database
let plain_conn = Connection::open(db_path)?;
// Create temporary encrypted database path
let temp_encrypted = db_path.with_extension("db.encrypted-temp");
// Attach and migrate to encrypted database using SQLCipher export
plain_conn.execute_batch(&format!(
"ATTACH DATABASE '{}' AS encrypted KEY '{}';\
PRAGMA encrypted.cipher_page_size = 16384;\
PRAGMA encrypted.kdf_iter = 256000;\
PRAGMA encrypted.cipher_hmac_algorithm = HMAC_SHA512;\
PRAGMA encrypted.cipher_kdf_algorithm = PBKDF2_HMAC_SHA512;",
temp_encrypted.display(),
key.replace('\'', "''")
))?;
// Export all data to encrypted database
plain_conn.execute_batch("SELECT sqlcipher_export('encrypted');")?;
plain_conn.execute_batch("DETACH DATABASE encrypted;")?;
drop(plain_conn);
// Replace original with encrypted version
std::fs::rename(&temp_encrypted, db_path)?;
tracing::info!("Successfully migrated database to encrypted format");
// Open and return the encrypted database
open_encrypted_db(db_path, key)
}
/// Checks if a database file is plain SQLite by reading its header.
fn is_plain_sqlite(path: &Path) -> bool {
if let Ok(mut file) = std::fs::File::open(path) {
use std::io::Read;
let mut header = [0u8; 16];
if file.read_exact(&mut header).is_ok() {
// SQLite databases start with "SQLite format 3\0"
return &header == b"SQLite format 3\0";
}
}
false
}
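The header probe above can be exercised in isolation. This is a dependency-free sketch (std only; the temp-file names are illustrative) of the same 16-byte magic check — a plain SQLite file starts with `"SQLite format 3\0"`, while SQLCipher encrypts the entire first page, so the magic is absent from an encrypted database:

```rust
use std::io::Read;

// True iff the buffer begins with SQLite's 16-byte magic string.
fn is_sqlite_header(header: &[u8]) -> bool {
    header.starts_with(b"SQLite format 3\0")
}

// File-level probe mirroring is_plain_sqlite: missing or short files
// are treated as "not plain", same as the original's `false` fallback.
fn probe_file(path: &std::path::Path) -> bool {
    let mut header = [0u8; 16];
    std::fs::File::open(path)
        .and_then(|mut f| f.read_exact(&mut header))
        .map(|_| is_sqlite_header(&header))
        .unwrap_or(false)
}

fn main() {
    let dir = std::env::temp_dir();
    let plain = dir.join("magic-demo-plain.db"); // hypothetical paths
    let scrambled = dir.join("magic-demo-scrambled.db");
    std::fs::write(&plain, b"SQLite format 3\0rest-of-header").unwrap();
    std::fs::write(&scrambled, [0xAAu8; 32]).unwrap();
    assert!(probe_file(&plain));
    assert!(!probe_file(&scrambled));
    println!("header probe ok");
}
```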
pub fn init_db(data_dir: &Path) -> anyhow::Result<Connection> {
std::fs::create_dir_all(data_dir)?;
let db_path = data_dir.join("trcaa.db");
let db_path = data_dir.join("tftsr.db");
let key = get_db_key(data_dir)?;
let conn = if cfg!(debug_assertions) {
open_dev_db(&db_path)?
} else {
// In release mode, try encrypted first
match open_encrypted_db(&db_path, &key) {
Ok(conn) => conn,
Err(e) => {
// Check if error is due to trying to decrypt a plain SQLite database
if db_path.exists() && is_plain_sqlite(&db_path) {
// Auto-migrate from plain to encrypted
migrate_plain_to_encrypted(&db_path, &key)?
} else {
// Different error - propagate it
return Err(e);
}
}
}
open_encrypted_db(&db_path, &key)?
};
crate::db::migrations::run_migrations(&conn)?;
@@ -168,22 +102,13 @@ mod tests {
use super::*;
fn temp_dir(name: &str) -> std::path::PathBuf {
use std::time::SystemTime;
let timestamp = SystemTime::now()
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap()
.as_nanos();
let dir = std::env::temp_dir().join(format!("tftsr-test-{}-{}", name, timestamp));
// Clean up if it exists
let _ = std::fs::remove_dir_all(&dir);
let dir = std::env::temp_dir().join(format!("tftsr-test-{}", name));
std::fs::create_dir_all(&dir).unwrap();
dir
}
#[test]
fn test_get_db_key_uses_env_var_when_present() {
// Remove any existing env var first
std::env::remove_var("TFTSR_DB_KEY");
let dir = temp_dir("env-var");
std::env::set_var("TFTSR_DB_KEY", "test-db-key");
let key = get_db_key(&dir).unwrap();
@@ -193,43 +118,10 @@ mod tests {
#[test]
fn test_get_db_key_debug_fallback_for_empty_env() {
// Remove any existing env var first
std::env::remove_var("TFTSR_DB_KEY");
let dir = temp_dir("empty-env");
std::env::set_var("TFTSR_DB_KEY", " ");
let key = get_db_key(&dir).unwrap();
assert_eq!(key, "dev-key-change-in-prod");
std::env::remove_var("TFTSR_DB_KEY");
}
#[test]
fn test_is_plain_sqlite_detects_plain_database() {
let dir = temp_dir("plain-detect");
let db_path = dir.join("test.db");
// Create a plain SQLite database
let conn = Connection::open(&db_path).unwrap();
conn.execute("CREATE TABLE test (id INTEGER)", []).unwrap();
drop(conn);
assert!(is_plain_sqlite(&db_path));
}
#[test]
fn test_is_plain_sqlite_rejects_encrypted() {
let dir = temp_dir("encrypted-detect");
let db_path = dir.join("test.db");
// Create an encrypted database
let conn = Connection::open(&db_path).unwrap();
conn.execute_batch(
"PRAGMA key = 'test-key';\
PRAGMA cipher_page_size = 16384;",
)
.unwrap();
conn.execute("CREATE TABLE test (id INTEGER)", []).unwrap();
drop(conn);
assert!(!is_plain_sqlite(&db_path));
}
}

View File

@@ -155,41 +155,6 @@ pub fn run_migrations(conn: &Connection) -> anyhow::Result<()> {
"ALTER TABLE audit_log ADD COLUMN prev_hash TEXT NOT NULL DEFAULT '';
ALTER TABLE audit_log ADD COLUMN entry_hash TEXT NOT NULL DEFAULT '';",
),
(
"013_create_persistent_webviews",
"CREATE TABLE IF NOT EXISTS persistent_webviews (
id TEXT PRIMARY KEY,
service TEXT NOT NULL CHECK(service IN ('confluence','servicenow','azuredevops')),
webview_label TEXT NOT NULL,
base_url TEXT NOT NULL,
last_active TEXT NOT NULL DEFAULT (datetime('now')),
window_x INTEGER,
window_y INTEGER,
window_width INTEGER,
window_height INTEGER,
UNIQUE(service)
);",
),
(
"014_create_ai_providers",
"CREATE TABLE IF NOT EXISTS ai_providers (
id TEXT PRIMARY KEY,
name TEXT NOT NULL UNIQUE,
provider_type TEXT NOT NULL,
api_url TEXT NOT NULL,
encrypted_api_key TEXT NOT NULL,
model TEXT NOT NULL,
max_tokens INTEGER,
temperature REAL,
custom_endpoint_path TEXT,
custom_auth_header TEXT,
custom_auth_prefix TEXT,
api_format TEXT,
user_id TEXT,
created_at TEXT NOT NULL DEFAULT (datetime('now')),
updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);",
),
];
for (name, sql) in migrations {

View File

@@ -179,60 +179,7 @@ fn get_encryption_key_material() -> Result<String, String> {
return Ok("dev-key-change-me-in-production-32b".to_string());
}
// Release: load or auto-generate a per-installation encryption key
// stored in the app data directory, similar to the database key.
if let Some(app_data_dir) = crate::state::get_app_data_dir() {
let key_path = app_data_dir.join(".enckey");
// Try to load existing key
if key_path.exists() {
if let Ok(key) = std::fs::read_to_string(&key_path) {
let key = key.trim().to_string();
if !key.is_empty() {
return Ok(key);
}
}
}
// Generate and store new key
use rand::RngCore;
let mut bytes = [0u8; 32];
rand::rngs::OsRng.fill_bytes(&mut bytes);
let key = hex::encode(bytes);
// Ensure directory exists
if let Err(e) = std::fs::create_dir_all(&app_data_dir) {
tracing::warn!("Failed to create app data directory: {e}");
return Err(format!("Failed to create app data directory: {e}"));
}
// Write key with restricted permissions
#[cfg(unix)]
{
use std::io::Write;
use std::os::unix::fs::OpenOptionsExt;
let mut f = std::fs::OpenOptions::new()
.write(true)
.create(true)
.truncate(true)
.mode(0o600)
.open(&key_path)
.map_err(|e| format!("Failed to write encryption key: {e}"))?;
f.write_all(key.as_bytes())
.map_err(|e| format!("Failed to write encryption key: {e}"))?;
}
#[cfg(not(unix))]
{
std::fs::write(&key_path, &key)
.map_err(|e| format!("Failed to write encryption key: {e}"))?;
}
tracing::info!("Generated new encryption key at {:?}", key_path);
return Ok(key);
}
Err("Failed to determine app data directory for encryption key storage".to_string())
Err("TFTSR_ENCRYPTION_KEY must be set in release builds".to_string())
}
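The removed fallback hex-encoded 32 bytes from `OsRng` via the `hex` crate. As a reference point, the encoding itself is small enough to sketch with std only (the fixed input below is a stand-in, not a real key):

```rust
use std::fmt::Write;

// Lowercase hex, matching hex::encode's output format: each input byte
// becomes two hex digits, so a 32-byte key yields a 64-char string.
fn hex_encode(bytes: &[u8]) -> String {
    let mut out = String::with_capacity(bytes.len() * 2);
    for b in bytes {
        write!(out, "{b:02x}").expect("writing to a String cannot fail");
    }
    out
}

fn main() {
    let fake_key = [0u8; 32]; // stand-in for OsRng.fill_bytes output
    assert_eq!(hex_encode(&fake_key).len(), 64);
    assert_eq!(hex_encode(&[0x00, 0x0f, 0xff]), "000fff");
    println!("hex encode ok");
}
```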
fn derive_aes_key() -> Result<[u8; 32], String> {

View File

@@ -1,265 +0,0 @@
use super::confluence_search::SearchResult;
/// Search Azure DevOps Wiki for content matching the query
pub async fn search_wiki(
org_url: &str,
project: &str,
query: &str,
cookies: &[crate::integrations::webview_auth::Cookie],
) -> Result<Vec<SearchResult>, String> {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();
// Use Azure DevOps Search API
let search_url = format!(
"{}/_apis/search/wikisearchresults?api-version=7.0",
org_url.trim_end_matches('/')
);
let search_body = serde_json::json!({
"searchText": query,
"$top": 5,
"filters": {
"ProjectFilters": [project]
}
});
tracing::info!("Searching Azure DevOps Wiki: {}", search_url);
let resp = client
.post(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&search_body)
.send()
.await
.map_err(|e| format!("Azure DevOps wiki search failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!(
"Azure DevOps wiki search failed with status {status}: {text}"
));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ADO wiki search response: {e}"))?;
let mut results = Vec::new();
if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(3) {
let title = item["fileName"].as_str().unwrap_or("Untitled").to_string();
let path = item["path"].as_str().unwrap_or("");
let url = format!(
"{}/_wiki/wikis/{}/{}",
org_url.trim_end_matches('/'),
project,
path
);
let excerpt = item["content"]
.as_str()
.unwrap_or("")
.chars()
.take(300)
.collect::<String>();
// Fetch full wiki page content
let content = if let Some(wiki_id) = item["wiki"]["id"].as_str() {
if let Some(page_path) = item["path"].as_str() {
fetch_wiki_page(org_url, wiki_id, page_path, &cookie_header)
.await
.ok()
} else {
None
}
} else {
None
};
results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Azure DevOps".to_string(),
});
}
}
Ok(results)
}
/// Fetch full wiki page content
async fn fetch_wiki_page(
org_url: &str,
wiki_id: &str,
page_path: &str,
cookie_header: &str,
) -> Result<String, String> {
let client = reqwest::Client::new();
let page_url = format!(
"{}/_apis/wiki/wikis/{}/pages?path={}&api-version=7.0&includeContent=true",
org_url.trim_end_matches('/'),
wiki_id,
urlencoding::encode(page_path)
);
let resp = client
.get(&page_url)
.header("Cookie", cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Failed to fetch wiki page: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
return Err(format!("Failed to fetch wiki page: {status}"));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse wiki page: {e}"))?;
let content = json["content"].as_str().unwrap_or("").to_string();
// Truncate to reasonable length
let truncated = if content.len() > 3000 {
format!("{}...", &content[..3000])
} else {
content
};
Ok(truncated)
}
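One caveat with the truncation above (and the identical pattern in the Confluence module): `&content[..3000]` slices by byte offset and panics if byte 3000 falls inside a multi-byte UTF-8 character, which wiki content can easily contain. A boundary-safe variant, sketched with std only:

```rust
/// Truncate to at most `max_bytes` of UTF-8, backing up to the nearest
/// character boundary so the slice can never panic mid-code-point.
fn truncate_with_ellipsis(s: &str, max_bytes: usize) -> String {
    if s.len() <= max_bytes {
        return s.to_string();
    }
    let mut end = max_bytes;
    while !s.is_char_boundary(end) {
        end -= 1; // is_char_boundary(0) is always true, so this terminates
    }
    format!("{}...", &s[..end])
}

fn main() {
    assert_eq!(truncate_with_ellipsis("short", 3000), "short");
    // "é" is 2 bytes; a cut at byte 2 would split it, so we back up to 1
    assert_eq!(truncate_with_ellipsis("héllo", 2), "h...");
    println!("truncate ok");
}
```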
/// Search Azure DevOps Work Items for related issues
pub async fn search_work_items(
org_url: &str,
project: &str,
query: &str,
cookies: &[crate::integrations::webview_auth::Cookie],
) -> Result<Vec<SearchResult>, String> {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();
// Use WIQL (Work Item Query Language)
let wiql_url = format!(
"{}/_apis/wit/wiql?api-version=7.0",
org_url.trim_end_matches('/')
);
let wiql_query = format!(
"SELECT [System.Id], [System.Title], [System.Description], [System.State] FROM WorkItems WHERE [System.TeamProject] = '{project}' AND ([System.Title] CONTAINS '{query}' OR [System.Description] CONTAINS '{query}') ORDER BY [System.ChangedDate] DESC"
);
let wiql_body = serde_json::json!({
"query": wiql_query
});
tracing::info!("Searching Azure DevOps work items");
let resp = client
.post(&wiql_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.header("Content-Type", "application/json")
.json(&wiql_body)
.send()
.await
.map_err(|e| format!("ADO work item search failed: {e}"))?;
if !resp.status().is_success() {
return Ok(Vec::new()); // Don't fail if work item search fails
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|_| "Failed to parse work item response".to_string())?;
let mut results = Vec::new();
if let Some(work_items) = json["workItems"].as_array() {
// Fetch details for top 3 work items
for item in work_items.iter().take(3) {
if let Some(id) = item["id"].as_i64() {
if let Ok(work_item) = fetch_work_item_details(org_url, id, &cookie_header).await {
results.push(work_item);
}
}
}
}
Ok(results)
}
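Note that the WIQL string above interpolates `query` verbatim, so a search term containing a single quote would produce malformed WIQL (or let user input alter the query). WIQL, like SQL, escapes a quote inside a string literal by doubling it; a minimal pre-escape sketch (the helper name is ours, not from the codebase):

```rust
// Double single quotes so user input stays inside its WIQL string
// literal, mirroring the key escaping used in the SQLCipher ATTACH.
fn escape_wiql_literal(input: &str) -> String {
    input.replace('\'', "''")
}

fn main() {
    let query = escape_wiql_literal("O'Brien's error");
    let wiql = format!("... WHERE [System.Title] CONTAINS '{query}'");
    assert!(wiql.contains("O''Brien''s error"));
    println!("wiql escape ok");
}
```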
/// Fetch work item details
async fn fetch_work_item_details(
org_url: &str,
id: i64,
cookie_header: &str,
) -> Result<SearchResult, String> {
let client = reqwest::Client::new();
let item_url = format!(
"{}/_apis/wit/workitems/{}?api-version=7.0",
org_url.trim_end_matches('/'),
id
);
let resp = client
.get(&item_url)
.header("Cookie", cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Failed to fetch work item: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
return Err(format!("Failed to fetch work item: {status}"));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse work item: {e}"))?;
let fields = &json["fields"];
let title = format!(
"Work Item {}: {}",
id,
fields["System.Title"].as_str().unwrap_or("No title")
);
let url = json["_links"]["html"]["href"]
.as_str()
.unwrap_or("")
.to_string();
let description = fields["System.Description"]
.as_str()
.unwrap_or("")
.to_string();
let state = fields["System.State"].as_str().unwrap_or("Unknown");
let content = format!("State: {state}\n\nDescription: {description}");
let excerpt = content.chars().take(200).collect::<String>();
Ok(SearchResult {
title,
url,
excerpt,
content: Some(content),
source: "Azure DevOps".to_string(),
})
}

View File

@@ -1,188 +0,0 @@
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SearchResult {
pub title: String,
pub url: String,
pub excerpt: String,
pub content: Option<String>,
pub source: String, // "confluence", "servicenow", "azuredevops"
}
/// Search Confluence for content matching the query
pub async fn search_confluence(
base_url: &str,
query: &str,
cookies: &[crate::integrations::webview_auth::Cookie],
) -> Result<Vec<SearchResult>, String> {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();
// Use Confluence CQL search
let search_url = format!(
"{}/rest/api/search?cql=text~\"{}\"&limit=5",
base_url.trim_end_matches('/'),
urlencoding::encode(query)
);
tracing::info!("Searching Confluence: {}", search_url);
let resp = client
.get(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Confluence search request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!(
"Confluence search failed with status {status}: {text}"
));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Confluence search response: {e}"))?;
let mut results = Vec::new();
if let Some(results_array) = json["results"].as_array() {
for item in results_array.iter().take(3) {
// Take top 3 results
let title = item["title"].as_str().unwrap_or("Untitled").to_string();
let id = item["content"]["id"].as_str();
let space_key = item["content"]["space"]["key"].as_str();
// Build URL
let url = if let (Some(id_str), Some(space)) = (id, space_key) {
format!(
"{}/display/{}/{}",
base_url.trim_end_matches('/'),
space,
id_str
)
} else {
base_url.to_string()
};
// Get excerpt from search result
let excerpt = item["excerpt"]
.as_str()
.unwrap_or("")
.to_string()
.replace("<span class=\"highlight\">", "")
.replace("</span>", "");
// Fetch full page content
let content = if let Some(content_id) = id {
fetch_page_content(base_url, content_id, &cookie_header)
.await
.ok()
} else {
None
};
results.push(SearchResult {
title,
url,
excerpt,
content,
source: "Confluence".to_string(),
});
}
}
Ok(results)
}
/// Fetch full content of a Confluence page
async fn fetch_page_content(
base_url: &str,
page_id: &str,
cookie_header: &str,
) -> Result<String, String> {
let client = reqwest::Client::new();
let content_url = format!(
"{}/rest/api/content/{}?expand=body.storage",
base_url.trim_end_matches('/'),
page_id
);
let resp = client
.get(&content_url)
.header("Cookie", cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("Failed to fetch page content: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
return Err(format!("Failed to fetch page: {status}"));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse page content: {e}"))?;
// Extract plain text from HTML storage format
let html = json["body"]["storage"]["value"]
.as_str()
.unwrap_or("")
.to_string();
// Basic HTML tag stripping (for better results, use a proper HTML parser)
let text = strip_html_tags(&html);
// Truncate to reasonable length for AI context
let truncated = if text.len() > 3000 {
format!("{}...", &text[..3000])
} else {
text
};
Ok(truncated)
}
/// Basic HTML tag stripping
fn strip_html_tags(html: &str) -> String {
let mut result = String::new();
let mut in_tag = false;
for ch in html.chars() {
match ch {
'<' => in_tag = true,
'>' => in_tag = false,
_ if !in_tag => result.push(ch),
_ => {}
}
}
// Clean up whitespace
result
.split_whitespace()
.collect::<Vec<_>>()
.join(" ")
.trim()
.to_string()
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_strip_html_tags() {
let html = "<p>Hello <strong>world</strong>!</p>";
assert_eq!(strip_html_tags(html), "Hello world!");
let html2 = "<div><h1>Title</h1><p>Content</p></div>";
assert_eq!(strip_html_tags(html2), "TitleContent");
}
}
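Every search function in these modules starts by flattening cookies with `webview_auth::cookies_to_header`, whose body is not shown in this diff. A plausible sketch of that helper, under the assumption that it emits standard `Cookie` request-header syntax (RFC 6265 `name=value` pairs joined by `"; "`):

```rust
// Minimal stand-in for webview_auth::Cookie; the real struct also
// carries domain/path/secure fields that a header does not need.
struct Cookie {
    name: String,
    value: String,
}

// Assumed behavior of cookies_to_header: join each cookie as
// name=value, separated by "; ", per the Cookie header format.
fn cookies_to_header(cookies: &[Cookie]) -> String {
    cookies
        .iter()
        .map(|c| format!("{}={}", c.name, c.value))
        .collect::<Vec<_>>()
        .join("; ")
}

fn main() {
    let cookies = vec![
        Cookie { name: "JSESSIONID".into(), value: "abc123".into() },
        Cookie { name: "glide_user".into(), value: "xyz".into() },
    ];
    assert_eq!(cookies_to_header(&cookies), "JSESSIONID=abc123; glide_user=xyz");
    println!("cookie header ok");
}
```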

View File

@@ -1,13 +1,9 @@
pub mod auth;
pub mod azuredevops;
pub mod azuredevops_search;
pub mod callback_server;
pub mod confluence;
pub mod confluence_search;
pub mod servicenow;
pub mod servicenow_search;
pub mod webview_auth;
pub mod webview_fetch;
use serde::{Deserialize, Serialize};

View File

@@ -1,45 +0,0 @@
/// Platform-specific native cookie extraction from webview
/// This can access HttpOnly cookies that JavaScript cannot
use super::webview_auth::Cookie;
#[cfg(target_os = "macos")]
pub async fn extract_cookies_native(
window_label: &str,
domain: &str,
) -> Result<Vec<Cookie>, String> {
// On macOS, we can use WKWebView's HTTPCookieStore via Objective-C bridge
// This requires cocoa/objc crates which we don't have yet
// For now, return an error indicating this needs implementation
tracing::warn!("Native cookie extraction not yet implemented for macOS");
Err("Native cookie extraction requires additional dependencies (cocoa, objc)".to_string())
}
#[cfg(target_os = "windows")]
pub async fn extract_cookies_native(
window_label: &str,
domain: &str,
) -> Result<Vec<Cookie>, String> {
// On Windows, we can use WebView2's cookie manager
// This requires windows crates
tracing::warn!("Native cookie extraction not yet implemented for Windows");
Err("Native cookie extraction requires additional dependencies (windows crate)".to_string())
}
#[cfg(target_os = "linux")]
pub async fn extract_cookies_native(
window_label: &str,
domain: &str,
) -> Result<Vec<Cookie>, String> {
// On Linux with WebKitGTK, we can use the cookie manager
tracing::warn!("Native cookie extraction not yet implemented for Linux");
Err("Native cookie extraction requires additional dependencies (webkit2gtk)".to_string())
}
#[cfg(not(any(target_os = "macos", target_os = "windows", target_os = "linux")))]
pub async fn extract_cookies_native(
_window_label: &str,
_domain: &str,
) -> Result<Vec<Cookie>, String> {
Err("Native cookie extraction not supported on this platform".to_string())
}

View File

@@ -1,50 +0,0 @@
/// macOS-specific native cookie extraction using WKWebView's HTTPCookieStore
/// This can access HttpOnly cookies that JavaScript cannot
#[cfg(target_os = "macos")]
use super::webview_auth::Cookie;
#[cfg(target_os = "macos")]
pub async fn extract_cookies_native(
webview_label: &str,
domain: &str,
) -> Result<Vec<Cookie>, String> {
use cocoa::base::{id, nil};
use cocoa::foundation::{NSArray, NSString};
use objc::runtime::{Class, Object};
use objc::{msg_send, sel, sel_impl};
tracing::info!("Attempting native cookie extraction for {} on domain {}", webview_label, domain);
unsafe {
// Get the WKWebsiteDataStore (where cookies are stored)
let wk_websitedata_store_class = Class::get("WKWebsiteDataStore").ok_or("WKWebsiteDataStore class not found")?;
let data_store: id = msg_send![wk_websitedata_store_class, defaultDataStore];
if data_store == nil {
return Err("Failed to get WKWebsiteDataStore".to_string());
}
// Get the HTTPCookieStore
let cookie_store: id = msg_send![data_store, httpCookieStore];
if cookie_store == nil {
return Err("Failed to get HTTPCookieStore".to_string());
}
// Unfortunately, WKHTTPCookieStore's getAllCookies method requires a completion handler
// which is complex to bridge from Rust. For now, we'll document this limitation
// and suggest using the Tauri cookie plugin when it's available.
tracing::warn!("Native cookie extraction requires async completion handler - not yet fully implemented");
Err("Native cookie extraction requires Tauri cookie plugin (coming in future Tauri version)".to_string())
}
}
#[cfg(not(target_os = "macos"))]
pub async fn extract_cookies_native(
_webview_label: &str,
_domain: &str,
) -> Result<Vec<super::webview_auth::Cookie>, String> {
Err("Native cookie extraction only supported on macOS".to_string())
}

View File

@@ -1,163 +0,0 @@
use super::confluence_search::SearchResult;
/// Search ServiceNow Knowledge Base for content matching the query
pub async fn search_servicenow(
instance_url: &str,
query: &str,
cookies: &[crate::integrations::webview_auth::Cookie],
) -> Result<Vec<SearchResult>, String> {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();
// Search Knowledge Base articles
let search_url = format!(
"{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=5",
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query)
);
tracing::info!("Searching ServiceNow: {}", search_url);
let resp = client
.get(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("ServiceNow search request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!(
"ServiceNow search failed with status {status}: {text}"
));
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ServiceNow search response: {e}"))?;
let mut results = Vec::new();
if let Some(result_array) = json["result"].as_array() {
for item in result_array.iter().take(3) {
// Take top 3 results
let title = item["short_description"]
.as_str()
.unwrap_or("Untitled")
.to_string();
let sys_id = item["sys_id"].as_str().unwrap_or("").to_string();
let url = format!(
"{}/kb_view.do?sysparm_article={}",
instance_url.trim_end_matches('/'),
sys_id
);
let excerpt = item["text"]
.as_str()
.unwrap_or("")
.chars()
.take(300)
.collect::<String>();
// Get full article content
let content = item["text"].as_str().map(|text| {
if text.len() > 3000 {
format!("{}...", &text[..3000])
} else {
text.to_string()
}
});
results.push(SearchResult {
title,
url,
excerpt,
content,
source: "ServiceNow".to_string(),
});
}
}
Ok(results)
}
/// Search ServiceNow Incidents for related issues
pub async fn search_incidents(
instance_url: &str,
query: &str,
cookies: &[crate::integrations::webview_auth::Cookie],
) -> Result<Vec<SearchResult>, String> {
let cookie_header = crate::integrations::webview_auth::cookies_to_header(cookies);
let client = reqwest::Client::new();
// Search incidents
let search_url = format!(
"{}/api/now/table/incident?sysparm_query=short_descriptionLIKE{}^ORdescriptionLIKE{}&sysparm_limit=3&sysparm_display_value=true",
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query)
);
tracing::info!("Searching ServiceNow incidents: {}", search_url);
let resp = client
.get(&search_url)
.header("Cookie", &cookie_header)
.header("Accept", "application/json")
.send()
.await
.map_err(|e| format!("ServiceNow incident search failed: {e}"))?;
if !resp.status().is_success() {
return Ok(Vec::new()); // Don't fail if incident search fails
}
let json: serde_json::Value = resp
.json()
.await
.map_err(|_| "Failed to parse incident response".to_string())?;
let mut results = Vec::new();
if let Some(result_array) = json["result"].as_array() {
for item in result_array.iter() {
let number = item["number"].as_str().unwrap_or("Unknown");
let title = format!(
"Incident {}: {}",
number,
item["short_description"].as_str().unwrap_or("No title")
);
let sys_id = item["sys_id"].as_str().unwrap_or("");
let url = format!(
"{}/incident.do?sys_id={}",
instance_url.trim_end_matches('/'),
sys_id
);
let description = item["description"].as_str().unwrap_or("").to_string();
let resolution = item["close_notes"].as_str().unwrap_or("").to_string();
let content = format!("Description: {description}\nResolution: {resolution}");
let excerpt = content.chars().take(200).collect::<String>();
results.push(SearchResult {
title,
url,
excerpt,
content: Some(content),
source: "ServiceNow".to_string(),
});
}
}
Ok(results)
}

View File

@@ -1,5 +1,5 @@
use serde::{Deserialize, Serialize};
use tauri::{AppHandle, WebviewUrl, WebviewWindow, WebviewWindowBuilder};
use tauri::{AppHandle, Listener, WebviewUrl, WebviewWindow, WebviewWindowBuilder};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ExtractedCredentials {
@@ -24,53 +24,30 @@ pub async fn authenticate_with_webview(
app_handle: AppHandle,
service: &str,
base_url: &str,
project_name: Option<&str>,
) -> Result<ExtractedCredentials, String> {
let trimmed_base_url = base_url.trim_end_matches('/');
tracing::info!(
"authenticate_with_webview called: service={}, base_url={}, project_name={:?}",
service,
base_url,
project_name
);
let login_url = match service {
"confluence" => format!("{trimmed_base_url}/login.action"),
"azuredevops" => {
// Azure DevOps - go directly to project if provided, otherwise org home
if let Some(project) = project_name {
let url = format!("{trimmed_base_url}/{project}");
tracing::info!("Azure DevOps URL with project: {}", url);
url
} else {
tracing::info!("Azure DevOps URL without project: {}", trimmed_base_url);
trimmed_base_url.to_string()
}
// Azure DevOps login - user will be redirected through Microsoft SSO
format!("{trimmed_base_url}/_signin")
}
"servicenow" => format!("{trimmed_base_url}/login.do"),
_ => return Err(format!("Unknown service: {service}")),
};
tracing::info!("Final login_url for {} = {}", service, login_url);
tracing::info!(
"Opening persistent browser for {} at {}",
service,
login_url
);
// Create persistent browser window (stays open for browsing and fresh cookie extraction)
let webview_label = format!("{service}-auth");
tracing::info!("Creating webview window with label: {}", webview_label);
let parsed_url = login_url.parse().map_err(|e| {
let err_msg = format!("Failed to parse URL '{login_url}': {e}");
tracing::error!("{err_msg}");
err_msg
})?;
tracing::info!("Parsed URL successfully: {:?}", parsed_url);
let webview = WebviewWindowBuilder::new(
&app_handle,
&webview_label,
WebviewUrl::External(parsed_url),
WebviewUrl::External(login_url.parse().map_err(|e| format!("Invalid URL: {e}"))?),
)
.title(format!(
"{service} Browser (Troubleshooting and RCA Assistant)"
@@ -80,20 +57,14 @@ .resizable(true)
.resizable(true)
.center()
.focused(true)
.visible(true) // Show immediately - let user see loading
.user_agent("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36")
.zoom_hotkeys_enabled(true)
.devtools(true)
.initialization_script("console.log('Webview initialized');")
.visible(true)
.build()
.map_err(|e| format!("Failed to create webview: {e}"))?;
tracing::info!("Webview window created successfully, setting focus");
// Ensure window is focused
// Focus the window
webview
.set_focus()
.map_err(|e| tracing::warn!("Failed to set focus: {}", e))
.map_err(|e| tracing::warn!("Failed to focus webview: {e}"))
.ok();
// Wait for user to complete login
@@ -106,158 +77,121 @@ })
})
}
/// Extract cookies from a webview using localStorage as intermediary.
/// This works for external URLs where window.__TAURI__ is not available.
/// Extract cookies from a webview using Tauri's IPC mechanism.
/// This is the most reliable cross-platform approach.
pub async fn extract_cookies_via_ipc<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
_app_handle: &AppHandle<R>,
app_handle: &AppHandle<R>,
) -> Result<Vec<Cookie>, String> {
// Step 1: Inject JavaScript to extract cookies and store in a global variable
// We can't use __TAURI__ for external URLs, so we use a polling approach
// Inject JavaScript that will send cookies via IPC
// Note: We use window.__TAURI__ which is the Tauri 2.x API exposed to webviews
let cookie_extraction_script = r#"
(function() {
(async function() {
try {
const cookieString = document.cookie;
const cookies = [];
if (cookieString && cookieString.trim() !== '') {
const cookieList = cookieString.split(';').map(c => c.trim()).filter(c => c.length > 0);
for (const cookie of cookieList) {
const equalIndex = cookie.indexOf('=');
if (equalIndex === -1) continue;
const name = cookie.substring(0, equalIndex).trim();
const value = cookie.substring(equalIndex + 1).trim();
cookies.push({
name: name,
value: value,
domain: window.location.hostname,
path: '/',
secure: window.location.protocol === 'https:',
http_only: false,
expires: null
});
}
// Wait for Tauri API to be available
if (typeof window.__TAURI__ === 'undefined') {
console.error('Tauri API not available');
return;
}
// Store in a global variable that Rust can read
window.__TFTSR_COOKIES__ = cookies;
console.log('[TFTSR] Extracted', cookies.length, 'cookies');
return cookies.length;
const cookieString = document.cookie;
if (!cookieString || cookieString.trim() === '') {
await window.__TAURI__.event.emit('tftsr-cookies-extracted', { cookies: [] });
return;
}
const cookies = cookieString.split(';').map(c => c.trim()).filter(c => c.length > 0);
const parsed = cookies.map(cookie => {
const equalIndex = cookie.indexOf('=');
if (equalIndex === -1) return null;
const name = cookie.substring(0, equalIndex).trim();
const value = cookie.substring(equalIndex + 1).trim();
return {
name: name,
value: value,
domain: window.location.hostname,
path: '/',
secure: window.location.protocol === 'https:',
http_only: false,
expires: null
};
}).filter(c => c !== null);
// Use Tauri's event API to send cookies back to Rust
await window.__TAURI__.event.emit('tftsr-cookies-extracted', { cookies: parsed });
console.log('Cookies extracted and emitted:', parsed.length);
} catch (e) {
console.error('[TFTSR] Cookie extraction failed:', e);
window.__TFTSR_COOKIES__ = [];
window.__TFTSR_ERROR__ = e.message;
return -1;
console.error('Cookie extraction failed:', e);
try {
await window.__TAURI__.event.emit('tftsr-cookies-extracted', { cookies: [], error: e.message });
} catch (emitError) {
console.error('Failed to emit error:', emitError);
}
}
})();
"#;
// Inject the extraction script
// Set up event listener first
let (tx, mut rx) = tokio::sync::mpsc::channel::<Result<Vec<Cookie>, String>>(1);
// Listen for the custom event from the webview
let listen_id = app_handle.listen("tftsr-cookies-extracted", move |event| {
tracing::debug!("Received cookies-extracted event");
let payload_str = event.payload();
// Parse the payload JSON
match serde_json::from_str::<serde_json::Value>(payload_str) {
Ok(payload) => {
if let Some(error_msg) = payload.get("error").and_then(|e| e.as_str()) {
let _ = tx.try_send(Err(format!("JavaScript error: {error_msg}")));
return;
}
if let Some(cookies_value) = payload.get("cookies") {
match serde_json::from_value::<Vec<Cookie>>(cookies_value.clone()) {
Ok(cookies) => {
tracing::info!("Parsed {} cookies from webview", cookies.len());
let _ = tx.try_send(Ok(cookies));
}
Err(e) => {
tracing::error!("Failed to parse cookies: {e}");
let _ = tx.try_send(Err(format!("Failed to parse cookies: {e}")));
}
}
} else {
let _ = tx.try_send(Err("No cookies field in payload".to_string()));
}
}
Err(e) => {
tracing::error!("Failed to parse event payload: {e}");
let _ = tx.try_send(Err(format!("Failed to parse event payload: {e}")));
}
}
});
// Inject the script into the webview
webview_window
.eval(cookie_extraction_script)
.map_err(|e| format!("Failed to inject cookie extraction script: {e}"))?;
tracing::info!("Cookie extraction script injected, waiting for cookies...");
// Give JavaScript a moment to execute
tokio::time::sleep(tokio::time::Duration::from_millis(500)).await;
// Wait for cookies from the event channel, with a timeout
let result: Result<Vec<Cookie>, String> =
tokio::time::timeout(tokio::time::Duration::from_secs(10), rx.recv())
.await
.map_err(|_| {
"Timeout waiting for cookies. Make sure you are logged in and on the correct page."
.to_string()
})
.and_then(|opt| opt.ok_or_else(|| "Failed to receive cookies from webview".to_string()))
.and_then(|inner| inner);
// Clean up event listener
app_handle.unlisten(listen_id);
// If the event path delivered cookies, return them immediately
if let Ok(ref cookies) = result {
return Ok(cookies.clone());
}
// Step 2 (fallback): poll for the extracted cookies using document.title as communication channel
let mut attempts = 0;
let max_attempts = 20; // 10 seconds total (500ms * 20)
loop {
attempts += 1;
// Store result in localStorage, then copy to document.title for Rust to read
let check_and_signal_script = r#"
try {
if (typeof window.__TFTSR_ERROR__ !== 'undefined') {
window.localStorage.setItem('tftsr_result', JSON.stringify({ error: window.__TFTSR_ERROR__ }));
} else if (typeof window.__TFTSR_COOKIES__ !== 'undefined' && window.__TFTSR_COOKIES__.length > 0) {
window.localStorage.setItem('tftsr_result', JSON.stringify({ cookies: window.__TFTSR_COOKIES__ }));
} else if (typeof window.__TFTSR_COOKIES__ !== 'undefined') {
window.localStorage.setItem('tftsr_result', JSON.stringify({ cookies: [] }));
}
} catch (e) {
window.localStorage.setItem('tftsr_result', JSON.stringify({ error: e.message }));
}
"#;
webview_window.eval(check_and_signal_script).ok();
tokio::time::sleep(tokio::time::Duration::from_millis(500)).await;
// We can't get return values from eval(), so let's use a different approach:
// Execute script that sets document.title temporarily
let read_via_title = r#"
(function() {
const result = window.localStorage.getItem('tftsr_result');
if (result) {
window.localStorage.removeItem('tftsr_result');
// Store in title temporarily for Rust to read
window.__TFTSR_ORIGINAL_TITLE__ = document.title;
document.title = 'TFTSR_RESULT:' + result;
}
})();
"#;
webview_window.eval(read_via_title).ok();
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
// Read the title
if let Ok(title) = webview_window.title() {
if let Some(json_str) = title.strip_prefix("TFTSR_RESULT:") {
// Restore original title
let restore_title = r#"
if (typeof window.__TFTSR_ORIGINAL_TITLE__ !== 'undefined') {
document.title = window.__TFTSR_ORIGINAL_TITLE__;
}
"#;
webview_window.eval(restore_title).ok();
// Parse the JSON
match serde_json::from_str::<serde_json::Value>(json_str) {
Ok(result) => {
if let Some(error) = result.get("error").and_then(|e| e.as_str()) {
return Err(format!("Cookie extraction error: {error}"));
}
if let Some(cookies_value) = result.get("cookies") {
match serde_json::from_value::<Vec<Cookie>>(cookies_value.clone()) {
Ok(cookies) => {
tracing::info!(
"Successfully extracted {} cookies",
cookies.len()
);
return Ok(cookies);
}
Err(e) => {
return Err(format!("Failed to parse cookies: {e}"));
}
}
}
}
Err(e) => {
tracing::warn!("Failed to parse result JSON: {e}");
}
}
}
}
if attempts >= max_attempts {
return Err(
"Timeout extracting cookies. This may be because:\n\
1. Confluence uses HttpOnly cookies that JavaScript cannot access\n\
2. You're not logged in yet\n\
3. The page hasn't finished loading\n\n\
Recommendation: Use 'Manual Token' authentication with a Confluence Personal Access Token instead."
.to_string(),
);
}
}
}
/// Build cookie header string for HTTP requests
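The helper documented above is cut off in this view. A minimal sketch of what a cookie-header builder typically looks like, assuming a simplified name/value pair type (`CookiePair` is hypothetical; the app's real `Cookie` struct also carries domain, path, secure, http_only, and expires fields):

```rust
/// Hypothetical minimal cookie pair used only for this sketch.
struct CookiePair {
    name: String,
    value: String,
}

/// Join cookies into a single `Cookie:` request-header value,
/// e.g. "name1=value1; name2=value2".
fn build_cookie_header(cookies: &[CookiePair]) -> String {
    cookies
        .iter()
        .map(|c| format!("{}={}", c.name, c.value))
        .collect::<Vec<_>>()
        .join("; ")
}

fn main() {
    let cookies = vec![
        CookiePair { name: "JSESSIONID".into(), value: "abc123".into() },
        CookiePair { name: "atl.xsrf.token".into(), value: "tok".into() },
    ];
    assert_eq!(
        build_cookie_header(&cookies),
        "JSESSIONID=abc123; atl.xsrf.token=tok"
    );
    println!("{}", build_cookie_header(&cookies));
}
```

Per RFC 6265, only `name=value` pairs are sent in the `Cookie:` request header; the other attributes matter only when storing cookies.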


@ -1,687 +0,0 @@
/// Webview-based HTTP fetching that automatically includes HttpOnly cookies
/// Makes requests FROM the authenticated webview using JavaScript fetch API
///
/// Results are passed back through window.location.hash, which the Rust side polls
use serde_json::Value;
use tauri::WebviewWindow;
use super::confluence_search::SearchResult;
/// Execute an HTTP request from within the webview context
/// This automatically includes all cookies (including HttpOnly) from the authenticated session
pub async fn fetch_from_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
url: &str,
method: &str,
body: Option<&str>,
) -> Result<Value, String> {
let request_id = uuid::Uuid::now_v7().to_string();
let (headers_js, body_js) = if let Some(b) = body {
// For POST/PUT with JSON body
(
"headers: { 'Accept': 'application/json', 'Content-Type': 'application/json' }",
format!(", body: JSON.stringify({b})"),
)
} else {
// For GET requests
("headers: { 'Accept': 'application/json' }", String::new())
};
// Inject script that:
// 1. Makes fetch request with credentials
// 2. Uses window.location.hash to communicate results back
let fetch_script = format!(
r#"
(async function() {{
const requestId = '{request_id}';
try {{
const response = await fetch('{url}', {{
method: '{method}',
{headers_js},
credentials: 'include'{body_js}
}});
if (!response.ok) {{
window.location.hash = '#trcaa-error-' + requestId + '-' + encodeURIComponent(JSON.stringify({{
error: `HTTP ${{response.status}}: ${{response.statusText}}`
}}));
return;
}}
const data = await response.json();
// Store in hash - we'll poll for this
window.location.hash = '#trcaa-success-' + requestId + '-' + encodeURIComponent(JSON.stringify(data));
}} catch (error) {{
window.location.hash = '#trcaa-error-' + requestId + '-' + encodeURIComponent(JSON.stringify({{
error: error.message
}}));
}}
}})();
"#
);
// Execute the fetch
webview_window
.eval(&fetch_script)
.map_err(|e| format!("Failed to execute fetch: {e}"))?;
// Poll for result by checking window URL/hash
for i in 0..50 {
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
// Get the current URL to check the hash
if let Ok(url_str) = webview_window.url() {
let url_string = url_str.to_string();
// Check for success
let success_marker = format!("#trcaa-success-{request_id}-");
if url_string.contains(&success_marker) {
// Extract the JSON from the hash
if let Some(json_start) = url_string.find(&success_marker) {
let json_encoded = &url_string[json_start + success_marker.len()..];
if let Ok(decoded) = urlencoding::decode(json_encoded) {
// Clear the hash
webview_window.eval("window.location.hash = '';").ok();
// Parse JSON
if let Ok(result) = serde_json::from_str::<Value>(&decoded) {
tracing::info!("Webview fetch successful");
return Ok(result);
}
}
}
}
// Check for error
let error_marker = format!("#trcaa-error-{request_id}-");
if url_string.contains(&error_marker) {
if let Some(json_start) = url_string.find(&error_marker) {
let json_encoded = &url_string[json_start + error_marker.len()..];
if let Ok(decoded) = urlencoding::decode(json_encoded) {
// Clear the hash
webview_window.eval("window.location.hash = '';").ok();
return Err(format!("Webview fetch error: {decoded}"));
}
}
}
}
if i % 10 == 0 {
tracing::debug!("Waiting for webview fetch... ({}s)", i / 10);
}
}
Err("Timeout waiting for webview fetch response (5s)".to_string())
}
/// Search Confluence using webview fetch (includes HttpOnly cookies automatically)
pub async fn search_confluence_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
base_url: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
// Extract keywords from the query for better search
// Remove common words and extract important terms
let keywords = extract_keywords(query);
// Build CQL query with OR logic for keywords
let cql = if keywords.len() > 1 {
// Multiple keywords - search for any of them
let keyword_conditions: Vec<String> =
keywords.iter().map(|k| format!("text ~ \"{k}\"")).collect();
keyword_conditions.join(" OR ")
} else if !keywords.is_empty() {
// Single keyword
let keyword = &keywords[0];
format!("text ~ \"{keyword}\"")
} else {
// Fallback to original query
format!("text ~ \"{query}\"")
};
let search_url = format!(
"{}/rest/api/search?cql={}&limit=10",
base_url.trim_end_matches('/'),
urlencoding::encode(&cql)
);
tracing::info!("Executing Confluence search via webview with CQL: {}", cql);
let response = fetch_from_webview(webview_window, &search_url, "GET", None).await?;
let mut results = Vec::new();
if let Some(results_array) = response.get("results").and_then(|v| v.as_array()) {
for item in results_array.iter().take(5) {
let title = item["title"].as_str().unwrap_or("Untitled").to_string();
let content_id = item["content"]["id"].as_str();
let space_key = item["content"]["space"]["key"].as_str();
let url = if let (Some(id), Some(space)) = (content_id, space_key) {
format!(
"{}/display/{}/{}",
base_url.trim_end_matches('/'),
space,
id
)
} else {
base_url.to_string()
};
let excerpt = item["excerpt"]
.as_str()
.unwrap_or("")
.replace("<span class=\"highlight\">", "")
.replace("</span>", "");
// Fetch full page content
let content = if let Some(id) = content_id {
let content_url = format!(
"{}/rest/api/content/{id}?expand=body.storage",
base_url.trim_end_matches('/')
);
if let Ok(content_resp) =
fetch_from_webview(webview_window, &content_url, "GET", None).await
{
if let Some(body) = content_resp
.get("body")
.and_then(|b| b.get("storage"))
.and_then(|s| s.get("value"))
.and_then(|v| v.as_str())
{
let text = strip_html_simple(body);
// Truncate on a char boundary to avoid panicking on multi-byte UTF-8
Some(if text.chars().count() > 3000 {
format!("{}...", text.chars().take(3000).collect::<String>())
} else {
text
})
} else {
None
}
} else {
None
}
} else {
None
};
results.push(SearchResult {
title,
url,
excerpt: excerpt.chars().take(300).collect(),
content,
source: "Confluence".to_string(),
});
}
}
tracing::info!(
"Confluence webview search returned {} results",
results.len()
);
Ok(results)
}
/// Extract keywords from a search query
/// Removes stop words and extracts important terms
fn extract_keywords(query: &str) -> Vec<String> {
// Common stop words to filter out
let stop_words = vec![
"how", "do", "i", "the", "a", "an", "is", "are", "was", "were", "be", "been", "being",
"have", "has", "had", "having", "does", "did", "doing", "will", "would", "should",
"could", "can", "may", "might", "must", "to", "from", "in", "on", "at", "by", "for",
"with", "about", "as", "of", "or", "and", "but", "not", "what", "when", "where", "which",
"who",
];
let mut keywords = Vec::new();
// Split on whitespace and punctuation
for word in query.split(|c: char| c.is_whitespace() || c == '?' || c == '!' || c == '.') {
let cleaned = word.trim().to_lowercase();
// Skip if empty, too short, or a stop word
if cleaned.is_empty() || cleaned.len() < 2 || stop_words.contains(&cleaned.as_str()) {
continue;
}
// Keep version numbers (e.g., "1.0.12")
if cleaned.contains('.') && cleaned.chars().any(|c| c.is_numeric()) {
keywords.push(cleaned);
continue;
}
// Keep ticket numbers and IDs (pure numbers >= 3 digits)
if cleaned.chars().all(|c| c.is_numeric()) && cleaned.len() >= 3 {
keywords.push(cleaned);
continue;
}
// Keep if it has letters
if cleaned.chars().any(|c| c.is_alphabetic()) {
keywords.push(cleaned);
}
}
// Deduplicate
keywords.sort();
keywords.dedup();
keywords
}
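The token classification rules in `extract_keywords` can be condensed into a single predicate. This is a hypothetical standalone sketch (`keep_token` is not in the codebase, and the stop-word filtering step is omitted here):

```rust
// Condensed sketch of extract_keywords' keep/drop rules: keep version
// numbers like "1.0.12", numeric IDs of 3+ digits, and words containing
// letters; drop single characters and short pure numbers.
fn keep_token(cleaned: &str) -> bool {
    if cleaned.len() < 2 {
        return false;
    }
    // Version numbers (contain a dot and at least one digit)
    if cleaned.contains('.') && cleaned.chars().any(|c| c.is_numeric()) {
        return true;
    }
    // Ticket numbers / IDs: pure numbers of 3+ digits
    if cleaned.chars().all(|c| c.is_numeric()) {
        return cleaned.len() >= 3;
    }
    // Otherwise keep only if it contains letters
    cleaned.chars().any(|c| c.is_alphabetic())
}

fn main() {
    assert!(keep_token("1.0.12")); // version number
    assert!(keep_token("12345")); // ticket-style ID
    assert!(!keep_token("12")); // short pure number
    assert!(keep_token("timeout")); // ordinary word
    assert!(!keep_token("i")); // too short
    println!("ok");
}
```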
/// Simple HTML tag stripping (for content preview)
fn strip_html_simple(html: &str) -> String {
let mut result = String::new();
let mut in_tag = false;
for ch in html.chars() {
match ch {
'<' => in_tag = true,
'>' => in_tag = false,
_ if !in_tag => result.push(ch),
_ => {}
}
}
result.split_whitespace().collect::<Vec<_>>().join(" ")
}
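The tag-stripping state machine above can be exercised in isolation; this sketch reproduces the function body verbatim for a runnable check:

```rust
// Standalone copy of strip_html_simple: skip characters between '<' and
// '>', then collapse whitespace by re-joining on single spaces.
fn strip_html_simple(html: &str) -> String {
    let mut result = String::new();
    let mut in_tag = false;
    for ch in html.chars() {
        match ch {
            '<' => in_tag = true,
            '>' => in_tag = false,
            _ if !in_tag => result.push(ch),
            _ => {}
        }
    }
    result.split_whitespace().collect::<Vec<_>>().join(" ")
}

fn main() {
    let html = "<p>Restart the <b>service</b>,\n then check logs.</p>";
    assert_eq!(strip_html_simple(html), "Restart the service, then check logs.");
    println!("{}", strip_html_simple(html));
}
```

Note this approach misreads a literal `<` or `>` inside text or attribute values; it is adequate only as a rough preview, which is how the module uses it.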
/// Search ServiceNow using webview fetch
pub async fn search_servicenow_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
instance_url: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
let mut results = Vec::new();
// Search knowledge base
let kb_url = format!(
"{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=3",
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query)
);
tracing::info!("Executing ServiceNow KB search via webview");
if let Ok(kb_response) = fetch_from_webview(webview_window, &kb_url, "GET", None).await {
if let Some(kb_array) = kb_response.get("result").and_then(|v| v.as_array()) {
for item in kb_array {
let title = item["short_description"]
.as_str()
.unwrap_or("Untitled")
.to_string();
let sys_id = item["sys_id"].as_str().unwrap_or("");
let url = format!(
"{}/kb_view.do?sysparm_article={sys_id}",
instance_url.trim_end_matches('/')
);
let text = item["text"].as_str().unwrap_or("");
let excerpt = text.chars().take(300).collect();
// Truncate on a char boundary to avoid panicking on multi-byte UTF-8
let content = Some(if text.chars().count() > 3000 {
format!("{}...", text.chars().take(3000).collect::<String>())
} else {
text.to_string()
});
results.push(SearchResult {
title,
url,
excerpt,
content,
source: "ServiceNow".to_string(),
});
}
}
}
// Search incidents
let inc_url = format!(
"{}/api/now/table/incident?sysparm_query=short_descriptionLIKE{}^ORdescriptionLIKE{}&sysparm_limit=3&sysparm_display_value=true",
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query)
);
if let Ok(inc_response) = fetch_from_webview(webview_window, &inc_url, "GET", None).await {
if let Some(inc_array) = inc_response.get("result").and_then(|v| v.as_array()) {
for item in inc_array {
let number = item["number"].as_str().unwrap_or("Unknown");
let title = format!(
"Incident {}: {}",
number,
item["short_description"].as_str().unwrap_or("No title")
);
let sys_id = item["sys_id"].as_str().unwrap_or("");
let url = format!(
"{}/incident.do?sys_id={sys_id}",
instance_url.trim_end_matches('/')
);
let description = item["description"].as_str().unwrap_or("");
let resolution = item["close_notes"].as_str().unwrap_or("");
let content = format!("Description: {description}\nResolution: {resolution}");
let excerpt = content.chars().take(200).collect();
results.push(SearchResult {
title,
url,
excerpt,
content: Some(content),
source: "ServiceNow".to_string(),
});
}
}
}
tracing::info!(
"ServiceNow webview search returned {} results",
results.len()
);
Ok(results)
}
/// Search Azure DevOps wiki using webview fetch
pub async fn search_azuredevops_wiki_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
org_url: &str,
project: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
// Extract keywords for better search
let keywords = extract_keywords(query);
let search_text = if !keywords.is_empty() {
keywords.join(" ")
} else {
query.to_string()
};
// List the project's wikis; pages are then fetched and searched client-side
let search_url = format!(
"{}/{}/_apis/wiki/wikis?api-version=7.0",
org_url.trim_end_matches('/'),
urlencoding::encode(project)
);
tracing::info!(
"Executing Azure DevOps wiki search via webview for: {}",
search_text
);
// First, get list of wikis
let wikis_response = fetch_from_webview(webview_window, &search_url, "GET", None).await?;
let mut results = Vec::new();
if let Some(wikis_array) = wikis_response.get("value").and_then(|v| v.as_array()) {
// Search each wiki
for wiki in wikis_array.iter().take(3) {
let wiki_id = wiki["id"].as_str().unwrap_or("");
if wiki_id.is_empty() {
continue;
}
// Search wiki pages
let pages_url = format!(
"{}/{}/_apis/wiki/wikis/{}/pages?recursionLevel=Full&includeContent=true&api-version=7.0",
org_url.trim_end_matches('/'),
urlencoding::encode(project),
urlencoding::encode(wiki_id)
);
if let Ok(pages_response) =
fetch_from_webview(webview_window, &pages_url, "GET", None).await
{
// Try to get "page" field, or use the response itself if it's the page object
if let Some(page) = pages_response.get("page") {
search_page_recursive(
page,
&search_text,
org_url,
project,
wiki_id,
&mut results,
);
} else {
// Response might be the page object itself
search_page_recursive(
&pages_response,
&search_text,
org_url,
project,
wiki_id,
&mut results,
);
}
}
}
}
tracing::info!(
"Azure DevOps wiki webview search returned {} results",
results.len()
);
Ok(results)
}
/// Recursively search through wiki pages for matching content
fn search_page_recursive(
page: &Value,
search_text: &str,
org_url: &str,
_project: &str,
wiki_id: &str,
results: &mut Vec<SearchResult>,
) {
let search_lower = search_text.to_lowercase();
// Check current page
if let Some(path) = page.get("path").and_then(|p| p.as_str()) {
let content = page.get("content").and_then(|c| c.as_str()).unwrap_or("");
let content_lower = content.to_lowercase();
// Simple relevance check
let matches = search_lower
.split_whitespace()
.filter(|word| content_lower.contains(word))
.count();
if matches > 0 {
let page_id = page.get("id").and_then(|i| i.as_i64()).unwrap_or(0);
let title = path.trim_start_matches('/').replace('/', " > ");
let url = format!(
"{}/_wiki/wikis/{}/{}/{}",
org_url.trim_end_matches('/'),
urlencoding::encode(wiki_id),
page_id,
urlencoding::encode(path.trim_start_matches('/'))
);
// Create excerpt from first occurrence
let excerpt = if let Some(pos) =
content_lower.find(search_lower.split_whitespace().next().unwrap_or(""))
{
let start = pos.saturating_sub(50);
let end = (pos + 200).min(content.len());
// get() avoids a panic when start/end fall inside a multi-byte char
match content.get(start..end) {
Some(slice) => format!("...{slice}"),
None => content.chars().take(200).collect(),
}
} else {
content.chars().take(200).collect()
};
let result_content = if content.chars().count() > 3000 {
format!("{}...", content.chars().take(3000).collect::<String>())
} else {
content.to_string()
};
results.push(SearchResult {
title,
url,
excerpt,
content: Some(result_content),
source: "Azure DevOps Wiki".to_string(),
});
}
}
// Recurse into subpages
if let Some(subpages) = page.get("subPages").and_then(|s| s.as_array()) {
for subpage in subpages {
search_page_recursive(subpage, search_text, org_url, _project, wiki_id, results);
}
}
}
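The relevance check inside `search_page_recursive` is just a case-insensitive count of query words present in the page content. A minimal sketch of that scoring step on its own (`match_count` is a hypothetical helper name):

```rust
// Count how many words of the query appear in the content,
// case-insensitively — the same check search_page_recursive applies
// before keeping a page.
fn match_count(search_text: &str, content: &str) -> usize {
    let content_lower = content.to_lowercase();
    search_text
        .to_lowercase()
        .split_whitespace()
        .filter(|word| content_lower.contains(word))
        .count()
}

fn main() {
    let content = "Restart the payment service after deploying version 1.0.12";
    assert_eq!(match_count("payment service restart", content), 3);
    assert_eq!(match_count("kafka outage", content), 0);
    println!("ok");
}
```

Any count above zero keeps the page, so a single shared common word is enough to produce a result; a weighted score would rank pages more usefully.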
/// Search Azure DevOps work items using webview fetch
pub async fn search_azuredevops_workitems_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
org_url: &str,
project: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
// Extract keywords
let keywords = extract_keywords(query);
// Check if query contains a work item ID (pure number)
let work_item_id: Option<i64> = keywords
.iter()
.filter(|k| k.chars().all(|c| c.is_numeric()))
.filter_map(|k| k.parse::<i64>().ok())
.next();
// Build WIQL query
let wiql_query = if let Some(id) = work_item_id {
// Search by specific ID
format!(
"SELECT [System.Id], [System.Title], [System.Description], [System.WorkItemType] \
FROM WorkItems WHERE [System.Id] = {id}"
)
} else {
// Search by text in title/description
let search_terms = if !keywords.is_empty() {
keywords.join(" ")
} else {
query.to_string()
};
// Use CONTAINS for text search (case-insensitive)
format!(
"SELECT [System.Id], [System.Title], [System.Description], [System.WorkItemType] \
FROM WorkItems WHERE [System.TeamProject] = '{project}' \
AND ([System.Title] CONTAINS '{search_terms}' OR [System.Description] CONTAINS '{search_terms}') \
ORDER BY [System.ChangedDate] DESC"
)
};
let wiql_url = format!(
"{}/{}/_apis/wit/wiql?api-version=7.0",
org_url.trim_end_matches('/'),
urlencoding::encode(project)
);
let body = serde_json::json!({
"query": wiql_query
})
.to_string();
tracing::info!("Executing Azure DevOps work item search via webview");
tracing::debug!("WIQL query: {}", wiql_query);
tracing::debug!("Request URL: {}", wiql_url);
let wiql_response = fetch_from_webview(webview_window, &wiql_url, "POST", Some(&body)).await?;
let mut results = Vec::new();
if let Some(work_items) = wiql_response.get("workItems").and_then(|v| v.as_array()) {
// Fetch details for first 5 work items
for item in work_items.iter().take(5) {
if let Some(id) = item.get("id").and_then(|i| i.as_i64()) {
let details_url = format!(
"{}/_apis/wit/workitems/{}?api-version=7.0",
org_url.trim_end_matches('/'),
id
);
if let Ok(details) =
fetch_from_webview(webview_window, &details_url, "GET", None).await
{
if let Some(fields) = details.get("fields") {
let title = fields
.get("System.Title")
.and_then(|t| t.as_str())
.unwrap_or("Untitled");
let work_item_type = fields
.get("System.WorkItemType")
.and_then(|t| t.as_str())
.unwrap_or("Item");
let description = fields
.get("System.Description")
.and_then(|d| d.as_str())
.unwrap_or("");
let clean_description = strip_html_simple(description);
let excerpt = clean_description.chars().take(200).collect();
let url = format!("{}/_workitems/edit/{id}", org_url.trim_end_matches('/'));
// Truncate on a char boundary to avoid panicking on multi-byte UTF-8
let full_content = if clean_description.chars().count() > 3000 {
format!("{}...", clean_description.chars().take(3000).collect::<String>())
} else {
clean_description.clone()
};
results.push(SearchResult {
title: format!("{work_item_type} #{id}: {title}"),
url,
excerpt,
content: Some(full_content),
source: "Azure DevOps".to_string(),
});
}
}
}
}
}
tracing::info!(
"Azure DevOps work items webview search returned {} results",
results.len()
);
Ok(results)
}
/// Add a comment to an Azure DevOps work item
pub async fn add_azuredevops_comment_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
org_url: &str,
work_item_id: i64,
comment_text: &str,
) -> Result<String, String> {
let comment_url = format!(
"{}/_apis/wit/workitems/{work_item_id}/comments?api-version=7.0",
org_url.trim_end_matches('/')
);
let body = serde_json::json!({
"text": comment_text
})
.to_string();
tracing::info!("Adding comment to Azure DevOps work item {}", work_item_id);
let response = fetch_from_webview(webview_window, &comment_url, "POST", Some(&body)).await?;
// Extract comment ID from response
let comment_id = response
.get("id")
.and_then(|id| id.as_i64())
.ok_or_else(|| "Failed to get comment ID from response".to_string())?;
tracing::info!("Successfully added comment {comment_id} to work item {work_item_id}");
Ok(format!("Comment added successfully (ID: {comment_id})"))
}
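The hash-based transport that `fetch_from_webview` polls for can be sketched in isolation: the injected script sets `window.location.hash` to `#trcaa-success-<id>-<urlencoded json>`, and the Rust side scans the window URL for that marker and slices out the payload. A standalone sketch of the parsing half (`extract_hash_payload` is a hypothetical helper; decoding of the payload is left out):

```rust
// Find the per-request success marker in the window URL and return the
// still-percent-encoded payload that follows it.
fn extract_hash_payload(url: &str, request_id: &str) -> Option<String> {
    let marker = format!("#trcaa-success-{request_id}-");
    url.find(&marker)
        .map(|start| url[start + marker.len()..].to_string())
}

fn main() {
    let url = "https://wiki.example.com/page#trcaa-success-42-%7B%22ok%22%3Atrue%7D";
    let payload = extract_hash_payload(url, "42").unwrap();
    assert_eq!(payload, "%7B%22ok%22%3Atrue%7D");
    // A different request id does not match
    assert_eq!(extract_hash_payload(url, "99"), None);
    println!("{payload}");
}
```

The per-request UUID in the marker keeps concurrent requests from reading each other's results; the payload must still be percent-decoded (the module uses the `urlencoding` crate) before JSON parsing.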


@ -1,287 +0,0 @@
/// Native webview-based search that automatically includes HttpOnly cookies
/// This bypasses cookie extraction by making requests directly from the authenticated webview
use serde::{Deserialize, Serialize};
use tauri::WebviewWindow;
use super::confluence_search::SearchResult;
/// Execute a search request from within the webview context
/// This automatically includes all cookies (including HttpOnly) from the authenticated session
pub async fn search_from_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
service: &str,
base_url: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
match service {
"confluence" => search_confluence_from_webview(webview_window, base_url, query).await,
"servicenow" => search_servicenow_from_webview(webview_window, base_url, query).await,
"azuredevops" => Ok(Vec::new()), // Not yet implemented
_ => Err(format!("Unsupported service: {service}")),
}
}
/// Search Confluence from within the authenticated webview
async fn search_confluence_from_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
base_url: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
let search_script = format!(
r#"
(async function() {{
try {{
// Search Confluence using the browser's authenticated session
const searchUrl = '{}/rest/api/search?cql=text~"{}"&limit=5';
const response = await fetch(searchUrl, {{
headers: {{
'Accept': 'application/json'
}},
credentials: 'include' // Include cookies automatically
}});
if (!response.ok) {{
return {{ error: `Search failed: ${{response.status}}` }};
}}
const data = await response.json();
const results = [];
if (data.results && Array.isArray(data.results)) {{
for (const item of data.results.slice(0, 3)) {{
const title = item.title || 'Untitled';
const contentId = item.content?.id;
const spaceKey = item.content?.space?.key;
let url = '{}';
if (contentId && spaceKey) {{
url = `{}/display/${{spaceKey}}/${{contentId}}`;
}}
const excerpt = (item.excerpt || '')
.replace(/<span class="highlight">/g, '')
.replace(/<\/span>/g, '');
// Fetch full page content
let content = null;
if (contentId) {{
try {{
const contentUrl = `{}/rest/api/content/${{contentId}}?expand=body.storage`;
const contentResp = await fetch(contentUrl, {{
headers: {{ 'Accept': 'application/json' }},
credentials: 'include'
}});
if (contentResp.ok) {{
const contentData = await contentResp.json();
let html = contentData.body?.storage?.value || '';
// Basic HTML stripping
const div = document.createElement('div');
div.innerHTML = html;
let text = div.textContent || div.innerText || '';
content = text.length > 3000 ? text.substring(0, 3000) + '...' : text;
}}
}} catch (e) {{
console.error('Failed to fetch page content:', e);
}}
}}
results.push({{
title,
url,
excerpt: excerpt.substring(0, 300),
content,
source: 'Confluence'
}});
}}
}}
return {{ results }};
}} catch (error) {{
return {{ error: error.message }};
}}
}})();
"#,
base_url.trim_end_matches('/'),
query.replace('"', "\\\""),
base_url,
base_url,
base_url
);
// Execute JavaScript and store result in localStorage for retrieval
let storage_key = format!("__trcaa_search_{}__", uuid::Uuid::now_v7());
let callback_script = format!(
r#"
{}
.then(result => {{
localStorage.setItem('{}', JSON.stringify(result));
}})
.catch(error => {{
localStorage.setItem('{}', JSON.stringify({{ error: error.message }}));
}});
"#,
search_script,
storage_key,
storage_key
);
webview_window
.eval(&callback_script)
.map_err(|e| format!("Failed to execute search: {}", e))?;
// Poll for the result in localStorage. Note: eval() cannot return a value
// to Rust, so this loop can only clear any stored result; it has no way to
// actually read it back, which is why this approach cannot work
for _ in 0..50 {
// Try for 5 seconds
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
let cleanup_script = format!(
r#"(function() {{
const val = localStorage.getItem('{storage_key}');
if (val) {{
localStorage.removeItem('{storage_key}');
}}
}})();"#
);
if webview_window.eval(&cleanup_script).is_err() {
continue;
}
}
// No result can be retrieved via eval(), so report empty results
tracing::warn!("Webview search timed out, returning empty results");
Ok(Vec::new())
}
/// Search ServiceNow from within the authenticated webview
async fn search_servicenow_from_webview<R: tauri::Runtime>(
webview_window: &WebviewWindow<R>,
instance_url: &str,
query: &str,
) -> Result<Vec<SearchResult>, String> {
let search_script = format!(
r#"
(async function() {{
try {{
const results = [];
// Search knowledge base
const kbUrl = '{}/api/now/table/kb_knowledge?sysparm_query=textLIKE{}^ORshort_descriptionLIKE{}&sysparm_limit=3';
const kbResp = await fetch(kbUrl, {{
headers: {{ 'Accept': 'application/json' }},
credentials: 'include'
}});
if (kbResp.ok) {{
const kbData = await kbResp.json();
if (kbData.result && Array.isArray(kbData.result)) {{
for (const item of kbData.result) {{
const title = item.short_description || 'Untitled';
const sysId = item.sys_id || '';
const url = `{}/kb_view.do?sysparm_article=${{sysId}}`;
const text = item.text || '';
const excerpt = text.substring(0, 300);
const content = text.length > 3000 ? text.substring(0, 3000) + '...' : text;
results.push({{
title,
url,
excerpt,
content,
source: 'ServiceNow'
}});
}}
}}
}}
// Search incidents
const incUrl = '{}/api/now/table/incident?sysparm_query=short_descriptionLIKE{}^ORdescriptionLIKE{}&sysparm_limit=3&sysparm_display_value=true';
const incResp = await fetch(incUrl, {{
headers: {{ 'Accept': 'application/json' }},
credentials: 'include'
}});
if (incResp.ok) {{
const incData = await incResp.json();
if (incData.result && Array.isArray(incData.result)) {{
for (const item of incData.result) {{
const number = item.number || 'Unknown';
const title = `Incident ${{number}}: ${{item.short_description || 'No title'}}`;
const sysId = item.sys_id || '';
const url = `{}/incident.do?sys_id=${{sysId}}`;
const description = item.description || '';
const resolution = item.close_notes || '';
const content = `Description: ${{description}}\\nResolution: ${{resolution}}`;
const excerpt = content.substring(0, 200);
results.push({{
title,
url,
excerpt,
content,
source: 'ServiceNow'
}});
}}
}}
}}
return {{ results }};
}} catch (error) {{
return {{ error: error.message }};
}}
}})();
"#,
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query),
instance_url.trim_end_matches('/'),
instance_url.trim_end_matches('/'),
urlencoding::encode(query),
urlencoding::encode(query),
instance_url.trim_end_matches('/')
);
// eval() cannot return the script's value to Rust, so the results object
// built by the injected script is never actually received here; this
// limitation is why the module was superseded
webview_window
.eval(&search_script)
.map_err(|e| format!("Failed to execute search: {e}"))?;
let result = serde_json::Value::Null;
if let Some(error) = result.get("error") {
return Err(format!("Search error: {error}"));
}
if let Some(results_array) = result.get("results").and_then(|v| v.as_array()) {
let mut results = Vec::new();
for item in results_array {
if let Ok(search_result) = serde_json::from_value::<SearchResult>(item.clone()) {
results.push(search_result);
}
}
Ok(results)
} else {
Ok(Vec::new())
}
}
/// Search Azure DevOps from within the authenticated webview
#[allow(dead_code)]
async fn search_azuredevops_from_webview<R: tauri::Runtime>(
_webview_window: &WebviewWindow<R>,
_org_url: &str,
_query: &str,
) -> Result<Vec<SearchResult>, String> {
// Azure DevOps search requires project parameter, which we don't have here
// This would need to be passed in from the config
// For now, return empty results
tracing::warn!("Azure DevOps webview search not yet implemented");
Ok(Vec::new())
}


@ -11,7 +11,6 @@ pub mod state;
use sha2::{Digest, Sha256};
use state::AppState;
use std::sync::{Arc, Mutex};
use tauri::Manager;
#[cfg_attr(mobile, tauri::mobile_entry_point)]
pub fn run() {
@ -26,7 +25,7 @@ pub fn run() {
tracing::info!("Starting Troubleshooting and RCA Assistant application");
// Determine data directory
let data_dir = state::get_app_data_dir().expect("Failed to determine app data directory");
let data_dir = dirs_data_dir();
// Initialize database
let conn = db::connection::init_db(&data_dir).expect("Failed to initialize database");
@ -58,35 +57,6 @@ pub fn run() {
.plugin(tauri_plugin_shell::init())
.plugin(tauri_plugin_http::init())
.manage(app_state)
.setup(|app| {
// Restore persistent browser windows from previous session
let app_handle = app.handle().clone();
let state: tauri::State<AppState> = app.state();
// Clone Arc fields for 'static lifetime
let db = state.db.clone();
let settings = state.settings.clone();
let app_data_dir = state.app_data_dir.clone();
let integration_webviews = state.integration_webviews.clone();
tauri::async_runtime::spawn(async move {
let app_state = AppState {
db,
settings,
app_data_dir,
integration_webviews,
};
if let Err(e) =
commands::integrations::restore_persistent_webviews(&app_handle, &app_state)
.await
{
tracing::warn!("Failed to restore persistent webviews: {}", e);
}
});
Ok(())
})
.invoke_handler(tauri::generate_handler![
// DB / Issue CRUD
commands::db::create_issue,
@ -128,7 +98,6 @@ pub fn run() {
commands::integrations::save_integration_config,
commands::integrations::get_integration_config,
commands::integrations::get_all_integration_configs,
commands::integrations::add_ado_comment,
// System / Settings
commands::system::check_ollama_installed,
commands::system::get_ollama_install_guide,
@ -140,10 +109,48 @@ pub fn run() {
commands::system::get_settings,
commands::system::update_settings,
commands::system::get_audit_log,
commands::system::save_ai_provider,
commands::system::load_ai_providers,
commands::system::delete_ai_provider,
])
.run(tauri::generate_context!())
.expect("Error running Troubleshooting and RCA Assistant application");
}
/// Determine the application data directory.
fn dirs_data_dir() -> std::path::PathBuf {
if let Ok(dir) = std::env::var("TFTSR_DATA_DIR") {
return std::path::PathBuf::from(dir);
}
// Use platform-appropriate data directory
#[cfg(target_os = "linux")]
{
if let Ok(xdg) = std::env::var("XDG_DATA_HOME") {
return std::path::PathBuf::from(xdg).join("trcaa");
}
if let Ok(home) = std::env::var("HOME") {
return std::path::PathBuf::from(home)
.join(".local")
.join("share")
.join("trcaa");
}
}
#[cfg(target_os = "macos")]
{
if let Ok(home) = std::env::var("HOME") {
return std::path::PathBuf::from(home)
.join("Library")
.join("Application Support")
.join("trcaa");
}
}
#[cfg(target_os = "windows")]
{
if let Ok(appdata) = std::env::var("APPDATA") {
return std::path::PathBuf::from(appdata).join("trcaa");
}
}
// Fallback
std::path::PathBuf::from("./trcaa-data")
}
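The fallback chain implemented by `dirs_data_dir` above (explicit `TFTSR_DATA_DIR` override, then the platform data directory, then `./trcaa-data`) can be restated as a small pure function. This TypeScript sketch is illustrative only — it covers the Linux branch and injects the environment as a parameter instead of reading `process.env`:

```typescript
interface EnvLike {
  TFTSR_DATA_DIR?: string;
  XDG_DATA_HOME?: string;
  HOME?: string;
}

// Mirrors the Linux resolution order of dirs_data_dir: explicit override
// first, then $XDG_DATA_HOME/trcaa, then ~/.local/share/trcaa, then the
// relative ./trcaa-data fallback.
function resolveDataDir(env: EnvLike): string {
  if (env.TFTSR_DATA_DIR) return env.TFTSR_DATA_DIR;
  if (env.XDG_DATA_HOME) return `${env.XDG_DATA_HOME}/trcaa`;
  if (env.HOME) return `${env.HOME}/.local/share/trcaa`;
  return "./trcaa-data";
}

console.log(resolveDataDir({ TFTSR_DATA_DIR: "/srv/trcaa" })); // "/srv/trcaa"
console.log(resolveDataDir({ HOME: "/home/sa" })); // "/home/sa/.local/share/trcaa"
```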


@@ -72,49 +72,3 @@ pub struct AppState {
/// These windows stay open for the user to browse and for fresh cookie extraction
pub integration_webviews: Arc<Mutex<HashMap<String, String>>>,
}
/// Determine the application data directory.
/// Returns None if the directory cannot be determined.
pub fn get_app_data_dir() -> Option<PathBuf> {
if let Ok(dir) = std::env::var("TFTSR_DATA_DIR") {
return Some(PathBuf::from(dir));
}
// Use platform-appropriate data directory
#[cfg(target_os = "linux")]
{
if let Ok(xdg) = std::env::var("XDG_DATA_HOME") {
return Some(PathBuf::from(xdg).join("trcaa"));
}
if let Ok(home) = std::env::var("HOME") {
return Some(
PathBuf::from(home)
.join(".local")
.join("share")
.join("trcaa"),
);
}
}
#[cfg(target_os = "macos")]
{
if let Ok(home) = std::env::var("HOME") {
return Some(
PathBuf::from(home)
.join("Library")
.join("Application Support")
.join("trcaa"),
);
}
}
#[cfg(target_os = "windows")]
{
if let Ok(appdata) = std::env::var("APPDATA") {
return Some(PathBuf::from(appdata).join("trcaa"));
}
}
// Fallback
Some(PathBuf::from("./trcaa-data"))
}


@@ -15,7 +15,6 @@ import {
Moon,
} from "lucide-react";
import { useSettingsStore } from "@/stores/settingsStore";
import { loadAiProvidersCmd, testProviderConnectionCmd } from "@/lib/tauriCommands";
import Dashboard from "@/pages/Dashboard";
import NewIssue from "@/pages/NewIssue";
@@ -46,38 +45,13 @@ const settingsItems = [
export default function App() {
const [collapsed, setCollapsed] = useState(false);
const [appVersion, setAppVersion] = useState("");
const { theme, setTheme, setProviders, getActiveProvider } = useSettingsStore();
const { theme, setTheme } = useSettingsStore();
const location = useLocation();
useEffect(() => {
getVersion().then(setAppVersion).catch(() => {});
}, []);
// Load providers and auto-test active provider on startup
useEffect(() => {
const initializeProviders = async () => {
try {
const providers = await loadAiProvidersCmd();
setProviders(providers);
// Auto-test the active provider
const activeProvider = getActiveProvider();
if (activeProvider) {
console.log("Auto-testing active AI provider:", activeProvider.name);
try {
await testProviderConnectionCmd(activeProvider);
console.log("✓ Active provider connection verified:", activeProvider.name);
} catch (err) {
console.warn("⚠ Active provider connection test failed:", activeProvider.name, err);
}
}
} catch (err) {
console.error("Failed to initialize AI providers:", err);
}
};
initializeProviders();
}, [setProviders, getActiveProvider]);
return (
<div className={theme === "dark" ? "dark" : ""}>
<div className="grid h-screen" style={{ gridTemplateColumns: collapsed ? "64px 1fr" : "240px 1fr" }}>


@@ -367,17 +367,6 @@ export const updateSettingsCmd = (partialSettings: Partial<AppSettings>) =>
export const getAuditLogCmd = (filter: AuditFilter) =>
invoke<AuditEntry[]>("get_audit_log", { filter });
// ─── AI Provider Persistence ──────────────────────────────────────────────────
export const saveAiProviderCmd = (provider: ProviderConfig) =>
invoke<void>("save_ai_provider", { provider });
export const loadAiProvidersCmd = () =>
invoke<ProviderConfig[]>("load_ai_providers");
export const deleteAiProviderCmd = (name: string) =>
invoke<void>("delete_ai_provider", { name });
// ─── OAuth & Integrations ─────────────────────────────────────────────────────
export interface OAuthInitResponse {
@@ -428,16 +417,8 @@ export interface IntegrationConfig {
space_key?: string;
}
export const authenticateWithWebviewCmd = (
service: string,
baseUrl: string,
projectName?: string
) =>
invoke<WebviewAuthResponse>("authenticate_with_webview", {
service,
baseUrl,
projectName,
});
export const authenticateWithWebviewCmd = (service: string, baseUrl: string) =>
invoke<WebviewAuthResponse>("authenticate_with_webview", { service, baseUrl });
export const extractCookiesFromWebviewCmd = (service: string, webviewId: string) =>
invoke<ConnectionResult>("extract_cookies_from_webview", { service, webviewId });
@@ -455,6 +436,3 @@ export const getIntegrationConfigCmd = (service: string) =>
export const getAllIntegrationConfigsCmd = () =>
invoke<IntegrationConfig[]>("get_all_integration_configs");
export const addAdoCommentCmd = (workItemId: number, commentText: string) =>
invoke<string>("add_ado_comment", { workItemId, commentText });


@@ -1,4 +1,4 @@
import React, { useState, useEffect } from "react";
import React, { useState } from "react";
import { Plus, Pencil, Trash2, CheckCircle, XCircle, Zap } from "lucide-react";
import {
Card,
@@ -17,13 +17,7 @@ import {
Separator,
} from "@/components/ui";
import { useSettingsStore } from "@/stores/settingsStore";
import {
testProviderConnectionCmd,
saveAiProviderCmd,
loadAiProvidersCmd,
deleteAiProviderCmd,
type ProviderConfig,
} from "@/lib/tauriCommands";
import { testProviderConnectionCmd, type ProviderConfig } from "@/lib/tauriCommands";
export const CUSTOM_REST_MODELS = [
"ChatGPT4o",
@@ -78,7 +72,6 @@ export default function AIProviders() {
updateProvider,
removeProvider,
setActiveProvider,
setProviders,
} = useSettingsStore();
const [editIndex, setEditIndex] = useState<number | null>(null);
@@ -89,20 +82,6 @@ export default function AIProviders() {
const [isCustomModel, setIsCustomModel] = useState(false);
const [customModelInput, setCustomModelInput] = useState("");
// Load providers from database on mount
// Note: Auto-testing of active provider is handled in App.tsx on startup
useEffect(() => {
const loadProviders = async () => {
try {
const providers = await loadAiProvidersCmd();
setProviders(providers);
} catch (err) {
console.error("Failed to load AI providers:", err);
}
};
loadProviders();
}, [setProviders]);
const startAdd = () => {
setForm({ ...emptyProvider });
setEditIndex(null);
@@ -135,27 +114,16 @@ }
}
};
const handleSave = async () => {
const handleSave = () => {
if (!form.name || !form.api_url || !form.model) return;
try {
// Save to database
await saveAiProviderCmd(form);
// Update local state
if (editIndex != null) {
updateProvider(editIndex, form);
} else {
addProvider(form);
}
setIsAdding(false);
setEditIndex(null);
setForm({ ...emptyProvider });
} catch (err) {
console.error("Failed to save provider:", err);
setTestResult({ success: false, message: `Failed to save: ${err}` });
if (editIndex != null) {
updateProvider(editIndex, form);
} else {
addProvider(form);
}
setIsAdding(false);
setEditIndex(null);
setForm({ ...emptyProvider });
};
const handleCancel = () => {
@@ -165,16 +133,6 @@ }
setTestResult(null);
};
const handleRemove = async (index: number) => {
const provider = ai_providers[index];
try {
await deleteAiProviderCmd(provider.name);
removeProvider(index);
} catch (err) {
console.error("Failed to delete provider:", err);
}
};
const handleTest = async () => {
setIsTesting(true);
setTestResult(null);
@@ -257,7 +215,7 @@ export default function AIProviders() {
<Button
variant="ghost"
size="sm"
onClick={() => handleRemove(idx)}
onClick={() => removeProvider(idx)}
>
<Trash2 className="w-3 h-3 text-destructive" />
</Button>


@@ -16,6 +16,7 @@ import {
import {
initiateOauthCmd,
authenticateWithWebviewCmd,
extractCookiesFromWebviewCmd,
saveManualTokenCmd,
testConfluenceConnectionCmd,
testServiceNowConnectionCmd,
@@ -141,24 +142,16 @@ export default function Integrations() {
setLoading((prev) => ({ ...prev, [service]: true }));
try {
const response = await authenticateWithWebviewCmd(
service,
config.baseUrl,
config.projectName
);
const response = await authenticateWithWebviewCmd(service, config.baseUrl);
setConfigs((prev) => ({
...prev,
[service]: {
...prev[service],
webviewId: response.webview_id,
connected: true, // Mark as connected since window persists
},
[service]: { ...prev[service], webviewId: response.webview_id },
}));
setTestResults((prev) => ({
...prev,
[service]: { success: true, message: response.message },
[service]: { success: true, message: response.message + " Click 'Complete Login' when done." },
}));
} catch (err) {
console.error("Failed to open webview:", err);
@@ -171,6 +164,41 @@ }
}
};
const handleCompleteWebviewLogin = async (service: string) => {
const config = configs[service];
if (!config.webviewId) {
setTestResults((prev) => ({
...prev,
[service]: { success: false, message: "No webview session found. Click 'Login via Browser' first." },
}));
return;
}
setLoading((prev) => ({ ...prev, [`complete-${service}`]: true }));
try {
const result = await extractCookiesFromWebviewCmd(service, config.webviewId);
setConfigs((prev) => ({
...prev,
[service]: { ...prev[service], connected: true, webviewId: undefined },
}));
setTestResults((prev) => ({
...prev,
[service]: { success: result.success, message: result.message },
}));
} catch (err) {
console.error("Failed to extract cookies:", err);
setTestResults((prev) => ({
...prev,
[service]: { success: false, message: String(err) },
}));
} finally {
setLoading((prev) => ({ ...prev, [`complete-${service}`]: false }));
}
};
const handleSaveToken = async (service: string) => {
const config = configs[service];
if (!config.token) {
@@ -344,16 +372,9 @@ export default function Integrations() {
{config.authMode === "webview" && (
<div className="space-y-3 p-4 bg-muted/30 rounded-lg">
<p className="text-sm text-muted-foreground">
Opens a persistent browser window for you to log in. Works even when off-VPN.
The browser window stays open across app restarts and maintains your session automatically.
Opens an embedded browser for you to log in normally. Works even when off-VPN. Captures session cookies for API access.
</p>
{config.webviewId ? (
<div className="p-3 bg-green-500/10 text-green-700 dark:text-green-400 rounded text-sm">
<Check className="w-4 h-4 inline mr-2" />
Browser window is open. Log in there and leave it open - your session will persist across app restarts.
You can close this window manually when done.
</div>
) : (
<div className="flex gap-2">
<Button
onClick={() => handleConnectWebview(service)}
disabled={loading[service] || !config.baseUrl}
@@ -364,10 +385,26 @@ export default function Integrations() {
Opening...
</>
) : (
"Open Browser"
"Login via Browser"
)}
</Button>
)}
{config.webviewId && (
<Button
variant="secondary"
onClick={() => handleCompleteWebviewLogin(service)}
disabled={loading[`complete-${service}`]}
>
{loading[`complete-${service}`] ? (
<>
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
Saving...
</>
) : (
"Complete Login"
)}
</Button>
)}
</div>
</div>
)}
@@ -634,7 +671,7 @@ export default function Integrations() {
<p className="text-sm font-semibold">Authentication Method Comparison:</p>
<ul className="text-xs text-muted-foreground space-y-1 list-disc list-inside">
<li><strong>OAuth2:</strong> Most secure, but requires pre-registered app. May not work with enterprise SSO.</li>
<li><strong>Browser Login:</strong> Best for VPN environments. Opens a persistent browser window that stays open across app restarts. Your session is maintained automatically.</li>
<li><strong>Browser Login:</strong> Best for VPN environments. Lets you authenticate off-VPN, extracts session cookies for API use.</li>
<li><strong>Manual Token:</strong> Most reliable fallback. Requires generating API tokens manually from each service.</li>
</ul>
</div>


@@ -6,7 +6,6 @@ interface SettingsState extends AppSettings {
addProvider: (provider: ProviderConfig) => void;
updateProvider: (index: number, provider: ProviderConfig) => void;
removeProvider: (index: number) => void;
setProviders: (providers: ProviderConfig[]) => void;
setActiveProvider: (name: string) => void;
setTheme: (theme: "light" | "dark") => void;
getActiveProvider: () => ProviderConfig | undefined;
@@ -36,7 +35,6 @@ export const useSettingsStore = create<SettingsState>()(
set((state) => ({
ai_providers: state.ai_providers.filter((_, i) => i !== index),
})),
setProviders: (providers) => set({ ai_providers: providers }),
setActiveProvider: (name) => set({ active_provider: name }),
setTheme: (theme) => set({ theme }),
pii_enabled_patterns: Object.fromEntries(
@@ -55,14 +53,12 @@ }),
}),
{
name: "tftsr-settings",
// Don't persist ai_providers to localStorage - they're stored in encrypted database
partialize: (state) => ({
theme: state.theme,
active_provider: state.active_provider,
default_provider: state.default_provider,
default_model: state.default_model,
ollama_url: state.ollama_url,
pii_enabled_patterns: state.pii_enabled_patterns,
...state,
ai_providers: state.ai_providers.map((provider) => ({
...provider,
api_key: "",
})),
}),
}
)
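Both `partialize` variants shown in the final hunk keep provider API keys out of localStorage — one by listing persisted fields explicitly and omitting `ai_providers`, the other by spreading the full state and blanking each `api_key`. The blanking transform in isolation behaves as follows (hypothetical minimal types, not the app's real store module):

```typescript
interface ProviderConfig {
  name: string;
  api_url: string;
  model: string;
  api_key: string;
}

interface SettingsState {
  theme: "light" | "dark";
  ai_providers: ProviderConfig[];
}

// The shape handed to the persist middleware: the whole state, with every
// provider's api_key replaced by an empty string before serialization.
function partialize(state: SettingsState) {
  return {
    ...state,
    ai_providers: state.ai_providers.map((provider) => ({
      ...provider,
      api_key: "",
    })),
  };
}

const persisted = partialize({
  theme: "dark",
  ai_providers: [
    { name: "local", api_url: "http://localhost:11434", model: "llama3", api_key: "secret" },
  ],
});
console.log(persisted.ai_providers[0].api_key === ""); // true
console.log(persisted.theme); // "dark"
```

Note the ordering inside the inner object literal: the `api_key: ""` override must come after `...provider`, or the spread would restore the secret.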