diff --git a/README.md b/README.md
index 59f82c81..257832b1 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# TFTSR — IT Triage & RCA Desktop Application
+# Troubleshooting and RCA Assistant
 
 A structured, AI-backed desktop tool for IT incident triage, 5-Whys root cause analysis, RCA document generation, and blameless post-mortems. Runs fully offline via Ollama local models, or connects to cloud AI providers.
 
@@ -166,7 +166,7 @@ To use Claude via AWS Bedrock (ideal for enterprise environments with existing A
    nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 &
    ```
 
-4. **Configure in TFTSR:**
+4. **Configure in Troubleshooting and RCA Assistant:**
    - Provider: **OpenAI** (OpenAI-compatible)
    - Base URL: `http://localhost:8000/v1`
    - API Key: `sk-your-secure-key` (from config)
diff --git a/docs/wiki/AI-Providers.md b/docs/wiki/AI-Providers.md
index 91880a0f..0630b1ad 100644
--- a/docs/wiki/AI-Providers.md
+++ b/docs/wiki/AI-Providers.md
@@ -113,7 +113,7 @@ The domain prompt is injected as the first `system` role message in every new co
 
 ---
 
-## 6. Custom Provider (MSI GenAI & Others)
+## 6. Custom Provider (Custom REST & Others)
 
 **Status:** ✅ **Implemented** (v0.2.6)
 
@@ -137,25 +137,26 @@ Standard OpenAI `/chat/completions` endpoint with Bearer authentication.
 
 ---
 
-### Format: MSI GenAI
+### Format: Custom REST
 
 **Motorola Solutions Internal GenAI Service** — Enterprise AI platform with centralized cost tracking and model access.
 
 | Field | Value |
 |-------|-------|
 | `config.provider_type` | `"custom"` |
-| `config.api_format` | `"msi_genai"` |
+| `config.api_format` | `"custom_rest"` |
 | API URL | `https://genai-service.commandcentral.com/app-gateway` (prod)<br>`https://genai-service.stage.commandcentral.com/app-gateway` (stage) |
 | Auth Header | `x-msi-genai-api-key` |
 | Auth Prefix | `` (empty - no Bearer prefix) |
 | Endpoint Path | `` (empty - URL includes full path `/api/v2/chat`) |
 
-**Available Models:**
+**Available Models (dropdown in Settings):**
 - `VertexGemini` — Gemini 2.0 Flash (Private/GCP)
 - `Claude-Sonnet-4` — Claude Sonnet 4 (Public/Anthropic)
 - `ChatGPT4o` — GPT-4o (Public/OpenAI)
 - `ChatGPT-5_2-Chat` — GPT-4.5 (Public/OpenAI)
-- See [GenAI API User Guide](../GenAI%20API%20User%20Guide.md) for full model list
+- Full list is sourced from [GenAI API User Guide](../GenAI%20API%20User%20Guide.md)
+- Includes a `Custom model...` option to manually enter any model ID
 
 **Request Format:**
 ```json
@@ -187,9 +188,9 @@ Standard OpenAI `/chat/completions` endpoint with Bearer authentication.
 **Configuration (Settings → AI Providers → Add Provider):**
 
 ```
-Name: MSI GenAI
+Name: Custom REST (MSI GenAI)
 Type: Custom
-API Format: MSI GenAI
+API Format: Custom REST
 API URL: https://genai-service.stage.commandcentral.com/app-gateway
 Model: VertexGemini
 API Key: (your MSI GenAI API key from portal)
@@ -208,13 +209,13 @@ Auth Prefix: (leave empty)
 | Error | Cause | Solution |
 |-------|-------|----------|
 | 403 Forbidden | Invalid API key or insufficient permissions | Verify key in MSI GenAI portal, check model access |
-| Missing `userId` field | Configuration not saved | Ensure UI shows User ID field when `api_format=msi_genai` |
+| Missing `userId` field | Configuration not saved | Ensure UI shows User ID field when `api_format=custom_rest` |
 | No conversation history | `sessionId` not persisted | Session ID stored in `ProviderConfig.session_id` — currently per-provider, not per-conversation |
 
 **Implementation Details:**
-- Backend: `src-tauri/src/ai/openai.rs::chat_msi_genai()`
+- Backend: `src-tauri/src/ai/openai.rs::chat_custom_rest()`
 - Schema: `src-tauri/src/state.rs::ProviderConfig` (added `user_id`, `api_format`, custom auth fields)
-- Frontend: `src/pages/Settings/AIProviders.tsx` (conditional UI for MSI GenAI)
+- Frontend: `src/pages/Settings/AIProviders.tsx` (conditional UI for Custom REST + model dropdown)
 - CSP whitelist: `https://genai-service.stage.commandcentral.com` and production domain
 
 ---
 
@@ -228,9 +229,9 @@ All providers support the following optional configuration fields (v0.2.6+):
 | `custom_endpoint_path` | `Option<String>` | Override endpoint path | `/chat/completions` |
 | `custom_auth_header` | `Option<String>` | Custom auth header name | `Authorization` |
 | `custom_auth_prefix` | `Option<String>` | Prefix before API key | `Bearer ` |
-| `api_format` | `Option<String>` | API format (`openai` or `msi_genai`) | `openai` |
+| `api_format` | `Option<String>` | API format (`openai` or `custom_rest`) | `openai` |
 | `session_id` | `Option<String>` | Session ID for stateful APIs | None |
-| `user_id` | `Option<String>` | User ID for cost tracking (MSI GenAI) | None |
+| `user_id` | `Option<String>` | User ID for cost tracking (Custom REST MSI contract) | None |
 
 **Backward Compatibility:** All fields are optional and default to OpenAI-compatible behavior. Existing provider configurations are unaffected.
diff --git a/docs/wiki/Home.md b/docs/wiki/Home.md
index ebbe7feb..fe15cb0c 100644
--- a/docs/wiki/Home.md
+++ b/docs/wiki/Home.md
@@ -1,6 +1,6 @@
-# TFTSR — IT Triage & RCA Desktop Application
+# Troubleshooting and RCA Assistant
 
-**TFTSR** is a secure desktop application for guided IT incident triage, root cause analysis (RCA), and post-mortem documentation. Built with Tauri 2.x (Rust + WebView) and React 18.
+**Troubleshooting and RCA Assistant** is a secure desktop application for guided IT incident triage, root cause analysis (RCA), and post-mortem documentation. Built with Tauri 2.x (Rust + WebView) and React 18.
 
 **CI:** ![build](http://172.0.0.29:3000/sarman/tftsr-devops_investigation/actions/workflows/test.yml/badge.svg) — rustfmt · clippy · 64 Rust tests · tsc · vitest — all green
@@ -25,7 +25,7 @@
 - **5-Whys AI Triage** — Interactive guided root cause analysis via multi-turn AI chat
 - **PII Auto-Redaction** — Detects and redacts sensitive data before any AI send
 - **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, AWS Bedrock (via LiteLLM), MSI GenAI (Motorola internal), local Ollama (fully offline)
-- **Custom Provider Support** — Flexible authentication (Bearer, custom headers) and API formats (OpenAI-compatible, MSI GenAI)
+- **Custom Provider Support** — Flexible authentication (Bearer, custom headers) and API formats (OpenAI-compatible, Custom REST)
 - **External Integrations** — Confluence, ServiceNow, Azure DevOps with OAuth2 PKCE flows
 - **SQLCipher AES-256** — All issue history and credentials encrypted at rest
 - **RCA + Post-Mortem Generation** — Auto-populated Markdown templates, exportable as MD/PDF
diff --git a/index.html b/index.html
index a432c7b0..7edb9234 100644
--- a/index.html
+++ b/index.html
@@ -4,7 +4,7 @@
-    <title>TFTSR — IT Triage & RCA</title>
+    <title>Troubleshooting and RCA Assistant</title>
diff --git a/src-tauri/src/ai/openai.rs b/src-tauri/src/ai/openai.rs
index 99fade4d..7972c2b9 100644
--- a/src-tauri/src/ai/openai.rs
+++ b/src-tauri/src/ai/openai.rs
@@ -6,6 +6,10 @@ use crate::state::ProviderConfig;
 
 pub struct OpenAiProvider;
 
+fn is_custom_rest_format(api_format: Option<&str>) -> bool {
+    matches!(api_format, Some("custom_rest") | Some("msi_genai"))
+}
+
 #[async_trait]
 impl Provider for OpenAiProvider {
     fn name(&self) -> &str {
@@ -29,17 +33,39 @@ impl Provider for OpenAiProvider {
         messages: Vec<Message>,
         config: &ProviderConfig,
     ) -> anyhow::Result<ChatResponse> {
-        // Check if using MSI GenAI format
+        // Check if using custom REST format
         let api_format = config.api_format.as_deref().unwrap_or("openai");
-        if api_format == "msi_genai" {
-            self.chat_msi_genai(messages, config).await
+        // Backward compatibility: accept legacy msi_genai identifier
+        if is_custom_rest_format(Some(api_format)) {
+            self.chat_custom_rest(messages, config).await
         } else {
             self.chat_openai(messages, config).await
         }
     }
 }
 
+#[cfg(test)]
+mod tests {
+    use super::is_custom_rest_format;
+
+    #[test]
+    fn custom_rest_format_is_recognized() {
+        assert!(is_custom_rest_format(Some("custom_rest")));
+    }
+
+    #[test]
+    fn legacy_msi_format_is_recognized_for_compatibility() {
+        assert!(is_custom_rest_format(Some("msi_genai")));
+    }
+
+    #[test]
+    fn openai_format_is_not_custom_rest() {
+        assert!(!is_custom_rest_format(Some("openai")));
+        assert!(!is_custom_rest_format(None));
+    }
+}
+
 impl OpenAiProvider {
     /// OpenAI-compatible API format (default)
     async fn chat_openai(
@@ -113,8 +139,8 @@ impl OpenAiProvider {
         })
     }
 
-    /// MSI GenAI custom format
-    async fn chat_msi_genai(
+    /// Custom REST format (MSI GenAI payload contract)
+    async fn chat_custom_rest(
         &self,
         messages: Vec<Message>,
         config: &ProviderConfig,
@@ -173,7 +199,7 @@ impl OpenAiProvider {
             body["modelConfig"] = model_config;
         }
 
-        // Use custom auth header and prefix (no prefix for MSI GenAI)
+        // Use custom auth header and prefix (no prefix for this custom REST contract)
         let auth_header = config
             .custom_auth_header
             .as_deref()
@@ -185,7 +211,7 @@
             .post(&url)
             .header(auth_header, auth_value)
             .header("Content-Type", "application/json")
-            .header("X-msi-genai-client", "tftsr-devops-investigation")
+            .header("X-msi-genai-client", "troubleshooting-rca-assistant")
             .json(&body)
             .send()
             .await?;
@@ -193,7 +219,7 @@
         if !resp.status().is_success() {
             let status = resp.status();
             let text = resp.text().await?;
-            anyhow::bail!("MSI GenAI API error {status}: {text}");
+            anyhow::bail!("Custom REST API error {status}: {text}");
         }
 
         let json: serde_json::Value = resp.json().await?;
@@ -212,7 +238,7 @@
         Ok(ChatResponse {
             content,
             model: config.model.clone(),
-            usage: None, // MSI GenAI doesn't provide token usage in response
+            usage: None, // This custom REST contract doesn't provide token usage in response
         })
     }
 }
diff --git a/src-tauri/src/commands/ai.rs b/src-tauri/src/commands/ai.rs
index bc9fc657..216d8f2d 100644
--- a/src-tauri/src/commands/ai.rs
+++ b/src-tauri/src/commands/ai.rs
@@ -278,7 +278,9 @@ pub async fn test_provider_connection(
     let provider = create_provider(&provider_config);
     let messages = vec![Message {
         role: "user".into(),
-        content: "Reply with exactly: TFTSR connection test successful.".into(),
+        content:
+            "Reply with exactly: Troubleshooting and RCA Assistant connection test successful."
+                .into(),
     }];
     provider
         .chat(messages, &provider_config)
diff --git a/src-tauri/src/docs/postmortem.rs b/src-tauri/src/docs/postmortem.rs
index 3f5b3489..8a8df607 100644
--- a/src-tauri/src/docs/postmortem.rs
+++ b/src-tauri/src/docs/postmortem.rs
@@ -148,7 +148,7 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {
     md.push_str("---\n\n");
 
     md.push_str(&format!(
-        "_Generated by TFTSR IT Triage on {}_\n",
+        "_Generated by Troubleshooting and RCA Assistant on {}_\n",
         chrono::Utc::now().format("%Y-%m-%d %H:%M UTC")
     ));
 
diff --git a/src-tauri/src/docs/rca.rs b/src-tauri/src/docs/rca.rs
index 4870f4a0..66836996 100644
--- a/src-tauri/src/docs/rca.rs
+++ b/src-tauri/src/docs/rca.rs
@@ -133,7 +133,7 @@ pub fn generate_rca_markdown(detail: &IssueDetail) -> String {
     md.push_str("---\n\n");
 
     md.push_str(&format!(
-        "_Generated by TFTSR IT Triage on {}_\n",
+        "_Generated by Troubleshooting and RCA Assistant on {}_\n",
         chrono::Utc::now().format("%Y-%m-%d %H:%M UTC")
     ));
 
diff --git a/src-tauri/src/integrations/webview_auth.rs b/src-tauri/src/integrations/webview_auth.rs
index 132bbeea..8fdb8d82 100644
--- a/src-tauri/src/integrations/webview_auth.rs
+++ b/src-tauri/src/integrations/webview_auth.rs
@@ -49,7 +49,9 @@ pub async fn authenticate_with_webview(
         &webview_label,
         WebviewUrl::External(login_url.parse().map_err(|e| format!("Invalid URL: {e}"))?),
     )
-    .title(format!("{service} Browser (TFTSR)"))
+    .title(format!(
+        "{service} Browser (Troubleshooting and RCA Assistant)"
+    ))
     .inner_size(1000.0, 800.0)
     .min_inner_size(800.0, 600.0)
     .resizable(true)
diff --git a/src-tauri/src/lib.rs b/src-tauri/src/lib.rs
index 9317957f..c83c0b2e 100644
--- a/src-tauri/src/lib.rs
+++ b/src-tauri/src/lib.rs
@@ -21,7 +21,7 @@ pub fn run() {
         )
         .init();
 
-    tracing::info!("Starting TFTSR application");
+    tracing::info!("Starting Troubleshooting and RCA Assistant application");
 
     // Determine data directory
     let data_dir = dirs_data_dir();
@@ -107,7 +107,7 @@
             commands::system::get_audit_log,
         ])
         .run(tauri::generate_context!())
-        .expect("Error running TFTSR application");
+        .expect("Error running Troubleshooting and RCA Assistant application");
 }
 
 /// Determine the application data directory.
diff --git a/src-tauri/src/state.rs b/src-tauri/src/state.rs
index 36cce24d..97b51736 100644
--- a/src-tauri/src/state.rs
+++ b/src-tauri/src/state.rs
@@ -29,14 +29,14 @@ pub struct ProviderConfig {
     /// If None, defaults to "Bearer "
     #[serde(skip_serializing_if = "Option::is_none")]
     pub custom_auth_prefix: Option<String>,
-    /// Optional: API format ("openai" or "msi_genai")
+    /// Optional: API format ("openai" or "custom_rest")
     /// If None, defaults to "openai"
     #[serde(skip_serializing_if = "Option::is_none")]
     pub api_format: Option<String>,
-    /// Optional: Session ID for stateful APIs like MSI GenAI
+    /// Optional: Session ID for stateful custom REST APIs
     #[serde(skip_serializing_if = "Option::is_none")]
     pub session_id: Option<String>,
-    /// Optional: User ID for MSI GenAI (CORE ID email)
+    /// Optional: User ID for custom REST API cost tracking (CORE ID email)
     #[serde(skip_serializing_if = "Option::is_none")]
     pub user_id: Option<String>,
 }
diff --git a/src-tauri/tauri.conf.json b/src-tauri/tauri.conf.json
index b3ebd7d0..5b7449b1 100644
--- a/src-tauri/tauri.conf.json
+++ b/src-tauri/tauri.conf.json
@@ -1,5 +1,5 @@
 {
-  "productName": "TFTSR",
+  "productName": "Troubleshooting and RCA Assistant",
   "version": "0.2.10",
   "identifier": "com.tftsr.devops",
   "build": {
@@ -14,7 +14,7 @@
   },
   "windows": [
     {
-      "title": "TFTSR \u2014 IT Triage & RCA",
+      "title": "Troubleshooting and RCA Assistant",
       "width": 1280,
       "height": 800,
       "resizable": true,
@@ -36,9 +36,9 @@
     ],
     "resources": [],
     "externalBin": [],
-    "copyright": "TFTSR Contributors",
+    "copyright": "Troubleshooting and RCA Assistant Contributors",
     "category": "Utility",
-    "shortDescription": "IT Incident Triage & RCA Tool",
-    "longDescription": "Structured AI-backed tool for IT incident triage, 5-whys root cause analysis, and post-mortem documentation with offline Ollama support."
+    "shortDescription": "Troubleshooting and RCA Assistant",
+    "longDescription": "Structured AI-backed assistant for IT troubleshooting, 5-whys root cause analysis, and post-mortem documentation with offline Ollama support."
   }
 }
\ No newline at end of file
diff --git a/src/App.tsx b/src/App.tsx
index 1f2ba40a..82226fea 100644
--- a/src/App.tsx
+++ b/src/App.tsx
@@ -59,7 +59,7 @@ export default function App() {
           {!collapsed && (
-            TFTSR
+            Troubleshooting and RCA Assistant
           )}
diff --git a/src/pages/Settings/AIProviders.tsx b/src/pages/Settings/AIProviders.tsx
index dec32ea1..f4911c34 100644
--- a/src/pages/Settings/AIProviders.tsx
+++ b/src/pages/Settings/AIProviders.tsx
@@ -19,6 +19,35 @@ import {
 import { useSettingsStore } from "@/stores/settingsStore";
 import { testProviderConnectionCmd, type ProviderConfig } from "@/lib/tauriCommands";
 
+export const CUSTOM_REST_MODELS = [
+  "ChatGPT4o",
+  "ChatGPT4o-mini",
+  "ChatGPT-o3-mini",
+  "Gemini-2_0-Flash-001",
+  "Gemini-2_5-Flash",
+  "Claude-Sonnet-3_7",
+  "Openai-gpt-4_1-mini",
+  "Openai-o4-mini",
+  "Claude-Sonnet-4",
+  "ChatGPT-o3-pro",
+  "OpenAI-ChatGPT-4_1",
+  "OpenAI-GPT-4_1-Nano",
+  "ChatGPT-5",
+  "VertexGemini",
+  "ChatGPT-5_1",
+  "ChatGPT-5_1-chat",
+  "ChatGPT-5_2-Chat",
+  "Gemini-3_Pro-Preview",
+  "Gemini-3_1-flash-lite-preview",
+] as const;
+
+export const CUSTOM_MODEL_OPTION = "__custom_model__";
+export const LEGACY_API_FORMAT = "msi_genai";
+export const CUSTOM_REST_FORMAT = "custom_rest";
+
+export const normalizeApiFormat = (format?: string): string | undefined =>
+  format === LEGACY_API_FORMAT ? CUSTOM_REST_FORMAT : format;
+
 const emptyProvider: ProviderConfig = {
   name: "",
   provider_type: "openai",
@@ -50,19 +79,39 @@ export default function AIProviders() {
   const [form, setForm] = useState({ ...emptyProvider });
   const [testResult, setTestResult] = useState<{ success: boolean; message: string } | null>(null);
   const [isTesting, setIsTesting] = useState(false);
+  const [isCustomModel, setIsCustomModel] = useState(false);
+  const [customModelInput, setCustomModelInput] = useState("");
 
   const startAdd = () => {
     setForm({ ...emptyProvider });
     setEditIndex(null);
     setIsAdding(true);
     setTestResult(null);
+    setIsCustomModel(false);
+    setCustomModelInput("");
   };
 
   const startEdit = (index: number) => {
-    setForm({ ...ai_providers[index] });
+    const provider = ai_providers[index];
+    const apiFormat = normalizeApiFormat(provider.api_format);
+    const nextForm = { ...provider, api_format: apiFormat };
+
+    setForm(nextForm);
     setEditIndex(index);
     setIsAdding(true);
     setTestResult(null);
+
+    const isCustomRestProvider =
+      nextForm.provider_type === "custom" && apiFormat === CUSTOM_REST_FORMAT;
+    const knownModel = CUSTOM_REST_MODELS.includes(nextForm.model as (typeof CUSTOM_REST_MODELS)[number]);
+
+    if (isCustomRestProvider && !knownModel) {
+      setIsCustomModel(true);
+      setCustomModelInput(nextForm.model);
+    } else {
+      setIsCustomModel(false);
+      setCustomModelInput("");
+    }
   };
 
   const handleSave = () => {
@@ -244,11 +293,21 @@
-            <Input
-              value={form.model}
-              onChange={(e) => setForm({ ...form, model: e.target.value })}
-              placeholder="gpt-4o"
-            />
+            {form.provider_type === "custom"
+              && normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT ? (
+              <Input
+                value={form.model}
+                onChange={(e) => setForm({ ...form, model: e.target.value })}
+                placeholder="Select API Format below to choose model"
+                disabled
+              />
+            ) : (
+              <Input
+                value={form.model}
+                onChange={(e) => setForm({ ...form, model: e.target.value })}
+                placeholder="gpt-4o"
+              />
+            )}
@@ -285,7 +344,7 @@
               onValueChange={(v) => {
                 const format = v;
                 const defaults =
-                  format === "msi_genai"
+                  format === CUSTOM_REST_FORMAT
                     ? {
                         custom_endpoint_path: "",
                         custom_auth_header: "x-msi-genai-api-key",
@@ -297,6 +356,10 @@
                         custom_auth_prefix: "Bearer ",
                       };
                 setForm({ ...form, api_format: format, ...defaults });
+                if (format !== CUSTOM_REST_FORMAT) {
+                  setIsCustomModel(false);
+                  setCustomModelInput("");
+                }
               }}
             >
@@ -304,11 +367,11 @@
                 <SelectItem value="openai">OpenAI Compatible</SelectItem>
-                <SelectItem value="msi_genai">MSI GenAI</SelectItem>
+                <SelectItem value="custom_rest">Custom REST</SelectItem>
               </SelectContent>
             </Select>
-            Select the API format. MSI GenAI uses a different request/response structure.
+            Select the API format. Custom REST uses a non-OpenAI request/response structure.
@@ -349,12 +412,12 @@
               placeholder="Bearer "
             />
-            Prefix added before API key (e.g., "Bearer " for OpenAI, empty for MSI GenAI)
+            Prefix added before API key (e.g., "Bearer " for OpenAI, empty for Custom REST)
 
-          {/* MSI GenAI specific: User ID field */}
-          {form.api_format === "msi_genai" && (
+          {/* Custom REST specific: User ID field */}
+          {normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT && (
           )}
+
+          {/* Custom REST specific: model dropdown with custom option */}
+          {normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT && (
+            <div>
+              <Select
+                value={isCustomModel ? CUSTOM_MODEL_OPTION : form.model}
+                onValueChange={(v) => {
+                  if (v === CUSTOM_MODEL_OPTION) {
+                    setIsCustomModel(true);
+                  } else {
+                    setIsCustomModel(false);
+                    setForm({ ...form, model: v });
+                  }
+                }}
+              >
+                <SelectTrigger>
+                  <SelectValue placeholder="Select model" />
+                </SelectTrigger>
+                <SelectContent>
+                  {CUSTOM_REST_MODELS.map((m) => (
+                    <SelectItem key={m} value={m}>{m}</SelectItem>
+                  ))}
+                  <SelectItem value={CUSTOM_MODEL_OPTION}>Custom model...</SelectItem>
+                </SelectContent>
+              </Select>
+              {isCustomModel && (
+                <Input
+                  value={customModelInput}
+                  onChange={(e) => {
+                    const value = e.target.value;
+                    setCustomModelInput(value);
+                    setForm({ ...form, model: value });
+                  }}
+                  placeholder="Enter custom model ID"
+                />
+              )}
+            </div>
+          )}
         )}
diff --git a/src/pages/Settings/Integrations.tsx b/src/pages/Settings/Integrations.tsx
index 42635506..0fd41a6e 100644
--- a/src/pages/Settings/Integrations.tsx
+++ b/src/pages/Settings/Integrations.tsx
@@ -453,7 +453,7 @@ export default function Integrations() {

           Integrations
-          Connect TFTSR with your existing tools and platforms. Choose the authentication method that works best for your environment.
+          Connect Troubleshooting and RCA Assistant with your existing tools and platforms. Choose the authentication method that works best for your environment.
diff --git a/tests/unit/aiProvidersCustomRest.test.ts b/tests/unit/aiProvidersCustomRest.test.ts
new file mode 100644
index 00000000..bc4819f9
--- /dev/null
+++ b/tests/unit/aiProvidersCustomRest.test.ts
@@ -0,0 +1,25 @@
+import { describe, it, expect } from "vitest";
+import {
+  CUSTOM_MODEL_OPTION,
+  CUSTOM_REST_FORMAT,
+  CUSTOM_REST_MODELS,
+  LEGACY_API_FORMAT,
+  normalizeApiFormat,
+} from "@/pages/Settings/AIProviders";
+
+describe("AIProviders Custom REST helpers", () => {
+  it("maps legacy msi_genai api_format to custom_rest", () => {
+    expect(normalizeApiFormat(LEGACY_API_FORMAT)).toBe(CUSTOM_REST_FORMAT);
+  });
+
+  it("keeps openai api_format unchanged", () => {
+    expect(normalizeApiFormat("openai")).toBe("openai");
+  });
+
+  it("contains the guide model list and custom model option sentinel", () => {
+    expect(CUSTOM_REST_MODELS).toContain("ChatGPT4o");
+    expect(CUSTOM_REST_MODELS).toContain("VertexGemini");
+    expect(CUSTOM_REST_MODELS).toContain("Gemini-3_Pro-Preview");
+    expect(CUSTOM_MODEL_OPTION).toBe("__custom_model__");
+  });
+});
diff --git a/tsconfig.json b/tsconfig.json
index 90c8a375..09dca661 100644
--- a/tsconfig.json
+++ b/tsconfig.json
@@ -17,7 +17,7 @@
     "noFallthroughCasesInSwitch": true,
     "baseUrl": ".",
     "paths": { "@/*": ["src/*"] },
-    "types": ["vitest/globals"]
+    "types": ["vitest/globals", "@testing-library/jest-dom"]
   },
   "include": ["src", "tests/unit"],
   "references": [{ "path": "./tsconfig.node.json" }]
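The compatibility behavior introduced above can be exercised in isolation. The sketch below restates `normalizeApiFormat` exactly as it appears in `src/pages/Settings/AIProviders.tsx`, and adds a hypothetical `buildAuthHeader` helper that is only an illustration of the header-defaulting rules documented in the Custom Provider Configuration table (`custom_auth_header` defaulting to `Authorization`, `custom_auth_prefix` defaulting to `Bearer `); it is not code from this patch.

```typescript
// Legacy-format normalization, mirrored from the patch.
const LEGACY_API_FORMAT = "msi_genai";
const CUSTOM_REST_FORMAT = "custom_rest";

const normalizeApiFormat = (format?: string): string | undefined =>
  format === LEGACY_API_FORMAT ? CUSTOM_REST_FORMAT : format;

interface AuthConfig {
  apiKey: string;
  customAuthHeader?: string; // e.g. "x-msi-genai-api-key" for Custom REST
  customAuthPrefix?: string; // "" for Custom REST, "Bearer " for OpenAI-compatible
}

// Hypothetical helper: applies the defaulting rules from the docs table —
// header name falls back to "Authorization", prefix falls back to "Bearer ".
function buildAuthHeader(cfg: AuthConfig): [string, string] {
  const header = cfg.customAuthHeader ?? "Authorization";
  const prefix = cfg.customAuthPrefix ?? "Bearer ";
  return [header, `${prefix}${cfg.apiKey}`];
}

console.log(normalizeApiFormat("msi_genai")); // "custom_rest"
console.log(normalizeApiFormat("openai")); // "openai"
console.log(
  buildAuthHeader({
    apiKey: "k",
    customAuthHeader: "x-msi-genai-api-key",
    customAuthPrefix: "",
  })
); // [ "x-msi-genai-api-key", "k" ]
console.log(buildAuthHeader({ apiKey: "k" })); // [ "Authorization", "Bearer k" ]
```

Note the distinction between an unset prefix (falls back to `Bearer `) and an explicitly empty prefix (`""`, sent as-is), which is why the Custom REST configuration must leave Auth Prefix empty rather than unset.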