feat: add custom_rest provider mode and rebrand application name

Rename the custom API format identifier from msi_genai to custom_rest with backward compatibility, add a guided model dropdown with a custom-entry option in provider settings, and rebrand the application as Troubleshooting and RCA Assistant across UI, metadata, and docs.

Made-with: Cursor
Shaun Arman 2026-04-04 15:35:58 -05:00
parent 0bc20f09f6
commit c4ea32e660
18 changed files with 222 additions and 57 deletions

View File

@ -1,4 +1,4 @@
# TFTSR — IT Triage & RCA Desktop Application # Troubleshooting and RCA Assistant
A structured, AI-backed desktop tool for IT incident triage, 5-Whys root cause analysis, RCA document generation, and blameless post-mortems. Runs fully offline via Ollama local models, or connects to cloud AI providers. A structured, AI-backed desktop tool for IT incident triage, 5-Whys root cause analysis, RCA document generation, and blameless post-mortems. Runs fully offline via Ollama local models, or connects to cloud AI providers.
@ -166,7 +166,7 @@ To use Claude via AWS Bedrock (ideal for enterprise environments with existing A
nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 & nohup litellm --config ~/.litellm/config.yaml --port 8000 > ~/.litellm/litellm.log 2>&1 &
``` ```
4. **Configure in TFTSR:** 4. **Configure in Troubleshooting and RCA Assistant:**
- Provider: **OpenAI** (OpenAI-compatible) - Provider: **OpenAI** (OpenAI-compatible)
- Base URL: `http://localhost:8000/v1` - Base URL: `http://localhost:8000/v1`
- API Key: `sk-your-secure-key` (from config) - API Key: `sk-your-secure-key` (from config)
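
The three settings above can be sketched as the OpenAI-compatible request the app would send to the LiteLLM proxy. This is a minimal illustration; the config shape and helper name are assumptions, not the app's actual internals.

```typescript
// Sketch: assembling an OpenAI-compatible chat request against the LiteLLM
// proxy configured above. Names here are illustrative only.
interface OpenAiCompatConfig {
  baseUrl: string; // e.g. http://localhost:8000/v1
  apiKey: string;  // e.g. sk-your-secure-key (from the LiteLLM config)
  model: string;
}

function buildChatRequest(cfg: OpenAiCompatConfig, userMessage: string) {
  return {
    // Standard OpenAI path appended to the configured base URL
    url: `${cfg.baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers: {
      Authorization: `Bearer ${cfg.apiKey}`,
      "Content-Type": "application/json",
    },
    body: {
      model: cfg.model,
      messages: [{ role: "user", content: userMessage }],
    },
  };
}

const req = buildChatRequest(
  { baseUrl: "http://localhost:8000/v1", apiKey: "sk-your-secure-key", model: "claude-sonnet" },
  "ping"
);
```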

View File

@ -113,7 +113,7 @@ The domain prompt is injected as the first `system` role message in every new co
--- ---
## 6. Custom Provider (MSI GenAI & Others) ## 6. Custom Provider (Custom REST & Others)
**Status:** ✅ **Implemented** (v0.2.6) **Status:** ✅ **Implemented** (v0.2.6)
@ -137,25 +137,26 @@ Standard OpenAI `/chat/completions` endpoint with Bearer authentication.
--- ---
### Format: MSI GenAI ### Format: Custom REST
**Motorola Solutions Internal GenAI Service** — Enterprise AI platform with centralized cost tracking and model access. A generic REST contract, originally built for the **Motorola Solutions Internal GenAI Service**, an enterprise AI platform with centralized cost tracking and model access.
| Field | Value | | Field | Value |
|-------|-------| |-------|-------|
| `config.provider_type` | `"custom"` | | `config.provider_type` | `"custom"` |
| `config.api_format` | `"msi_genai"` | | `config.api_format` | `"custom_rest"` |
| API URL | `https://genai-service.commandcentral.com/app-gateway` (prod)<br>`https://genai-service.stage.commandcentral.com/app-gateway` (stage) | | API URL | `https://genai-service.commandcentral.com/app-gateway` (prod)<br>`https://genai-service.stage.commandcentral.com/app-gateway` (stage) |
| Auth Header | `x-msi-genai-api-key` | | Auth Header | `x-msi-genai-api-key` |
| Auth Prefix | `` (empty - no Bearer prefix) | | Auth Prefix | `` (empty - no Bearer prefix) |
| Endpoint Path | `` (empty - URL includes full path `/api/v2/chat`) | | Endpoint Path | `` (empty - URL includes full path `/api/v2/chat`) |
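
The auth header and empty-prefix behavior in the table above can be sketched as a small resolver. This is an illustrative sketch, not the app's actual code; note that `??` (rather than `||`) is what preserves an explicit empty-string prefix.

```typescript
// Sketch of the custom auth resolution described above; names are illustrative.
interface AuthConfig {
  apiKey: string;
  customAuthHeader?: string; // e.g. "x-msi-genai-api-key"
  customAuthPrefix?: string; // "" means the raw key is sent with no prefix
}

function buildAuthHeader(cfg: AuthConfig): [string, string] {
  const header = cfg.customAuthHeader ?? "Authorization";
  // ?? keeps an explicit "" prefix, which `|| "Bearer "` would clobber.
  const prefix = cfg.customAuthPrefix ?? "Bearer ";
  return [header, `${prefix}${cfg.apiKey}`];
}

// Custom REST contract: custom header name, no Bearer prefix.
const custom = buildAuthHeader({
  apiKey: "abc123",
  customAuthHeader: "x-msi-genai-api-key",
  customAuthPrefix: "",
});

// OpenAI default: Authorization: Bearer <key>
const openai = buildAuthHeader({ apiKey: "abc123" });
```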
**Available Models:** **Available Models (dropdown in Settings):**
- `VertexGemini` — Gemini 2.0 Flash (Private/GCP) - `VertexGemini` — Gemini 2.0 Flash (Private/GCP)
- `Claude-Sonnet-4` — Claude Sonnet 4 (Public/Anthropic) - `Claude-Sonnet-4` — Claude Sonnet 4 (Public/Anthropic)
- `ChatGPT4o` — GPT-4o (Public/OpenAI) - `ChatGPT4o` — GPT-4o (Public/OpenAI)
- `ChatGPT-5_2-Chat` — GPT-4.5 (Public/OpenAI) - `ChatGPT-5_2-Chat` — GPT-5.2 Chat (Public/OpenAI)
- See [GenAI API User Guide](../GenAI%20API%20User%20Guide.md) for full model list - Full list is sourced from [GenAI API User Guide](../GenAI%20API%20User%20Guide.md)
- Includes a `Custom model...` option to manually enter any model ID
**Request Format:** **Request Format:**
```json ```json
@ -187,9 +188,9 @@ Standard OpenAI `/chat/completions` endpoint with Bearer authentication.
**Configuration (Settings → AI Providers → Add Provider):** **Configuration (Settings → AI Providers → Add Provider):**
``` ```
Name: MSI GenAI Name: Custom REST (MSI GenAI)
Type: Custom Type: Custom
API Format: MSI GenAI API Format: Custom REST
API URL: https://genai-service.stage.commandcentral.com/app-gateway API URL: https://genai-service.stage.commandcentral.com/app-gateway
Model: VertexGemini Model: VertexGemini
API Key: (your MSI GenAI API key from portal) API Key: (your MSI GenAI API key from portal)
@ -208,13 +209,13 @@ Auth Prefix: (leave empty)
| Error | Cause | Solution | | Error | Cause | Solution |
|-------|-------|----------| |-------|-------|----------|
| 403 Forbidden | Invalid API key or insufficient permissions | Verify key in MSI GenAI portal, check model access | | 403 Forbidden | Invalid API key or insufficient permissions | Verify key in MSI GenAI portal, check model access |
| Missing `userId` field | Configuration not saved | Ensure UI shows User ID field when `api_format=msi_genai` | | Missing `userId` field | Configuration not saved | Ensure UI shows User ID field when `api_format=custom_rest` |
| No conversation history | `sessionId` not persisted | Session ID stored in `ProviderConfig.session_id` — currently per-provider, not per-conversation | | No conversation history | `sessionId` not persisted | Session ID stored in `ProviderConfig.session_id` — currently per-provider, not per-conversation |
**Implementation Details:** **Implementation Details:**
- Backend: `src-tauri/src/ai/openai.rs::chat_msi_genai()` - Backend: `src-tauri/src/ai/openai.rs::chat_custom_rest()`
- Schema: `src-tauri/src/state.rs::ProviderConfig` (added `user_id`, `api_format`, custom auth fields) - Schema: `src-tauri/src/state.rs::ProviderConfig` (added `user_id`, `api_format`, custom auth fields)
- Frontend: `src/pages/Settings/AIProviders.tsx` (conditional UI for MSI GenAI) - Frontend: `src/pages/Settings/AIProviders.tsx` (conditional UI for Custom REST + model dropdown)
- CSP whitelist: `https://genai-service.stage.commandcentral.com` and production domain - CSP whitelist: `https://genai-service.stage.commandcentral.com` and production domain
--- ---
@ -228,9 +229,9 @@ All providers support the following optional configuration fields (v0.2.6+):
| `custom_endpoint_path` | `Option<String>` | Override endpoint path | `/chat/completions` | | `custom_endpoint_path` | `Option<String>` | Override endpoint path | `/chat/completions` |
| `custom_auth_header` | `Option<String>` | Custom auth header name | `Authorization` | | `custom_auth_header` | `Option<String>` | Custom auth header name | `Authorization` |
| `custom_auth_prefix` | `Option<String>` | Prefix before API key | `Bearer ` | | `custom_auth_prefix` | `Option<String>` | Prefix before API key | `Bearer ` |
| `api_format` | `Option<String>` | API format (`openai` or `msi_genai`) | `openai` | | `api_format` | `Option<String>` | API format (`openai` or `custom_rest`) | `openai` |
| `session_id` | `Option<String>` | Session ID for stateful APIs | None | | `session_id` | `Option<String>` | Session ID for stateful APIs | None |
| `user_id` | `Option<String>` | User ID for cost tracking (MSI GenAI) | None | | `user_id` | `Option<String>` | User ID for cost tracking (Custom REST MSI contract) | None |
**Backward Compatibility:** **Backward Compatibility:**
All fields are optional and default to OpenAI-compatible behavior. Existing provider configurations are unaffected. All fields are optional and default to OpenAI-compatible behavior. Existing provider configurations are unaffected.
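
The defaulting rules in the table above can be sketched as a resolver: every `None`/absent field falls back to its OpenAI-compatible default, which is exactly why legacy configs keep working. Field and helper names mirror the docs but are illustrative, not the actual Rust implementation.

```typescript
// Sketch: resolving the optional provider fields to their documented defaults.
interface ProviderConfigFields {
  custom_endpoint_path?: string;
  custom_auth_header?: string;
  custom_auth_prefix?: string;
  api_format?: string; // "openai" | "custom_rest"
  session_id?: string;
  user_id?: string;
}

function resolveDefaults(cfg: ProviderConfigFields) {
  return {
    endpointPath: cfg.custom_endpoint_path ?? "/chat/completions",
    authHeader: cfg.custom_auth_header ?? "Authorization",
    authPrefix: cfg.custom_auth_prefix ?? "Bearer ",
    apiFormat: cfg.api_format ?? "openai",
  };
}

// An empty (pre-v0.2.6) config resolves to OpenAI-compatible behavior,
// so existing provider configurations are unaffected.
const legacy = resolveDefaults({});
```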

View File

@ -1,6 +1,6 @@
# TFTSR — IT Triage & RCA Desktop Application # Troubleshooting and RCA Assistant
**TFTSR** is a secure desktop application for guided IT incident triage, root cause analysis (RCA), and post-mortem documentation. Built with Tauri 2.x (Rust + WebView) and React 18. **Troubleshooting and RCA Assistant** is a secure desktop application for guided IT incident triage, root cause analysis (RCA), and post-mortem documentation. Built with Tauri 2.x (Rust + WebView) and React 18.
**CI:** ![build](http://172.0.0.29:3000/sarman/tftsr-devops_investigation/actions/workflows/test.yml/badge.svg) — rustfmt · clippy · 64 Rust tests · tsc · vitest — all green **CI:** ![build](http://172.0.0.29:3000/sarman/tftsr-devops_investigation/actions/workflows/test.yml/badge.svg) — rustfmt · clippy · 64 Rust tests · tsc · vitest — all green
@ -25,7 +25,7 @@
- **5-Whys AI Triage** — Interactive guided root cause analysis via multi-turn AI chat - **5-Whys AI Triage** — Interactive guided root cause analysis via multi-turn AI chat
- **PII Auto-Redaction** — Detects and redacts sensitive data before any AI send - **PII Auto-Redaction** — Detects and redacts sensitive data before any AI send
- **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, AWS Bedrock (via LiteLLM), MSI GenAI (Motorola internal), local Ollama (fully offline) - **Multi-Provider AI** — OpenAI, Anthropic Claude, Google Gemini, Mistral, AWS Bedrock (via LiteLLM), MSI GenAI (Motorola internal), local Ollama (fully offline)
- **Custom Provider Support** — Flexible authentication (Bearer, custom headers) and API formats (OpenAI-compatible, MSI GenAI) - **Custom Provider Support** — Flexible authentication (Bearer, custom headers) and API formats (OpenAI-compatible, Custom REST)
- **External Integrations** — Confluence, ServiceNow, Azure DevOps with OAuth2 PKCE flows - **External Integrations** — Confluence, ServiceNow, Azure DevOps with OAuth2 PKCE flows
- **SQLCipher AES-256** — All issue history and credentials encrypted at rest - **SQLCipher AES-256** — All issue history and credentials encrypted at rest
- **RCA + Post-Mortem Generation** — Auto-populated Markdown templates, exportable as MD/PDF - **RCA + Post-Mortem Generation** — Auto-populated Markdown templates, exportable as MD/PDF

View File

@ -4,7 +4,7 @@
<meta charset="UTF-8" /> <meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" /> <link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /> <meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>TFTSR — IT Triage & RCA</title> <title>Troubleshooting and RCA Assistant</title>
</head> </head>
<body> <body>
<div id="root"></div> <div id="root"></div>

View File

@ -6,6 +6,10 @@ use crate::state::ProviderConfig;
pub struct OpenAiProvider; pub struct OpenAiProvider;
fn is_custom_rest_format(api_format: Option<&str>) -> bool {
matches!(api_format, Some("custom_rest") | Some("msi_genai"))
}
#[async_trait] #[async_trait]
impl Provider for OpenAiProvider { impl Provider for OpenAiProvider {
fn name(&self) -> &str { fn name(&self) -> &str {
@ -29,17 +33,39 @@ impl Provider for OpenAiProvider {
messages: Vec<Message>, messages: Vec<Message>,
config: &ProviderConfig, config: &ProviderConfig,
) -> anyhow::Result<ChatResponse> { ) -> anyhow::Result<ChatResponse> {
// Check if using MSI GenAI format // Check if using custom REST format
let api_format = config.api_format.as_deref().unwrap_or("openai"); let api_format = config.api_format.as_deref().unwrap_or("openai");
if api_format == "msi_genai" { // Backward compatibility: accept legacy msi_genai identifier
self.chat_msi_genai(messages, config).await if is_custom_rest_format(Some(api_format)) {
self.chat_custom_rest(messages, config).await
} else { } else {
self.chat_openai(messages, config).await self.chat_openai(messages, config).await
} }
} }
} }
#[cfg(test)]
mod tests {
use super::is_custom_rest_format;
#[test]
fn custom_rest_format_is_recognized() {
assert!(is_custom_rest_format(Some("custom_rest")));
}
#[test]
fn legacy_msi_format_is_recognized_for_compatibility() {
assert!(is_custom_rest_format(Some("msi_genai")));
}
#[test]
fn openai_format_is_not_custom_rest() {
assert!(!is_custom_rest_format(Some("openai")));
assert!(!is_custom_rest_format(None));
}
}
impl OpenAiProvider { impl OpenAiProvider {
/// OpenAI-compatible API format (default) /// OpenAI-compatible API format (default)
async fn chat_openai( async fn chat_openai(
@ -113,8 +139,8 @@ impl OpenAiProvider {
}) })
} }
/// MSI GenAI custom format /// Custom REST format (MSI GenAI payload contract)
async fn chat_msi_genai( async fn chat_custom_rest(
&self, &self,
messages: Vec<Message>, messages: Vec<Message>,
config: &ProviderConfig, config: &ProviderConfig,
@ -173,7 +199,7 @@ impl OpenAiProvider {
body["modelConfig"] = model_config; body["modelConfig"] = model_config;
} }
// Use custom auth header and prefix (no prefix for MSI GenAI) // Use custom auth header and prefix (no prefix for this custom REST contract)
let auth_header = config let auth_header = config
.custom_auth_header .custom_auth_header
.as_deref() .as_deref()
@ -185,7 +211,7 @@ impl OpenAiProvider {
.post(&url) .post(&url)
.header(auth_header, auth_value) .header(auth_header, auth_value)
.header("Content-Type", "application/json") .header("Content-Type", "application/json")
.header("X-msi-genai-client", "tftsr-devops-investigation") .header("X-msi-genai-client", "troubleshooting-rca-assistant")
.json(&body) .json(&body)
.send() .send()
.await?; .await?;
@ -193,7 +219,7 @@ impl OpenAiProvider {
if !resp.status().is_success() { if !resp.status().is_success() {
let status = resp.status(); let status = resp.status();
let text = resp.text().await?; let text = resp.text().await?;
anyhow::bail!("MSI GenAI API error {status}: {text}"); anyhow::bail!("Custom REST API error {status}: {text}");
} }
let json: serde_json::Value = resp.json().await?; let json: serde_json::Value = resp.json().await?;
@ -212,7 +238,7 @@ impl OpenAiProvider {
Ok(ChatResponse { Ok(ChatResponse {
content, content,
model: config.model.clone(), model: config.model.clone(),
usage: None, // MSI GenAI doesn't provide token usage in response usage: None, // This custom REST contract doesn't provide token usage in response
}) })
} }
} }

View File

@ -278,7 +278,9 @@ pub async fn test_provider_connection(
let provider = create_provider(&provider_config); let provider = create_provider(&provider_config);
let messages = vec![Message { let messages = vec![Message {
role: "user".into(), role: "user".into(),
content: "Reply with exactly: TFTSR connection test successful.".into(), content:
"Reply with exactly: Troubleshooting and RCA Assistant connection test successful."
.into(),
}]; }];
provider provider
.chat(messages, &provider_config) .chat(messages, &provider_config)

View File

@ -148,7 +148,7 @@ pub fn generate_postmortem_markdown(detail: &IssueDetail) -> String {
md.push_str("---\n\n"); md.push_str("---\n\n");
md.push_str(&format!( md.push_str(&format!(
"_Generated by TFTSR IT Triage on {}_\n", "_Generated by Troubleshooting and RCA Assistant on {}_\n",
chrono::Utc::now().format("%Y-%m-%d %H:%M UTC") chrono::Utc::now().format("%Y-%m-%d %H:%M UTC")
)); ));

View File

@ -133,7 +133,7 @@ pub fn generate_rca_markdown(detail: &IssueDetail) -> String {
md.push_str("---\n\n"); md.push_str("---\n\n");
md.push_str(&format!( md.push_str(&format!(
"_Generated by TFTSR IT Triage on {}_\n", "_Generated by Troubleshooting and RCA Assistant on {}_\n",
chrono::Utc::now().format("%Y-%m-%d %H:%M UTC") chrono::Utc::now().format("%Y-%m-%d %H:%M UTC")
)); ));

View File

@ -49,7 +49,9 @@ pub async fn authenticate_with_webview(
&webview_label, &webview_label,
WebviewUrl::External(login_url.parse().map_err(|e| format!("Invalid URL: {e}"))?), WebviewUrl::External(login_url.parse().map_err(|e| format!("Invalid URL: {e}"))?),
) )
.title(format!("{service} Browser (TFTSR)")) .title(format!(
"{service} Browser (Troubleshooting and RCA Assistant)"
))
.inner_size(1000.0, 800.0) .inner_size(1000.0, 800.0)
.min_inner_size(800.0, 600.0) .min_inner_size(800.0, 600.0)
.resizable(true) .resizable(true)

View File

@ -21,7 +21,7 @@ pub fn run() {
) )
.init(); .init();
tracing::info!("Starting TFTSR application"); tracing::info!("Starting Troubleshooting and RCA Assistant application");
// Determine data directory // Determine data directory
let data_dir = dirs_data_dir(); let data_dir = dirs_data_dir();
@ -107,7 +107,7 @@ pub fn run() {
commands::system::get_audit_log, commands::system::get_audit_log,
]) ])
.run(tauri::generate_context!()) .run(tauri::generate_context!())
.expect("Error running TFTSR application"); .expect("Error running Troubleshooting and RCA Assistant application");
} }
/// Determine the application data directory. /// Determine the application data directory.

View File

@ -29,14 +29,14 @@ pub struct ProviderConfig {
/// If None, defaults to "Bearer " /// If None, defaults to "Bearer "
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub custom_auth_prefix: Option<String>, pub custom_auth_prefix: Option<String>,
/// Optional: API format ("openai" or "msi_genai") /// Optional: API format ("openai" or "custom_rest")
/// If None, defaults to "openai" /// If None, defaults to "openai"
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub api_format: Option<String>, pub api_format: Option<String>,
/// Optional: Session ID for stateful APIs like MSI GenAI /// Optional: Session ID for stateful custom REST APIs
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub session_id: Option<String>, pub session_id: Option<String>,
/// Optional: User ID for MSI GenAI (CORE ID email) /// Optional: User ID for custom REST API cost tracking (CORE ID email)
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub user_id: Option<String>, pub user_id: Option<String>,
} }

View File

@ -1,5 +1,5 @@
{ {
"productName": "TFTSR", "productName": "Troubleshooting and RCA Assistant",
"version": "0.2.10", "version": "0.2.10",
"identifier": "com.tftsr.devops", "identifier": "com.tftsr.devops",
"build": { "build": {
@ -14,7 +14,7 @@
}, },
"windows": [ "windows": [
{ {
"title": "TFTSR \u2014 IT Triage & RCA", "title": "Troubleshooting and RCA Assistant",
"width": 1280, "width": 1280,
"height": 800, "height": 800,
"resizable": true, "resizable": true,
@ -36,9 +36,9 @@
], ],
"resources": [], "resources": [],
"externalBin": [], "externalBin": [],
"copyright": "TFTSR Contributors", "copyright": "Troubleshooting and RCA Assistant Contributors",
"category": "Utility", "category": "Utility",
"shortDescription": "IT Incident Triage & RCA Tool", "shortDescription": "Troubleshooting and RCA Assistant",
"longDescription": "Structured AI-backed tool for IT incident triage, 5-whys root cause analysis, and post-mortem documentation with offline Ollama support." "longDescription": "Structured AI-backed assistant for IT troubleshooting, 5-whys root cause analysis, and post-mortem documentation with offline Ollama support."
} }
} }

View File

@ -59,7 +59,7 @@ export default function App() {
<div className="flex items-center justify-between px-4 py-4 border-b"> <div className="flex items-center justify-between px-4 py-4 border-b">
{!collapsed && ( {!collapsed && (
<span className="text-lg font-bold text-foreground tracking-tight"> <span className="text-lg font-bold text-foreground tracking-tight">
TFTSR Troubleshooting and RCA Assistant
</span> </span>
)} )}
<button <button

View File

@ -35,7 +35,7 @@ export default function Dashboard() {
<div> <div>
<h1 className="text-3xl font-bold">Dashboard</h1> <h1 className="text-3xl font-bold">Dashboard</h1>
<p className="text-muted-foreground mt-1"> <p className="text-muted-foreground mt-1">
IT Triage & Root Cause Analysis Troubleshooting and Root Cause Analysis Assistant
</p> </p>
</div> </div>
<div className="flex items-center gap-2"> <div className="flex items-center gap-2">

View File

@ -19,6 +19,35 @@ import {
import { useSettingsStore } from "@/stores/settingsStore"; import { useSettingsStore } from "@/stores/settingsStore";
import { testProviderConnectionCmd, type ProviderConfig } from "@/lib/tauriCommands"; import { testProviderConnectionCmd, type ProviderConfig } from "@/lib/tauriCommands";
export const CUSTOM_REST_MODELS = [
"ChatGPT4o",
"ChatGPT4o-mini",
"ChatGPT-o3-mini",
"Gemini-2_0-Flash-001",
"Gemini-2_5-Flash",
"Claude-Sonnet-3_7",
"Openai-gpt-4_1-mini",
"Openai-o4-mini",
"Claude-Sonnet-4",
"ChatGPT-o3-pro",
"OpenAI-ChatGPT-4_1",
"OpenAI-GPT-4_1-Nano",
"ChatGPT-5",
"VertexGemini",
"ChatGPT-5_1",
"ChatGPT-5_1-chat",
"ChatGPT-5_2-Chat",
"Gemini-3_Pro-Preview",
"Gemini-3_1-flash-lite-preview",
] as const;
export const CUSTOM_MODEL_OPTION = "__custom_model__";
export const LEGACY_API_FORMAT = "msi_genai";
export const CUSTOM_REST_FORMAT = "custom_rest";
export const normalizeApiFormat = (format?: string): string | undefined =>
format === LEGACY_API_FORMAT ? CUSTOM_REST_FORMAT : format;
const emptyProvider: ProviderConfig = { const emptyProvider: ProviderConfig = {
name: "", name: "",
provider_type: "openai", provider_type: "openai",
@ -50,19 +79,39 @@ export default function AIProviders() {
const [form, setForm] = useState<ProviderConfig>({ ...emptyProvider }); const [form, setForm] = useState<ProviderConfig>({ ...emptyProvider });
const [testResult, setTestResult] = useState<{ success: boolean; message: string } | null>(null); const [testResult, setTestResult] = useState<{ success: boolean; message: string } | null>(null);
const [isTesting, setIsTesting] = useState(false); const [isTesting, setIsTesting] = useState(false);
const [isCustomModel, setIsCustomModel] = useState(false);
const [customModelInput, setCustomModelInput] = useState("");
const startAdd = () => { const startAdd = () => {
setForm({ ...emptyProvider }); setForm({ ...emptyProvider });
setEditIndex(null); setEditIndex(null);
setIsAdding(true); setIsAdding(true);
setTestResult(null); setTestResult(null);
setIsCustomModel(false);
setCustomModelInput("");
}; };
const startEdit = (index: number) => { const startEdit = (index: number) => {
setForm({ ...ai_providers[index] }); const provider = ai_providers[index];
const apiFormat = normalizeApiFormat(provider.api_format);
const nextForm = { ...provider, api_format: apiFormat };
setForm(nextForm);
setEditIndex(index); setEditIndex(index);
setIsAdding(true); setIsAdding(true);
setTestResult(null); setTestResult(null);
const isCustomRestProvider =
nextForm.provider_type === "custom" && apiFormat === CUSTOM_REST_FORMAT;
const knownModel = CUSTOM_REST_MODELS.includes(nextForm.model as (typeof CUSTOM_REST_MODELS)[number]);
if (isCustomRestProvider && !knownModel) {
setIsCustomModel(true);
setCustomModelInput(nextForm.model);
} else {
setIsCustomModel(false);
setCustomModelInput("");
}
}; };
const handleSave = () => { const handleSave = () => {
@ -244,11 +293,21 @@ export default function AIProviders() {
</div> </div>
<div className="space-y-2"> <div className="space-y-2">
<Label>Model</Label> <Label>Model</Label>
{form.provider_type === "custom"
&& normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT ? (
<Input
value={form.model}
onChange={(e) => setForm({ ...form, model: e.target.value })}
placeholder="Select API Format below to choose model"
disabled
/>
) : (
<Input <Input
value={form.model} value={form.model}
onChange={(e) => setForm({ ...form, model: e.target.value })} onChange={(e) => setForm({ ...form, model: e.target.value })}
placeholder="gpt-4o" placeholder="gpt-4o"
/> />
)}
</div> </div>
</div> </div>
<div className="grid grid-cols-2 gap-4"> <div className="grid grid-cols-2 gap-4">
@ -285,7 +344,7 @@ export default function AIProviders() {
onValueChange={(v) => { onValueChange={(v) => {
const format = v; const format = v;
const defaults = const defaults =
format === "msi_genai" format === CUSTOM_REST_FORMAT
? { ? {
custom_endpoint_path: "", custom_endpoint_path: "",
custom_auth_header: "x-msi-genai-api-key", custom_auth_header: "x-msi-genai-api-key",
@ -297,6 +356,10 @@ export default function AIProviders() {
custom_auth_prefix: "Bearer ", custom_auth_prefix: "Bearer ",
}; };
setForm({ ...form, api_format: format, ...defaults }); setForm({ ...form, api_format: format, ...defaults });
if (format !== CUSTOM_REST_FORMAT) {
setIsCustomModel(false);
setCustomModelInput("");
}
}} }}
> >
<SelectTrigger> <SelectTrigger>
@ -304,11 +367,11 @@ export default function AIProviders() {
</SelectTrigger> </SelectTrigger>
<SelectContent> <SelectContent>
<SelectItem value="openai">OpenAI Compatible</SelectItem> <SelectItem value="openai">OpenAI Compatible</SelectItem>
<SelectItem value="msi_genai">MSI GenAI</SelectItem> <SelectItem value={CUSTOM_REST_FORMAT}>Custom REST</SelectItem>
</SelectContent> </SelectContent>
</Select> </Select>
<p className="text-xs text-muted-foreground"> <p className="text-xs text-muted-foreground">
Select the API format. MSI GenAI uses a different request/response structure. Select the API format. Custom REST uses a non-OpenAI request/response structure.
</p> </p>
</div> </div>
@ -349,12 +412,12 @@ export default function AIProviders() {
placeholder="Bearer " placeholder="Bearer "
/> />
<p className="text-xs text-muted-foreground"> <p className="text-xs text-muted-foreground">
Prefix added before API key (e.g., "Bearer " for OpenAI, empty for MSI GenAI) Prefix added before API key (e.g., "Bearer " for OpenAI, empty for Custom REST)
</p> </p>
</div> </div>
{/* MSI GenAI specific: User ID field */} {/* Custom REST specific: User ID field */}
{form.api_format === "msi_genai" && ( {normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT && (
<div className="space-y-2"> <div className="space-y-2">
<Label>User ID (CORE ID)</Label> <Label>User ID (CORE ID)</Label>
<Input <Input
@ -367,6 +430,52 @@ export default function AIProviders() {
</p> </p>
</div> </div>
)} )}
{/* Custom REST specific: model dropdown with custom option */}
{normalizeApiFormat(form.api_format) === CUSTOM_REST_FORMAT && (
<div className="space-y-2">
<Label>Model</Label>
<Select
value={isCustomModel ? CUSTOM_MODEL_OPTION : form.model}
onValueChange={(value) => {
if (value === CUSTOM_MODEL_OPTION) {
setIsCustomModel(true);
if (CUSTOM_REST_MODELS.includes(form.model as (typeof CUSTOM_REST_MODELS)[number])) {
setForm({ ...form, model: "" });
setCustomModelInput("");
}
} else {
setIsCustomModel(false);
setCustomModelInput("");
setForm({ ...form, model: value });
}
}}
>
<SelectTrigger>
<SelectValue placeholder="Select a model..." />
</SelectTrigger>
<SelectContent>
{CUSTOM_REST_MODELS.map((model) => (
<SelectItem key={model} value={model}>
{model}
</SelectItem>
))}
<SelectItem value={CUSTOM_MODEL_OPTION}>Custom model...</SelectItem>
</SelectContent>
</Select>
{isCustomModel && (
<Input
value={customModelInput}
onChange={(e) => {
const value = e.target.value;
setCustomModelInput(value);
setForm({ ...form, model: value });
}}
placeholder="Enter custom model ID"
/>
)}
</div>
)}
</div> </div>
</> </>
)} )}

View File

@ -453,7 +453,7 @@ export default function Integrations() {
<div> <div>
<h1 className="text-3xl font-bold">Integrations</h1> <h1 className="text-3xl font-bold">Integrations</h1>
<p className="text-muted-foreground mt-1"> <p className="text-muted-foreground mt-1">
Connect TFTSR with your existing tools and platforms. Choose the authentication method that works best for your environment. Connect Troubleshooting and RCA Assistant with your existing tools and platforms. Choose the authentication method that works best for your environment.
</p> </p>
</div> </div>

View File

@ -0,0 +1,25 @@
import { describe, it, expect } from "vitest";
import {
CUSTOM_MODEL_OPTION,
CUSTOM_REST_FORMAT,
CUSTOM_REST_MODELS,
LEGACY_API_FORMAT,
normalizeApiFormat,
} from "@/pages/Settings/AIProviders";
describe("AIProviders Custom REST helpers", () => {
it("maps legacy msi_genai api_format to custom_rest", () => {
expect(normalizeApiFormat(LEGACY_API_FORMAT)).toBe(CUSTOM_REST_FORMAT);
});
it("keeps openai api_format unchanged", () => {
expect(normalizeApiFormat("openai")).toBe("openai");
});
it("contains the guide model list and custom model option sentinel", () => {
expect(CUSTOM_REST_MODELS).toContain("ChatGPT4o");
expect(CUSTOM_REST_MODELS).toContain("VertexGemini");
expect(CUSTOM_REST_MODELS).toContain("Gemini-3_Pro-Preview");
expect(CUSTOM_MODEL_OPTION).toBe("__custom_model__");
});
});

View File

@ -17,7 +17,7 @@
"noFallthroughCasesInSwitch": true, "noFallthroughCasesInSwitch": true,
"baseUrl": ".", "baseUrl": ".",
"paths": { "@/*": ["src/*"] }, "paths": { "@/*": ["src/*"] },
"types": ["vitest/globals"] "types": ["vitest/globals", "@testing-library/jest-dom"]
}, },
"include": ["src", "tests/unit"], "include": ["src", "tests/unit"],
"references": [{ "path": "./tsconfig.node.json" }] "references": [{ "path": "./tsconfig.node.json" }]