tftsr-devops_investigation/node_modules/parse5/dist/common/token.d.ts
Shaun Arman 8839075805 feat: initial implementation of TFTSR IT Triage & RCA application
Implements Phases 1-8 of the TFTSR implementation plan.

Rust backend (Tauri 2.x, src-tauri/):
- Multi-provider AI: OpenAI-compatible, Anthropic, Gemini, Mistral, Ollama
- PII detection engine: 11 regex patterns with overlap resolution
- SQLCipher AES-256 encrypted database with 10 versioned migrations
- 28 Tauri IPC commands for triage, analysis, document, and system ops
- Ollama: hardware probe, model recommendations, pull/delete with events
- RCA and blameless post-mortem Markdown document generators
- PDF export via printpdf
- Audit log: SHA-256 hash of every external data send
- Integration stubs for Confluence, ServiceNow, Azure DevOps (v0.2)
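The PII engine above is implemented in Rust; the following is a minimal TypeScript sketch of the overlap-resolution idea only (the pattern names and regexes here are illustrative stand-ins, not the actual 11 patterns):

```typescript
// Sketch: regex-based PII detection with overlap resolution.
// Patterns below are illustrative placeholders, not the real rule set.
interface PiiMatch {
  kind: string;
  start: number; // inclusive offset
  end: number;   // exclusive offset
  text: string;
}

const PATTERNS: Array<{ kind: string; re: RegExp }> = [
  { kind: "email", re: /[\w.+-]+@[\w-]+\.[\w.]+/g },
  { kind: "ipv4", re: /\b(?:\d{1,3}\.){3}\d{1,3}\b/g },
];

// Collect all matches, then resolve overlaps: longer matches win,
// ties broken by earlier start offset.
function detectPii(input: string): PiiMatch[] {
  const all: PiiMatch[] = [];
  for (const { kind, re } of PATTERNS) {
    for (const m of input.matchAll(re)) {
      const start = m.index ?? 0;
      all.push({ kind, start, end: start + m[0].length, text: m[0] });
    }
  }
  // Sort by descending length, then ascending start, then greedily keep
  // any candidate that does not overlap an already-kept match.
  all.sort((a, b) => (b.end - b.start) - (a.end - a.start) || a.start - b.start);
  const kept: PiiMatch[] = [];
  for (const cand of all) {
    if (!kept.some((k) => cand.start < k.end && k.start < cand.end)) {
      kept.push(cand);
    }
  }
  return kept.sort((a, b) => a.start - b.start);
}
```

With this policy, an address like `admin@10.0.0.1` yields a single `email` match rather than an extra overlapping `ipv4` hit.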

Frontend (React 18 + TypeScript + Vite, src/):
- 9 pages: full triage workflow NewIssue→LogUpload→Triage→Resolution→RCA→Postmortem→History+Settings
- 7 components: ChatWindow, TriageProgress, PiiDiffViewer, DocEditor, HardwareReport, ModelSelector, UI primitives
- 3 Zustand stores: session, settings (persisted), history
- Type-safe tauriCommands.ts matching Rust backend types exactly
- 8 IT domain system prompts (Linux, Windows, Network, K8s, DB, Virt, HW, Obs)
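A minimal sketch of the type-safe wrapper idea behind tauriCommands.ts, assuming a generic invoke function (stubbed here; the real code would call `invoke` from `@tauri-apps/api`, and the command map shown is illustrative, not the actual 28 commands):

```typescript
// Sketch: each command name maps to its argument and return types, so
// every call site is checked at compile time against the Rust backend's
// contract. Command names and shapes here are illustrative.
interface CommandMap {
  start_triage: { args: { title: string }; result: { sessionId: string } };
  get_history: { args: Record<string, never>; result: Array<{ sessionId: string }> };
}

// Stand-in for @tauri-apps/api's invoke; the real handler is Rust.
async function rawInvoke(cmd: string, _args: unknown): Promise<unknown> {
  if (cmd === "start_triage") return { sessionId: "demo" };
  return [];
}

async function invokeCommand<K extends keyof CommandMap>(
  cmd: K,
  args: CommandMap[K]["args"],
): Promise<CommandMap[K]["result"]> {
  return (await rawInvoke(cmd, args)) as CommandMap[K]["result"];
}
```

Usage: `const s = await invokeCommand("start_triage", { title: "disk full" })` gives `s` the typed shape `{ sessionId: string }`, and passing a wrong argument object is a compile error.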

DevOps:
- .woodpecker/test.yml: rustfmt, clippy, cargo test, tsc, vitest on every push
- .woodpecker/release.yml: linux/amd64 + linux/arm64 builds, Gogs release upload

Verified:
- cargo check: zero errors
- tsc --noEmit: zero errors
- vitest run: 13/13 unit tests passing

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
2026-03-14 22:36:25 -05:00


import type { TAG_ID } from './html.js';
export declare enum TokenType {
    CHARACTER = 0,
    NULL_CHARACTER = 1,
    WHITESPACE_CHARACTER = 2,
    START_TAG = 3,
    END_TAG = 4,
    COMMENT = 5,
    DOCTYPE = 6,
    EOF = 7,
    HIBERNATION = 8
}
export interface Location {
    /** One-based line index of the first character. */
    startLine: number;
    /** One-based column index of the first character. */
    startCol: number;
    /** Zero-based first character index. */
    startOffset: number;
    /** One-based line index of the last character. */
    endLine: number;
    /** One-based column index of the last character. Points directly *after* the last character. */
    endCol: number;
    /** Zero-based last character index. Points directly *after* the last character. */
    endOffset: number;
}
export interface LocationWithAttributes extends Location {
    /** Start tag attributes' location info. */
    attrs?: Record<string, Location>;
}
export interface ElementLocation extends LocationWithAttributes {
    /** Element's start tag location info. */
    startTag?: Location;
    /**
     * Element's end tag location info.
     * This property is undefined if the element has no closing tag.
     */
    endTag?: Location;
}
interface TokenBase {
    readonly type: TokenType;
    location: Location | null;
}
export interface DoctypeToken extends TokenBase {
    readonly type: TokenType.DOCTYPE;
    name: string | null;
    forceQuirks: boolean;
    publicId: string | null;
    systemId: string | null;
}
export interface Attribute {
    /** The name of the attribute. */
    name: string;
    /** The namespace of the attribute. */
    namespace?: string;
    /** The namespace-related prefix of the attribute. */
    prefix?: string;
    /** The value of the attribute. */
    value: string;
}
export interface TagToken extends TokenBase {
    readonly type: TokenType.START_TAG | TokenType.END_TAG;
    tagName: string;
    /** Used to cache the ID of the tag name. */
    tagID: TAG_ID;
    selfClosing: boolean;
    ackSelfClosing: boolean;
    attrs: Attribute[];
    location: LocationWithAttributes | null;
}
export declare function getTokenAttr(token: TagToken, attrName: string): string | null;
export interface CommentToken extends TokenBase {
    readonly type: TokenType.COMMENT;
    data: string;
}
export interface EOFToken extends TokenBase {
    readonly type: TokenType.EOF;
}
export interface CharacterToken extends TokenBase {
    type: TokenType.CHARACTER | TokenType.NULL_CHARACTER | TokenType.WHITESPACE_CHARACTER;
    chars: string;
}
export type Token = DoctypeToken | TagToken | CommentToken | EOFToken | CharacterToken;
export {};
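The declarations above are parse5's public token types. As a hedged sketch, `getTokenAttr` plausibly reduces to a linear scan over `attrs` (the real parse5 implementation may differ); the snippet below uses minimal local copies of the declared shapes so it is self-contained:

```typescript
// Self-contained sketch using minimal local copies of the Attribute and
// TagToken shapes declared above. The real getTokenAttr may differ.
interface Attribute {
  name: string;
  value: string;
}
interface TagTokenLike {
  tagName: string;
  attrs: Attribute[];
}

// Return the value of the named attribute, or null when absent,
// matching the declared `string | null` return type.
function getTokenAttrSketch(token: TagTokenLike, attrName: string): string | null {
  const attr = token.attrs.find((a) => a.name === attrName);
  return attr === undefined ? null : attr.value;
}
```

For example, for a start-tag token with `attrs: [{ name: "href", value: "/x" }]`, looking up `"href"` yields `"/x"` and looking up `"id"` yields `null`.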