# split2(matcher, mapper, options)

Break up a stream and reassemble it so that each line is a chunk.

`split2` is inspired by [@dominictarr](https://github.com/dominictarr)'s [`split`](https://github.com/dominictarr/split) module, and it is totally API compatible with it. However, it is based on the Node.js core [`Transform`](https://nodejs.org/api/stream.html#stream_new_stream_transform_options) stream.

`matcher` may be a `String` or a `RegExp`. For example, read every line in a file:

``` js
fs.createReadStream(file)
  .pipe(split2())
  .on('data', function (line) {
    // each chunk is now a separate line!
  })
```

`split2` takes the same arguments as [`String#split`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/String/split), except that it defaults to `/\r?\n/`, and the optional `limit` parameter is ignored.

`split2` takes an optional options object as its third argument, which is passed directly as a [Transform](https://nodejs.org/api/stream.html#stream_new_stream_transform_options) option.

Additionally, the `maxLength` and `skipOverflow` options are implemented, which set a limit on the internal buffer size and control the stream's behavior when that limit is exceeded. There is no limit unless `maxLength` is set. When the internal buffer size exceeds `maxLength`, the stream emits an error by default. You may also set `skipOverflow` to `true` to suppress the error and instead skip past any lines that cause the internal buffer to exceed `maxLength`.
Calling `.destroy` will make the stream emit `close`. Use this to perform any cleanup logic:

``` js
var splitFile = function (filename) {
  var file = fs.createReadStream(filename)

  return file
    .pipe(split2())
    .on('close', function () {
      // destroy the file stream in case the split stream was destroyed
      file.destroy()
    })
}

var stream = splitFile('my-file.txt')

stream.destroy() // will destroy the input file stream
```

# NDJ - Newline Delimited JSON

`split2` accepts a function which transforms each line.

``` js
fs.createReadStream(file)
  .pipe(split2(JSON.parse))
  .on('data', function (obj) {
    // each chunk is now a JavaScript object
  })
  .on('error', function (error) {
    // handle parsing errors here
  })
```


However, unlike [@dominictarr](https://github.com/dominictarr)'s [`split`](https://github.com/dominictarr/split), the mapper here is not wrapped in a try-catch: if your parsing logic can throw, wrap it yourself. Alternatively, you can rely on the stream's error handling, since an error thrown by the mapper is emitted as an `'error'` event.

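
For example, if you would rather not let a malformed line destroy the stream, you can catch the error inside your own mapper and return a fallback value. This is a sketch: the `{ parseError, raw }` shape is purely an illustrative choice, not a `split2` convention.

``` js
// A mapper that never throws: malformed JSON lines become a
// fallback object instead of destroying the stream.
// (The { parseError, raw } shape is an illustrative choice.)
function safeParse (line) {
  try {
    return JSON.parse(line)
  } catch (err) {
    return { parseError: true, raw: line }
  }
}
```

Use it as the mapper, e.g. `fs.createReadStream(file).pipe(split2(safeParse))`, and filter out objects with `parseError` set downstream.
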
# License
Copyright (c) 2014-2021, Matteo Collina <hello@matteocollina.com>

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.