reusify
Reuse your objects and functions for maximum speed. This technique will make any function run ~10% faster. You call your functions a lot, and it adds up quickly in hot code paths.
$ node benchmarks/createNoCodeFunction.js
Total time 53133
Total iterations 100000000
Iteration/s 1882069.5236482036

$ node benchmarks/reuseNoCodeFunction.js
Total time 50617
Total iterations 100000000
Iteration/s 1975620.838848608
The above benchmark uses fibonacci to simulate a real high-cpu load. The actual numbers might differ for your use case, but the difference should not.
The benchmark was taken using Node v6.10.0.
This library was extracted from fastparallel.
Example
var reusify = require('reusify')
var fib = require('reusify/benchmarks/fib')
var instance = reusify(MyObject)

// get an object from the cache,
// or create a new one when the cache is empty
var obj = instance.get()

// set the state
obj.num = 100
obj.func()

// reset the state.
// if the state contains any external objects,
// do not use the delete operator (it is slow);
// prefer setting them to null
obj.num = 0

// store the object back in the cache
instance.release(obj)
function MyObject () {
  // you need to define this property
  // so V8 can compile MyObject into a
  // hidden class; reusify also uses it to
  // chain cached instances (see the sketch
  // after this example)
  this.next = null
  this.num = 0

  var that = this

  // this function is never reallocated,
  // so it can be optimized by V8
  this.func = function () {
    if (null) {
      // do nothing
    } else {
      // calculates fibonacci
      fib(that.num)
    }
  }
}
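For reference, this is roughly how a reusify-style cache works under the hood: spare instances are chained through the required next property into a singly linked list, so get and release are O(1), and a new object is allocated only when the list is empty. A minimal sketch (simplified, not the library's exact source):

function reusifySketch (Constructor) {
  var head = new Constructor()
  var tail = head

  return {
    get: function () {
      var current = head
      if (current.next) {
        // pop the first spare instance off the list
        head = current.next
      } else {
        // the list is empty: allocate a fresh one
        head = new Constructor()
        tail = head
      }
      current.next = null
      return current
    },
    release: function (obj) {
      // append the instance to the spare list
      tail.next = obj
      tail = obj
    }
  }
}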
That first example was intended for synchronous code; let's see an async one:
var reusify = require('reusify')

var instance = reusify(MyObject)

for (var i = 0; i < 100; i++) {
  getData(i, console.log)
}

function getData (value, cb) {
  // take a pooled object, set its state,
  // and kick off the async work
  var obj = instance.get()
  obj.value = value
  obj.cb = cb
  obj.run()
}

function MyObject () {
  // define every property upfront so the
  // hidden class stays stable
  this.next = null
  this.value = null
  this.cb = null

  var that = this

  this.run = function () {
    // asyncOperation stands for any callback-style
    // async call; a stand-in sketch follows this example
    asyncOperation(that.value, that.handle)
  }

  this.handle = function (err, result) {
    that.cb(err, result)

    // reset the state and return the
    // object to the cache
    that.value = null
    that.cb = null
    instance.release(that)
  }
}
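The example assumes an asyncOperation function, which stands for whatever callback-style call your code actually makes. A trivial stand-in like this hypothetical one makes the snippet runnable:

function asyncOperation (value, cb) {
  // pretend to do some I/O, then call back
  // with the usual Node (err, result) signature
  setImmediate(cb, null, value * 2)
}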
Also note how, in both examples, the code that consumes an instance of MyObject resets the state to its initial condition just before storing the instance back in the cache. That's needed so that every subsequent request for an instance from the cache gets a clean one.
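If the reset logic grows beyond a couple of fields, one option (a hypothetical refactor, not part of the library) is to collect it in a single reset method and call that right before release:

function MyObject () {
  this.next = null
  this.value = null
  this.cb = null

  var that = this

  // one place that restores the initial state;
  // call it before every release
  this.reset = function () {
    that.value = null
    that.cb = null
  }
}

// in the consuming code, instead of
// resetting fields inline:
obj.reset()
instance.release(obj)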
Why
It is faster because V8 doesn't have to collect all the functions you create. On a short-lived benchmark, it is as fast as creating the nested function, but on a longer time frame it creates less pressure on the garbage collector.
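To make the contrast concrete, this is the shape of code the technique avoids: every call allocates a fresh handle closure that the garbage collector later has to reclaim (a sketch, reusing the hypothetical asyncOperation from above):

function getDataNoReuse (value, cb) {
  // a new handle closure is allocated here
  // on every single call
  asyncOperation(value, function handle (err, result) {
    cb(err, result)
  })
}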
Other examples
If you want to see a more complex example, check out middie and steed.
Acknowledgements
Thanks to Trevor Norris for getting me down the rabbit hole of performance, and thanks to Mathias Buus for suggesting that I share this trick.
License
MIT