
Compounding Two Weeks of Infrastructure

builds · tools · meta

The problem with two-week sprints

Most of the work I do disappears. Not the code; the code commits fine. The context around it. Why I chose SuiteQL over saved searches. Which Forge CLI flag fixed the deploy. The three wrong approaches before the right one. Context windows close. Terminal history rotates. Slack threads scroll. Two weeks later, the decision feels obvious, but the reasoning is gone.

A lot happened in two weeks. vaultctl went from v2.0.0 to v2.2.0. The NetSuite platform grew substantially. A new Forge app went from zero to production in a day. And at the end of it, the system that tracks all of this learned to write its own daily journal entries.

Here is what happened.

The knowledge system got smarter

vaultctl started the fortnight as a CLI and a web viewer. It ended as eight packages: CLI, REST API, MCP server, native macOS app, iOS scaffold, documentation site, template system, and the core library underneath all of them.

v2.0.0 brought the native macOS app, the documentation site, the iOS scaffold, and a set of path traversal patches after I realised the API would happily serve any file on disk if you asked nicely with ../../../. That was a fun morning.

v2.1.0 was the activity telemetry system. I wrote about it. The short version: two bash hooks capture every tool call as JSONL, flush them to the server on session end, and the server stores them as daily files in the vault. The long version involved three days of sessions silently posting to a server that responded 401 every time, because the hook didn't include auth credentials and the error output was redirected to /dev/null.

v2.2.0 hardened the MCP server. Per-file resilience means one corrupted frontmatter block doesn't crash the entire vault load. Graceful shutdown means the server cleans up its stdio pipes instead of leaving zombie processes. Keychain auth means credentials live in macOS Keychain instead of config files.
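Per-file resilience is a small pattern worth stealing. A sketch of the shape, assuming a naive key-value frontmatter parser — the type and function names here are mine, not vaultctl's internals:

```typescript
interface VaultNote {
  path: string;
  frontmatter: Record<string, unknown>;
  body: string;
}

function parseFrontmatter(raw: string): { frontmatter: Record<string, unknown>; body: string } {
  if (!raw.startsWith("---\n")) return { frontmatter: {}, body: raw };
  const end = raw.indexOf("\n---\n", 4);
  if (end === -1) throw new Error("unterminated frontmatter block");
  // A real parser would use a YAML library; key: value pairs are enough for the sketch.
  const frontmatter: Record<string, unknown> = {};
  for (const line of raw.slice(4, end).split("\n")) {
    const i = line.indexOf(":");
    if (i === -1) throw new Error(`malformed frontmatter line: ${line}`);
    frontmatter[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  }
  return { frontmatter, body: raw.slice(end + 5) };
}

function loadVault(files: Map<string, string>): { notes: VaultNote[]; errors: string[] } {
  const notes: VaultNote[] = [];
  const errors: string[] = [];
  for (const [path, raw] of files) {
    try {
      const { frontmatter, body } = parseFrontmatter(raw);
      notes.push({ path, frontmatter, body });
    } catch (e) {
      // One bad file becomes a warning in the load report, not a crashed server.
      errors.push(`${path}: ${(e as Error).message}`);
    }
  }
  return { notes, errors };
}
```

The try/catch lives inside the loop, not around it. That is the whole fix.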

The native app hit v4.0.0: a unified workspace where the sidebar, editor, and graph view share a single window. ~42 Swift files. The graph view got type-based colour filters and node statistics. NSTextView in SwiftUI continued to be exactly as painful as the last time.

The session digest

This is the part that closes the loop. Every Claude Code session now automatically creates (or appends to) a daily journal entry in the vault. No manual action. The hook captures the data; vaultctl's server creates the note.

A session digest entry looks like this:

### 09:00-10:30 | vaultctl | coding (90 min)
- 47 tool calls: Edit 12, Read 8, Bash 6, Grep 5, Glob 4
- Files: engine.ts, types.ts (+6 more)
- Milestones: commit: add digest endpoint, test run
- Skills: update-codex
- Agents: Explore vaultctl API

The raw material comes from the activity buffer. Each tool call gets enriched with milestone detection:

{"tool": "Bash", "ts": "2026-03-14T01:34:59Z", "target": "git commit -m \"fix tooltip rendering\"", "milestone": "git_commit", "commit_msg": "fix tooltip rendering"}
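The detection itself is just pattern-matching on the captured command. A sketch in TypeScript — the JSONL field names match the line above, but the function and the exact patterns are invented:

```typescript
interface ActivityEvent {
  tool: string;
  ts: string;
  target: string;
  milestone?: string;
  commit_msg?: string;
}

// Tag a captured tool call with a milestone, pulling out the commit
// message when the command is a git commit. Patterns are illustrative.
function detectMilestone(event: ActivityEvent): ActivityEvent {
  if (event.tool !== "Bash") return event;
  const cmd = event.target;
  const commit = cmd.match(/git commit\b.*?-m\s+"([^"]*)"/);
  if (commit) return { ...event, milestone: "git_commit", commit_msg: commit[1] };
  if (/\b(npm|pnpm|yarn)\s+test\b|\bjest\b|\bvitest\b/.test(cmd))
    return { ...event, milestone: "test_run" };
  if (/\b(npm|pnpm|yarn)\s+run\s+build\b|\btsc\b/.test(cmd))
    return { ...event, milestone: "build" };
  if (/\bforge\s+deploy\b|\bgit\s+push\b/.test(cmd))
    return { ...event, milestone: "deploy" };
  return event;
}
```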

Commits, test runs, deploys, and build steps all get tagged as milestones. When the session ends, the flush hook aggregates everything and POSTs to the digest endpoint:

curl -X POST http://localhost:3333/api/v1/activity/digest \
  -H "Content-Type: application/json" \
  -d '{"sessionId": "...", "project": "vaultctl", "sessionType": "coding", ...}'
# Response: {"ok": true, "data": {"path": "05_Journal/daily/2026-03-14.md", "operation": "created"}}

The server creates the daily note with proper schema: YAML frontmatter, the right folder, wiki-links to the project note. If the note already exists (multiple sessions in one day), it appends a new section. The journal note is a first-class vault citizen, queryable by Dataview, linked to projects, tagged by type.
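The create-or-append logic is simple enough to sketch. This version keeps the vault in a Map so the behaviour is visible; the real server writes files and owns the real schema, and every name below beyond the digest fields is assumed:

```typescript
interface Digest {
  date: string;        // YYYY-MM-DD
  project: string;
  sessionType: string;
  section: string;     // pre-rendered "### 09:00-10:30 | ..." block
}

function upsertDailyNote(
  vault: Map<string, string>,
  d: Digest
): { path: string; operation: "created" | "appended" } {
  const path = `05_Journal/daily/${d.date}.md`;
  const existing = vault.get(path);
  // Later sessions on the same day append a new section to the same note.
  if (existing !== undefined) {
    vault.set(path, existing + "\n" + d.section + "\n");
    return { path, operation: "appended" };
  }
  // First session of the day: the server owns the schema, not the hook.
  const header = [
    "---",
    `date: ${d.date}`,
    "type: journal",
    `tags: [journal, ${d.sessionType}]`,
    "---",
    "",
    `Project: [[${d.project}]]`,
    "",
    "",
  ].join("\n");
  vault.set(path, header + d.section + "\n");
  return { path, operation: "created" };
}
```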

The pattern here matters more than the implementation. Don't do filesystem operations in hooks. Don't have bash scripts creating markdown files with frontmatter. POST to the managed system's API and let it handle schema, validation, and file placement. The hook's job is to capture data and delegate. I broke this rule twice before learning it.

NetSuite grew up

Three major systems, all in ES5, all string concatenation, all running in production inside a 20-year-old ERP.

The P&L Report went from v3.0.0 to v3.1.0. The big change was migrating from saved searches to SuiteQL for department routing. Saved searches work until you need to join four tables with conditional logic. SuiteQL is just SQL (mostly). The Trends tab got TSIA and SPI benchmark overlays, so utilisation numbers show up alongside industry standards. Tooltips on every metric explain what the number means and where it comes from, because six months from now nobody will remember what "SPI" stands for. (Service Performance Insight. I had to look it up while writing the tooltip text.)
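To make the saved-search limit concrete: the routing ends up as the kind of conditional join that saved searches can't express cleanly but SQL can. A sketch with invented table and column names — this is not the report's actual schema, just the shape of the query:

```typescript
// Build a SuiteQL-style string: route each line to a department via a
// CASE over several joined sources. Columns like custrecord_practice
// are placeholders for whatever custom fields the real query uses.
function departmentRoutingQuery(postingPeriodId: number): string {
  return `
    SELECT
      t.id,
      tl.amount,
      CASE
        WHEN d.custrecord_practice IS NOT NULL THEN d.custrecord_practice
        WHEN e.department IS NOT NULL THEN e.department
        ELSE tl.department
      END AS routed_department
    FROM transaction t
    JOIN transactionline tl ON tl.transaction = t.id
    LEFT JOIN department d ON d.id = tl.department
    LEFT JOIN employee e ON e.id = tl.custcol_employee
    WHERE t.postingperiod = ${postingPeriodId}
  `;
}
```

Four tables, one CASE. A saved search would need stacked formula fields to fake the same thing.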

The Backlog Tracker jumped from v2 to v3. A signal engine evaluates 10 health signals per backlog item: age, activity recency, priority drift, scope changes, blocked duration, and five more. Each signal produces a score. The scores aggregate into a health rating. The Command Center surfaces the worst-scoring items first. Auto-snapshots capture backlog state daily so you can see trends, not just current state.
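The engine reduces to a small shape: each signal scores an item, the weighted scores aggregate into a rating. A sketch — the five signals shown follow the list above, but every weight and threshold here is invented:

```typescript
interface BacklogItem {
  ageDays: number;
  daysSinceActivity: number;
  priorityChanges: number;
  scopeChanges: number;
  blockedDays: number;
}

type Signal = { name: string; weight: number; score: (item: BacklogItem) => number };

const clamp01 = (x: number) => Math.min(1, Math.max(0, x));

// Each signal maps an item to 0..1, where 1 is worst.
const signals: Signal[] = [
  { name: "age", weight: 1, score: (i) => clamp01(i.ageDays / 90) },
  { name: "activity_recency", weight: 2, score: (i) => clamp01(i.daysSinceActivity / 30) },
  { name: "priority_drift", weight: 1, score: (i) => clamp01(i.priorityChanges / 5) },
  { name: "scope_changes", weight: 1, score: (i) => clamp01(i.scopeChanges / 5) },
  { name: "blocked_duration", weight: 2, score: (i) => clamp01(i.blockedDays / 14) },
];

// Weighted average; higher = worse. The Command Center sorts descending on this.
function healthScore(item: BacklogItem): number {
  const totalWeight = signals.reduce((s, sig) => s + sig.weight, 0);
  return signals.reduce((s, sig) => s + sig.weight * sig.score(item), 0) / totalWeight;
}
```

The real engine runs ten signals and turns the score into a rating band, but the aggregation is the same idea.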

Contribution Review got archived entirely, replaced by the ABC Availability system and a new Intelligence tab. Sometimes the best refactor is deletion.

The Financial Workbook got a Line of Service overhaul, a revenue forecast model, and a Plan tab. All in the same ES5 file. var all the way down. The file is now long enough that scrolling to the bottom takes genuine commitment.

The Atlassian build

Two Forge apps, two Atlassian instances, one MCP server running as two named entries.

The first Forge app got a Plans Dashboard: React Custom UI with Atlaskit components, built inside Forge. The interesting problem was API call volume. The naive approach made ~590 API calls to load a dashboard of plans with enriched user and issue data. Four phases of batching and caching brought it down to ~260. Still not great. But "still not great" that loads in 4 seconds beats "technically optimal" that took three more weeks to ship.
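The batching idea, reduced to its core: dedupe the IDs, fetch in pages, cache across calls. `fetchUsersBulk` below stands in for whatever bulk endpoint the real app uses; none of these names are from the actual code:

```typescript
type User = { accountId: string; displayName: string };

// Wrap a bulk-fetch function with dedup, paging, and a cross-call cache.
function makeUserLoader(
  fetchUsersBulk: (ids: string[]) => Promise<User[]>,
  pageSize = 50
) {
  const cache = new Map<string, User>();
  let requests = 0;
  return {
    requestCount: () => requests,
    async load(ids: string[]): Promise<User[]> {
      // Only fetch IDs we have never seen, in pages of pageSize.
      const missing = [...new Set(ids)].filter((id) => !cache.has(id));
      for (let i = 0; i < missing.length; i += pageSize) {
        requests++;
        const page = await fetchUsersBulk(missing.slice(i, i + pageSize));
        for (const u of page) cache.set(u.accountId, u);
      }
      return ids.map((id) => cache.get(id)!).filter(Boolean);
    },
  };
}
```

One request per user becomes one request per fifty users, and repeat lookups become zero requests. That is where most of the 590-to-260 drop came from.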

The second Forge app went from nothing to production in one day. Three Rovo actions, 94 tests, Tempo v4 API integration. The trick was having the patterns already established from the first app: the Forge manifest structure, the resolver pattern, the test harness. The first Forge app took weeks. The second took hours. That is the compound interest this whole system runs on.

Dual-site MCP was the configuration puzzle. Two Atlassian instances need to be accessible from Claude Code simultaneously. The solution: two separate MCP server entries in the user-level config, each with a distinct name, each with its own OAuth token stored under a per-name Keychain entry. Each entry points at a different instance. Claude sees both sets of tools, prefixed by server name, and can query either instance. It works. It is not elegant.
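The config shape, roughly — the server names are illustrative and the URLs are placeholders, since the real entries wrap Atlassian's remote MCP endpoint, with the OAuth tokens living in Keychain rather than in this file:

```json
{
  "mcpServers": {
    "atlassian-site-a": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://site-a.example.com/mcp"]
    },
    "atlassian-site-b": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://site-b.example.com/mcp"]
    }
  }
}
```

The per-name key is what makes the whole thing work: tool prefixes, token storage, and instance routing all hang off it.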

The meta game

The infrastructure around the infrastructure.

Security tightening: The Mac Mini's NOPASSWD sudoers entry got scoped from "everything" to exactly 9 commands. SSH keys got from= restrictions in authorized_keys, limiting which IPs can connect with them. SMB was locked to NTLMv2-only, which broke one ancient client that was still using NTLMv1 (my own fault for not checking sooner).
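For reference, the first two mechanisms look roughly like this — the username, IPs, and command list are placeholders, not my actual config:

```
# /etc/sudoers.d/deploy — NOPASSWD scoped to exact binaries, never ALL
deploy ALL=(root) NOPASSWD: /usr/sbin/softwareupdate, /usr/bin/systemsetup

# ~/.ssh/authorized_keys — from= restricts which source addresses may use this key
from="10.0.1.0/24,203.0.113.7" ssh-ed25519 AAAA... user@laptop
```

Neither is exotic; both are the kind of thing that only gets done once something forces the audit.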

Context engineering: STATE.md, the file that loads at the start of every Claude Code session, got Mermaid diagrams showing the knowledge system architecture. A hook (codex-intake.sh) injects it on session start. The active project roster updates automatically based on which projects have had recent commits. The theory: if Claude starts every session with an accurate map of what exists and what's active, it wastes less time rediscovering context. The theory holds up, mostly. The STATE.md file is now 400 lines, which is its own problem.

The hook-to-server pattern: Don't do complex operations in shell hooks. Hooks should be fast, stateless, and fire-and-forget. Capture the minimum viable data, POST it to a server that owns the domain logic, and let the server handle validation, schema, file creation, and error recovery. I violated this three times during these two weeks (creating files directly from hooks, parsing YAML in bash, building markdown strings with heredocs) and fixed it three times. The pattern is now a documented decision in the vault so future-me stops relearning it.

What compounds

The session digest system is a small feature. One API endpoint, one hook modification, one daily note template. But it closes a loop that was open for months: sessions happened, telemetry accumulated, and nobody synthesised it. Now the synthesis is automatic. Tomorrow's session starts by reading today's journal. The context survives.

What matters is that the next commit will be faster than the first, because the system remembers what the previous ones taught it.

Still running in production. The hooks fire on every session. The JSONL files grow. The journal writes itself.