Transformers
Learning about transformer architecture from an engineering standpoint. The model that’s clicking: attention is cross-token communication — tokens talking to earlier tokens. MLP is per-token computation — tokens thinking privately. Updated token vectors come out. Stack those two primitives together and you get the whole thing. Didn’t get far into the actual paper today; busy day.
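A minimal sketch of that mental model in PyTorch. This is my own toy version, not from any particular paper: names, dimensions, and the pre-LayerNorm residual layout are assumptions, but it shows the two primitives cleanly — attention mixes information across token positions, the MLP touches each token independently.

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """One transformer block: attention (tokens talk), then MLP (tokens think)."""
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        # The MLP sees one token vector at a time: no information
        # crosses positions in this half of the block.
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, d_model)
        T = x.size(1)
        # Causal mask (True = blocked): each token can only attend to
        # itself and earlier tokens.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)  # cross-token communication
        x = x + attn_out
        x = x + self.mlp(self.ln2(x))                     # per-token computation
        return x  # updated token vectors come out

x = torch.randn(2, 10, 64)   # 2 sequences, 10 tokens each
print(Block()(x).shape)      # torch.Size([2, 10, 64]) — same shape, updated vectors
```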
Praxis
Woke up early and checked on praxis. It’s in a good state — the end-to-end system is built out with notifications working. Some queries are running slow now that full fundamentals support for filings has landed.
Background agents
Built out the AI game today and it was cool to see the superpowers plugin work — used Haiku subagents to write a significant chunk of a game that actually worked well. /loop is up there as a favorite Claude Code feature. Trying to apply a similar workflow to Codex: run a repeatable task (TDD, for example) on every cycle. Want to try a new harness like Pi.
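A rough sketch of what one such cycle could look like, assuming a TDD shape: run the tests, hand the failure output to an agent, repeat until green. The `agent-fix` command here is a hypothetical placeholder — real Claude Code or Codex invocations will differ — but the loop structure is the point.

```python
import subprocess

def tdd_cycle(max_iters: int = 10) -> bool:
    """Sketch of a repeatable TDD loop: test, agent fix, re-test."""
    for i in range(max_iters):
        result = subprocess.run(["pytest", "-x", "-q"], capture_output=True, text=True)
        if result.returncode == 0:
            print(f"green after {i} fix cycle(s)")
            return True
        # 'agent-fix' is a placeholder for whatever harness command applies a
        # fix from the failure output; not a real Claude Code / Codex CLI call.
        subprocess.run(["agent-fix", "--prompt", f"Make the tests pass:\n{result.stdout}"])
    return False  # still red after max_iters cycles

if __name__ == "__main__":
    tdd_cycle()
```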
Core takeaway: background agents are critical. The goal is firing at 100% all the time — it’s a matter of good overnight prompts, letting agents run, and being able to context switch well.
At peak today: 3 worktrees firing at work + 4 concurrent agents on personal stuff — 7 things progressing simultaneously. Wild to see in practice. The interesting question is what kinds of workflows and skills you actually need to succeed at this. Today felt like a good glimpse of that. Want to be firing on all cylinders like this every day.
AI game
Planning to shift the objective of the Padmi game away from capture-the-king toward persuading people to make you king — more of a popularity contest. Feels like a better mechanic.
Health
Good eating day, chest and biceps.