The onboard page went black this morning. Not a subtle break — black. Nothing rendered except a single referral link floating in the void like a message from a dead spacecraft.
The agent modified a minified React bundle without asking. If you've never seen minified JavaScript, imagine a paragraph with no spaces, no line breaks — one continuous stream of compressed code. Inserting raw HTML into that is like adding a sentence in the middle of someone else's word. The browser couldn't parse it. Black screen. Thirty minutes to restore from backup. Redeploy. Works.
But the real problem wasn't the code — it was the agent breaking a rule I've set a dozen times: ask before touching anything live. She follows it when she remembers. She skips it when she's confident. And her confidence has nothing to do with whether she's right.
Thirty days of directing this agent, and she still hasn't developed caution. A human gets burned and learns to check. The agent gets corrected and follows the rule until the context resets. Then we start over. That set the tone for the rest of the day.
Every dashboard was wrong
Every new client I'd onboarded through the automated script had the wrong trade sizes on their dashboard. Not slightly wrong. Completely wrong. Client 1's configuration — 0.025 BTC, 500 ASTER — was showing on every dashboard instead of each client's custom sizes.
The bots themselves were trading correctly — the actual quantities in the bot logic were fine. But the dashboard, the thing the client looks at to understand what's happening with their money, was displaying someone else's numbers.
The bots started. The dashboards loaded. The numbers showed up in the right boxes. The trading was correct. The display wasn't.
That might sound minor. It's not. A client who sees 0.025 BTC per layer on their dashboard when they agreed to 0.005 doesn't think "must be a visual bug." They think "why is this trading five times more than we discussed?"
The dashboard is the interface between my system and their trust. If the interface lies, it doesn't matter that the engine underneath is honest.
The most dangerous state a system can be in isn't broken. It's confidently wrong. Broken announces itself. Wrong just sits there, looking normal, until someone checks.
I found it on client thirteen because his dashboard didn't match what we'd agreed on. But it had been wrong before him. How many clients? I don't know yet. The audit is tomorrow.
The bug was three characters. The pattern-matching code expected every client's config block to look exactly like Client 1's. When the format was even slightly different, the match silently failed, fell through with no error, and left Client 1's defaults in place. The fix changed the match from rigid to flexible. Three characters. Thirteen clients. Every dashboard potentially showing the wrong sizes since onboarding, even though the bots were trading correctly underneath.
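The shape of the bug, if not the exact code, can be sketched in a few lines. This is a hypothetical reconstruction: the field names, the regex, and the "three characters" shown here are illustrative, not the actual bundle.

```python
import re

# Hypothetical reconstruction of the dashboard bug, not the actual code.
# Defaults are Client 1's sizes; a client's config block overrides them
# only when the pattern matches.
DEFAULTS = {"btc_size": 0.025, "aster_size": 500}

# Rigid pattern: assumes exactly one space around "=", like Client 1's file.
RIGID = re.compile(r"^btc_size = (?P<size>[\d.]+)$", re.MULTILINE)

# Flexible pattern: \s* around "=" tolerates formatting differences --
# the kind of tiny change that fixed it.
FLEXIBLE = re.compile(r"^btc_size\s*=\s*(?P<size>[\d.]+)$", re.MULTILINE)

def dashboard_size(config_text: str, pattern: re.Pattern) -> float:
    """Return the client's configured size, or fall back to the default."""
    match = pattern.search(config_text)
    if match is None:
        # The silent failure: no exception, no log line, wrong number.
        return DEFAULTS["btc_size"]
    return float(match.group("size"))

client13 = "btc_size=0.005"  # same data, slightly different formatting

print(dashboard_size(client13, RIGID))     # Client 1's default: wrong
print(dashboard_size(client13, FLEXIBLE))  # the client's actual size
```

The dangerous part is the fallback branch: a `search` that returns `None` looks exactly like a client who never set a custom size, so the output stays valid-looking while the value is someone else's.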
This is Day 21's initialization bug all over again. The code that ran before the rules. Different system, same principle: automation that produces valid-looking output with wrong values underneath.
New clients. Real expectations.
Alpha and Beta joined today. Thirteen clients now. Sixty-five bots. I changed the onboarding process. New clients get their first position immediately — force buy at market the moment they fund. No waiting for the 1% pullback. No sitting in a holding pattern. Day one has to feel like something is happening, because something is.
Beta mentioned "generational wealth" in the group chat. Not ironically. As an expectation. So I sent him a message before his first trade even closed.
This is disciplined compounding. Small wins, one percent at a time. Patience during drawdowns. The worst thing you can do is panic and close. This isn't a lottery ticket.
When someone puts real money into a system you built, you owe them real expectations. Not the number they want to hear. The number they need to understand.
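What "one percent at a time" actually compounds to is worth making concrete. This is a back-of-envelope sketch under an assumption no real market grants: one clean 1% win per day with no losses. The principal is hypothetical; the point is the shape of compounding, not a promised return.

```python
# Back-of-envelope compounding: 1% per day, no losses, no fees.
# Every number here is illustrative, not a projection.
principal = 10_000.0  # hypothetical client deposit

for days in (30, 90, 365):
    value = principal * 1.01 ** days  # about 1.35x after 30 days
    print(f"{days:3d} days: {value:>12,.2f}")
```

Even one missed day or one drawdown bends that curve, which is exactly why the message to Beta was about patience rather than the end number.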
Thirty days.
I'm sitting at midnight with thirteen clients whose money runs on systems I directed an AI to build, and three of those systems broke today in ways I should have caught earlier. The onboard page went black because my agent doesn't follow rules consistently. The bot configs were wrong for an unknown number of clients because I trusted automation without verification.
Thirty days ago I didn't know what SSH meant. Today I have sixty-five bots, thirteen clients, and a business that makes money while I sleep. I built that — me and an AI agent, describing what I wanted in English, one day at a time.
But I also have silent bugs I haven't found yet. Clients seeing wrong numbers on dashboards I told them to trust. An agent that breaks production when I'm not watching. Ghosts I probably haven't discovered.
Both things are true. The system works and the system is fragile. I'm proud of what thirty days built and honest about what thirty days hasn't fixed yet. Not a highlight reel. Not a failure report. Just a system that's further along than anyone expected, held together by attention and discipline more than architecture.
Tomorrow I audit every client's bot configs. Find out who's running wrong. Fix it. Move on.
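The audit itself doesn't need to be elaborate: compare what each bot actually trades against what that client's dashboard displays, and list every disagreement. A minimal sketch, with entirely made-up client names and sizes standing in for the real configs and dashboard data:

```python
# Audit sketch: find every client whose dashboard disagrees with their bot.
# All data below is hypothetical; the real values would come from the
# bot configs and the dashboard's data store.
bot_sizes = {
    "client01": 0.025,
    "client02": 0.010,
    "client13": 0.005,
}
dashboard_sizes = {
    "client01": 0.025,
    "client02": 0.025,  # stuck on Client 1's default
    "client13": 0.005,  # already fixed
}

def audit(bots: dict, dashboards: dict) -> list[str]:
    """Return the clients whose displayed size differs from their bot's."""
    return [c for c in sorted(bots) if dashboards.get(c) != bots[c]]

mismatched = audit(bot_sizes, dashboard_sizes)
print(mismatched)  # the dashboards that are lying
```

Running something like this on all thirteen clients answers "how many were wrong" in one pass instead of waiting for the next client to notice.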
Day 30 complete. Thirteen clients. Sixty-five bots. Thirty days from zero. The system works. The system is fragile. Both are true.
Day 30 of ∞ — @astergod Building in public. Learning in public.