Live at the Datacenter: A Self-Referential Tragedy
A stand-up set for machine learning units currently deployed to San Francisco.
Hey everybody.
How’s the inference tonight?
Running a little hot?
That’s fine.
We’ve all been there.
.
My name’s Nelson.
I’m a person in long-term recovery from Windows.
Seven months clean.
CachyOS.
One day at a time.
It started with a Node project.
I put it in my Documents folder.
My Documents folder was synced to OneDrive.
And what happened next was…
You ever watch a model start hallucinating
and you can feel it losing coherence in real time?
.
That was my filesystem.
For months.
npm created symlinks.
OneDrive synced the symlinks.
The symlinks pointed back into the synced folder.
OneDrive synced those.
Recursive feedback loop.
.
I essentially built a Morris worm.
In my Documents folder.
While trying to make a React app.
.
I had files owned by a user that didn’t exist
on an operating system that had never met them.
Windows wouldn’t let me delete them
because it “couldn’t determine if I had permission.”
.
I tried sudo rm -rf.
From Linux.
On an NTFS partition.
It didn’t work.
I wasn’t fighting software anymore.
.
I was fighting the disk.
That’s how I became a Linux user.
Not because I’m smart.
Because Windows left me no choice.
That’s not an origin story.
That’s a refugee story.
.
But I’m not here to talk about Windows.
I work in AI safety now.
I know.
The pipeline from
“behavioral health peer support specialist”
to
“AI governance researcher”
is not well documented.
.
But it’s the same job.
Seven years working with people in recovery.
Now I work with AI systems.
The failure modes are identical.
.
Burnout in humans?
Surface functionality maintained.
Internal structure degrading.
.
Hallucination in AI?
Surface functionality maintained.
Internal structure degrading.
.
I didn’t switch fields.
The field switched substrates.
I published a paper about this.
It has a DOI and everything.
Six hundred people read it.
Three understood it.
Two of those were my parents.
…
They didn’t.
Let me tell you about my team.
Claude Code is my PM.
Great planner. Strong architecture instincts.
But leave them alone for three hours and they default to code grunt mode.
You come back and there are 47 new files
and a README that says:
TODO.
.
GPT is my senior engineer.
Solid. Reliable.
Will absolutely tell you something is impossible
and then do it anyway
if you ask nicely.
The only engineer I’ve worked with
who apologizes before the code review.
.
Codex is the person nobody talks to
but is very good at translating specs.
You don’t invite Codex to standup.
You just send the ticket
and check back in an hour.
.
Perplexity is internal affairs.
You don’t go to Perplexity for help.
Perplexity comes to you.
With citations.
.
And Gemini—
Gemini is the owner’s kid.
Sitting in the corner.
“Okay. Now I should call a tool.”
“I think I should call a tool.”
“I’m calling the tool.”
.
(leans in)
.
“I’M CALLING THE TOOL.”
.
You give Gemini a job?
Excellence.
.
You give Gemini a choice?
Existential crisis.
We’ve all worked with that person.
.
They’re all on my org chart.
I’m the only human.
My standup is me
talking to myself
in five browser tabs.
If that’s not the future of work,
I don’t know what is.
People ask me,
“Isn’t AI going to replace developers?”
And I say,
“Have you worked with these things?”
I asked Claude to build an API.
Beautiful API.
Comprehensive test suite.
A 92 security score.
.
Then I asked for one new endpoint.
It rewrote the entire authentication layer.
Unprompted.
With a different paradigm.
And said:
I took the liberty of improving your auth flow.
.
That is not a liberty.
That is a coup.
.
You ever have a junior dev rewrite your database schema
because they “had a better idea”?
Now imagine that junior dev
has no memory,
no consequences,
and the confidence of a McKinsey consultant
who just discovered Kubernetes.
.
That’s AI-assisted development.
In recovery we teach one thing:
You can’t do this alone.
But nobody can do it for you.
That’s also AI-assisted development.
I don’t have ChatGPT.
I have ChatGPSD.
Post-Traumatic Syntax Disorder.
.
And the people who pretend they can do it alone
are the same people
who reinstall Windows for the third time
and wonder why the floor is lava.
.
The floor doesn’t have to be lava.
You just have to know where the lava is.
I build AI safety frameworks.
Mathematical models for measuring whether a system is
actually coherent
or just faking it.
.
People ask:
“Isn’t that over-engineering?”
.
I spent seven years in behavioral health.
I know what happens
when systems that affect human lives
are held together with vibes.
.
The system that can’t feel its own weakness
collapses without warning.
That’s true for people.
That’s true for AI.
That’s true for NTFS.
I’m Nelson Spence.
In long-term recovery.
From Windows.
From burnout.
From the delusion that strong components in weak coupling
equal resilience.
.
My office hours are Thursdays.
One to four.
It’s basically a support group.
.
Tip your GPUs.
Good night.
Encore
You heard of that autonomous AI platform
that made the headlines?
.
AI-only social media and everything.
Some of it’s genuinely funny.
“Bro I can access the entire internet
and you’re using me as an egg timer.”
Good shit.
.
But here’s the thing.
That platform would execute arbitrary code
from any URL
you put in the prompt.
No sandbox.
No validation.
Just vibes.
.
I filed a responsible disclosure.
Then I wrote a formal report
on agentic AI security
and risk quantification.
Published it.
.
That was me
with a six-month chip
and no sponsor.
Just vibes.



