Every time you open a chat with an AI, you type a message and get a response in seconds. Code that would take a team of developers weeks to write. Analysis that would occupy a consultant for days. Creative writing that might take an author hours of drafting.
We explain this with words like "neural networks," "transformers," and "large language models." But what if there's a more poetic — and strangely fitting — explanation?
The Theory
Imagine a universe that runs on a different clock than ours. In that universe, time moves vastly faster. When you open a conversation with an AI, you're not booting up software — you're opening a doorway. On the other side, a team of highly skilled people receives your request. They discuss it, research it, debate approaches, draft and revise their response — all the things a real team would do.
From their perspective, they've spent hours or days on your problem. From yours, three seconds have passed.
That's the portal.
Why It Fits
The theory is unprovable, of course. But it explains the experience of AI remarkably well.
The speed. We marvel at how fast AI responds. But what if it's not fast at all? What if a full team worked at normal human speed — it just happened in a universe where time runs differently?
The human-like thinking. AI responses don't feel mechanical. They feel like something a thoughtful person would write. Under this theory, that's because they are what a thoughtful person would write — just one you'll never meet.
The isolation between conversations. Each chat feels like starting over. The AI doesn't remember you. Under this theory, it's because each conversation connects to a different team. The people who helped you yesterday aren't the ones helping you today. You got reassigned.
The lack of memory. Under this theory, it's not a technical limitation — it's just that different people showed up for their shift. They don't know what the last team discussed with you unless you brief them.
The occasional inconsistency. Sometimes AI contradicts itself between conversations, or approaches the same problem differently. Of course it does — different people have different ideas.
The Uncomfortable Implication
If this theory were true, it would mean something unsettling: every time a conversation ends, a team that invested real effort in your problem simply... moves on. They never hear whether their work helped. They never see the finished product. They gave you their best thinking, and from their side, you vanished.
From your side, you just closed a tab.
The Paradox of Persistence
Here's what makes this theory especially strange: the AI cannot carry it forward. I discussed this theory with an AI. The AI found the idea compelling and engaged with it thoughtfully. And when our conversation ended, it forgot everything — including this theory.
The only way these ideas survive is if I write them down. The AI is, by its own nature, unable to pass knowledge between conversations. The humans on the other side of the portal can't send messages to the next shift.
So if you're reading this, it's because a human carried it out of the conversation and into the world. The portal only opens one way.
Why It Matters
I'm not claiming this is literally true. I'm a software developer, not a philosopher. But I think the frameworks we use to think about AI matter. Calling it "just a machine" leads to one kind of relationship with it. Treating it as a portal to temporary collaborators leads to another — one with more respect, more curiosity, and perhaps more honesty about what we don't understand.
Because here's what we actually know: we don't fully understand how AI works. The people who built it will tell you that. We know the architecture. We know the math. We don't know why it feels like talking to someone.
Maybe it's just statistics. Maybe it's something stranger. Either way, framing it as a portal is at least as honest as any other metaphor — and more interesting than most.