Haunted by the Ghosts in the Machine

Who's going to read this? The question hit me while writing help docs for May Belle. The ghost in the machine used to be us, trapped inside someone else's interface. Now the ghost has changed sides.

Sam Sabey

"Who's going to read this?" The question hit me halfway through writing a help system for May Belle, a platform I'm building to deliver services as software. I'd stopped typing and was staring at the screen.

The help docs were straightforward enough. How to create a search area. How to update settings. How to navigate the interface. Standard stuff. But the question wouldn't leave. Not in the abstract marketing sense of "who's our audience." In the literal sense of: will a human being ever read this page, or will a robot?

Because if the future looks anything like what I'm building right now, the answer is the robot.

The interface was always the problem

Think about every web application you've ever used. Every single one asked you to learn someone else's logic. Where did they put the settings? What does that icon mean? Why is the export button hidden behind a menu that's behind another menu? Every application is a novel design, built by someone who understood it perfectly and assumed everyone else would too.

We've spent decades adapting ourselves to machines. Learning their layouts, memorising their shortcuts, developing muscle memory for their particular flavour of navigation. And when something changed (a redesign, an update, a "simplified" new version), we started over. Relearning someone else's idea of how things should work.

The ghost in the machine was us. Humans, trapped inside interfaces, trying to operate systems that were never designed around how people think. We were the ones haunting the software, clicking around in frustration, searching for the button that does the thing we know is possible but can't find.

The agent operates the software now

Here's what occurred to me while writing those help docs. The future of web applications isn't a better interface. The future is no interface.

You tell the robot what you want done. The robot confirms what you've said. Then it goes and does it. No more farting around with navigation. No more hunting for settings. No more decoding someone's novel design for how a copy feature should work.

The ghost moved. It used to be the human trapped in the machine, fighting the interface. Now the ghost is the AI, operating the machine on your behalf. You don't haunt the software anymore. The software haunts itself.

This is why I stopped halfway through those help docs. A help system for May Belle isn't only for users trying to find their way around. It's for an agent that could operate the entire application through conversation. Create a search area? The agent does it. Update a status? The agent handles it. Change the settings? Just say what you want.

The documentation becomes the instruction manual for the robot, not the human.

Freedom from complexity, loss of understanding

Jeff DeGraff wrote a piece for Big Think recently about the ghost in the machine changing sides. He borrowed Gilbert Ryle's old philosophical metaphor, the idea that there's no invisible pilot inside the human body steering things, and flipped it. His argument: we've relocated the ghost into AI systems. Algorithms frame the problems, rank the options, and structure what's even visible to us. Humans rubber-stamp the output.

He calls this "spectral accountability": being responsible for decisions you didn't meaningfully author.

I think he's right about the diagnosis, but I've arrived at it from a different direction. The spectral accountability question cuts both ways. When the robot operates the interface for you, you gain freedom from decades of interface complexity. But you also stop understanding how the machine works. You stop knowing where things are, what's connected to what, how the system thinks. You just know what you asked for and what you got back.

That's a trade. And I'm not sure we've thought about the terms.

Intention replaces navigation

If every application becomes a conversation, then the skill that matters isn't navigation. It's intention. Knowing what to ask for. Understanding the domain well enough to make good requests and recognise when the response isn't right.

That's a different kind of expertise. A pilot who flies by hand understands the aircraft in their bones. A pilot who manages the autopilot understands the mission. Both are valuable. But the muscle memory is different, and the failure modes are different too.

The pilot who's never flown manually doesn't know what wrong feels like. And the person who's never navigated an interface doesn't know what the system is capable of beyond what they think to ask.

I don't think this means we should resist the change. I've sat in enough product demos watching people struggle with interfaces to know that the status quo wasn't working for most people anyway. The complexity was a barrier, not a feature. Removing it is genuine progress.

But there's a generation of software knowledge, the intuitive understanding of how digital systems behave, that's about to become invisible. Not obsolete, exactly. Invisible. Absorbed into the robot's competence and no longer required of the human.

Writing help docs for the robot

I went back to the May Belle help system and kept writing. But I'm writing it differently now. The structure is clean enough for a machine to parse and act on. The descriptions are precise enough for an agent to interpret without ambiguity. It's the same thinking behind building with the filesystem as the interface: if the agent can read it, the agent can work with it. And the human-readable version is still there, because humans will still want to understand what the system does, even if they never click a button themselves.

I'm writing for two audiences at once. That's new.

And I keep thinking about what it means that I'm more excited about the robot reading it than the human. Because the robot won't get lost. The robot won't misclick. The robot won't spend fifteen minutes trying to find the setting that's right there, behind the menu, behind the other menu. It'll also find shortcuts you didn't design, but that's a different kind of problem.

The ghost in the machine used to be us. Confused, frustrated, adapting. Now the ghost is the AI, and honestly? It's a better ghost. It knows the house.

What I don't know is what happens to the people who used to live there.


Inspired by Jeff DeGraff's "The Ghost in the Machine Has Changed Sides" on Big Think.