UX for AIs, Not Just Humans
When we talk about UX, most of us instinctively picture a human with a phone in their hand, scrolling through an app, maybe squinting at a confusing button or sighing when a form autofill goes haywire. For decades, “user experience” has been shorthand for “human experience.” But here’s the twist: not all your users are actually human anymore. Some are bots, some are agents, some are scrapers, and increasingly, some are AI-driven systems that interact with your frontends the way a person would. And if you’re still only designing for flesh-and-blood eyeballs, you might be missing the fact that the machines are becoming a very real audience.
Think of it like this. You wouldn’t design a theme park with rides that only adults can use while completely ignoring kids. But that’s pretty much how a lot of modern sites treat their machine users: they exist, they matter, but they’re often stumbling through half-broken paths or misreading signals that were only ever intended for human interpretation. It worked fine when bots were secondary, just indexing your pages or scraping product listings. But today, with the rise of autonomous agents, LLM-powered crawlers, and even APIs that pretend to be people, those non-human users are actually front row in your experience design.
So let’s unpack what it means to build UX for AIs, not just humans.
The first thing to realize is that a web page is no longer just a sheet of pixels rendered in someone’s browser. It’s also a structured, semi-structured, or sometimes downright chaotic set of signals that other systems try to parse. That could be Google’s crawler, a screen reader, or an AI agent booking a hotel room on behalf of its human. In other words, there’s a hidden second audience baked into every interface. And just like with any audience, clarity matters.
Take something like semantic HTML. To a human, a `<div>` styled to look like a button and a real `<button>` are indistinguishable: same pixels, same click. To a machine, they’re worlds apart. The native element announces its role, its state, and how to activate it; the styled `<div>` is an anonymous box that a parser has to guess about.
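To make that concrete, here’s a minimal Python sketch, using only the standard library’s `html.parser`, of what a naive machine reader might do when it looks for something to click. The markup strings and the heuristic are invented for illustration; real agents use richer accessibility trees, but the asymmetry is the same:

```python
from html.parser import HTMLParser

# Two ways to render the same visual "button". A human sees identical
# pixels; a parser sees very different signals.
DIV_SOUP = '<div class="btn primary" onclick="buy()">Buy now</div>'
SEMANTIC = '<button type="submit">Buy now</button>'

class InteractiveFinder(HTMLParser):
    """Collects elements a machine can confidently treat as actionable."""

    # Natively interactive HTML tags.
    ACTIONABLE = {"button", "a", "input", "select", "textarea"}

    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Accept native interactive tags, or anything that explicitly
        # declares an ARIA button role.
        if tag in self.ACTIONABLE or attrs.get("role") == "button":
            self.found.append(tag)

def actionable_elements(html: str) -> list[str]:
    finder = InteractiveFinder()
    finder.feed(html)
    return finder.found

print(actionable_elements(DIV_SOUP))   # [] -- invisible to the machine
print(actionable_elements(SEMANTIC))   # ['button']
```

Note the escape hatch: adding `role="button"` to the `<div>` would make it visible to this reader too, which is exactly what ARIA is for.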
I ran into this recently with an airline booking site. From a human perspective, the interface was… fine. A little clunky, but doable. But I was also testing it with an AI agent hooked up to a browser automation tool. The agent kept getting stuck at the payment step because the “submit” button was hidden under three nested `<div>`s with no role, no label, and no semantic hint that it was clickable. A human eye finds the big blue rectangle in half a second. The agent, reading the DOM, saw nothing but anonymous containers.
There’s also the rise of what I’d call synthetic browsing. Instead of me opening ten tabs, an AI might explore those pages, summarize them, and come back with recommendations. This isn’t science fiction, it’s already happening with agents like Perplexity or specialized research bots. And the way you design your frontend will dictate how well those agents “experience” your site. Structured data, clean markup, predictable flows: these are no longer just good for SEO, they’re good for what we might call AI-X: machine experience.
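As a rough sketch of what “structured data” means in practice, here’s how a page could emit a schema.org Product block as JSON-LD. The product fields and the helper name are made up for illustration; the real convention is to embed the output in a `<script type="application/ld+json">` tag so a crawler reads the price directly instead of reverse-engineering it out of rendered markup:

```python
import json

# Hypothetical product record; the field names are illustrative.
product = {
    "name": "Window Seat, Flight TK102",
    "price": "129.00",
    "currency": "EUR",
}

def product_jsonld(p: dict) -> str:
    """Build a schema.org Product/Offer block as JSON-LD text."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": p["name"],
        "offers": {
            "@type": "Offer",
            "price": p["price"],
            "priceCurrency": p["currency"],
        },
    }
    return json.dumps(data, indent=2)

# The page would embed this inside:
# <script type="application/ld+json"> ... </script>
print(product_jsonld(product))
```

The same information is already on the page for humans; JSON-LD just restates it in a form a machine can trust without guessing.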
Now, before we get too far into the weeds, let’s make a quick comparison. Designing for machine UX is kind of like writing subtitles for a film. Most people can hear the dialogue just fine. But if the subtitles are sloppy, mistranslated, or missing altogether, the experience is ruined for the deaf audience, the distracted parent watching on mute, or the kid streaming it in a noisy café. Subtitles aren’t the main experience, but they massively expand who can actually enjoy the content. In the same way, semantic structure, metadata, and predictable design patterns act as subtitles for your interfaces, making them legible not just to humans, but to the growing audience of machines.
There’s also a deeper philosophical question here: if AIs are our new users, how much should we care about their experience? After all, they don’t have emotions. They won’t rage quit your checkout flow because the button is ugly. But they will fail. And their failure directly impacts humans. If your interface can’t be parsed by an AI, your business might not make the shortlist when someone’s digital assistant is shopping around. If your product page isn’t structured, maybe the AI skips over it in favor of a competitor’s cleaner setup. The machines might not leave angry Yelp reviews, but they’ll quietly redirect traffic and money somewhere else.
This is where I think a lot of teams are stuck in a 2010 mindset. They’re optimizing purely for human conversion rates, ignoring the silent middlemen who increasingly control the funnel. It’s like optimizing a store for in-person shoppers while forgetting that 80 percent of your sales actually come from delivery apps. Sure, the décor matters, but if your menu is unreadable to DoorDash, you’ve lost the game before the customer even sees your dining room.
Of course, none of this means you should turn your website into a sterile machine-only feed. We’ve all seen what happens when design panders exclusively to algorithms. Social media feeds turned into engagement bait. News sites stuffed with keywords. Nobody wants that. The real challenge is dual empathy: can you design an experience that feels intuitive to humans and legible to machines at the same time?
That’s where patterns like ARIA roles, schema.org structured data, and consistent navigation come in. Not sexy. Not flashy. But they’re the connective tissue of multi-user UX. To a human, a breadcrumb trail might just look like a handy navigation tool. To an AI, it’s a roadmap of how your site fits together. To a human, an alt attribute is a minor accessibility detail. To an AI, it’s a hint that helps it understand what’s on the page. These small details add up to a machine experience that is smooth, not jagged.
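The nice thing about these details is that they’re mechanical enough to lint for. Below is a deliberately tiny, illustrative Python checker, two rules only, invented for this article; real audit tools such as axe-core cover far more. It flags two of the machine-legibility gaps mentioned above:

```python
from html.parser import HTMLParser

class MachineLegibilityLint(HTMLParser):
    """Toy lint for signals a machine reader will miss.

    Only two rules, as an illustration:
      1. <img> elements missing the alt attribute entirely.
      2. Clickable <div>s that declare no ARIA role.
    """

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img without alt attribute")
        if tag == "div" and "onclick" in attrs and "role" not in attrs:
            self.issues.append("clickable div without a role")

def lint(html: str) -> list[str]:
    linter = MachineLegibilityLint()
    linter.feed(html)
    return linter.issues

page = '<img src="hero.jpg"><div onclick="go()">Next</div>'
print(lint(page))  # ['img without alt attribute', 'clickable div without a role']
```

An empty `alt=""` passes here on purpose: for purely decorative images, an explicitly empty alt is itself a deliberate signal to machines, which is the whole point.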
And this is going to matter more and more as AIs stop just reading interfaces and start acting inside them. Think about a customer service agent that logs into a bank portal to make changes on behalf of a client. If that portal is built with human-only signals, the agent will constantly need help. But if it’s designed with machine clarity, suddenly you’ve built an ecosystem where human and AI collaboration is seamless.
There’s also a cultural angle here. For years, UX designers have prided themselves on empathy. “Walk in the shoes of your user,” the mantra goes. But what does empathy look like when your user doesn’t have shoes? Or feet? Or a body at all? In this context, empathy isn’t about imagining feelings, it’s about imagining constraints. Machines don’t get sarcasm, they don’t guess, they don’t infer. They rely on structure, clarity, and explicit signals. Designing for them is empathy of a different kind. Less about emotion, more about cognition.
This actually reminds me of how sci-fi often handles alien communication. Think of “Arrival,” where humans and aliens struggled to create a shared language that could bridge completely different ways of thinking. That’s kind of what we’re doing now with AIs. We’re building interfaces that act as translation layers, making our messy, human-friendly systems legible to a completely different kind of intelligence. And just like in the movie, the effort isn’t just about politeness, it’s about survival. If you fail to communicate, you lose the relationship.
I want to stress that this isn’t some distant-future speculation. It’s already here. Google’s crawler is already a primary audience. Accessibility tech is already parsing your layouts. Agents that can browse and act are already in the wild. We’ve entered an era where designing only for humans is not enough. And the companies that adapt to this will quietly, but massively, benefit. Because when the assistant in someone’s pocket decides which restaurant to suggest, which store to recommend, or which bank app to trust, the ones that are machine-friendly will naturally rise to the top.
So next time you’re debating whether that alt text is worth writing, or whether semantic HTML is just pedantic, remember: you’re not just coding for the person with the laptop. You’re coding for the bots that will increasingly act as that person’s first line of experience. And if those bots can’t parse your flow, your beautifully designed human interface might never even get a chance to shine.
In short, UX is no longer just “user experience.” It’s “users’ experiences,” plural. Some of them have eyes and feelings. Some of them have parsing engines and token limits. Both deserve your attention. And if you can manage to delight humans while being legible to machines, you’re not just future-proofing your design, you’re actually acknowledging reality: that we already share the web with other intelligences.
So, design accordingly. Because your next user might not be a person at all.