David Budnick

SR. PRODUCT DESIGNER



If AI can design the interface, what is left for the designer to decide?

WRITING DETAILS

TYPE

Design theory / conceptual essay

DATE

March 2026

THEMES

AI, authorship, design value


Product design has long been focused on the screen, and the work itself revolved around what we as designers produced: tangible outputs that developers could build from. Design was visible, living in artifacts from wireframes all the way to polished interfaces. It was something you could point to and hand off, clearly imprinting your contribution on the process. But over the last few years, this space has become saturated, not only with designers, but with the patterns themselves. Dashboards, onboarding flows, CRUD systems, settings panels, all recycled until the work starts to feel like a variation of something you've seen before.



With the rapid introduction of AI into the design and development process, something more fundamental is shifting. If AI can generate these artifacts, competently and almost instantaneously, then the value of producing them begins to erode. The center of gravity shifts away from human-led execution. Design itself isn't less important, but we aren't scarce anymore. The ability to make has been accelerated and commoditized.



When I was first introduced to UX, the process was linear and familiar: idea to wireframe, to UI, to interactive prototype, a formula we all came to recognize and learned to trust. Each stage carried weight and felt essential, none of it skippable, leading us to something we could confidently call "good design." But today, that sequence is collapsing. A single prompt can now generate structured layouts, suggest workflows, define architecture, write copy in any tone imaginable, and simulate scarily realistic interactions. The distance between idea and execution has nearly disappeared. So the question becomes unavoidable: if everyone can make something, what is worth building?



In my most recent project at AIO, we spent 13 months building and launching an MVP. If I look back at the way we worked at the beginning of that process, it is already unrecognizable. Tools like ChatGPT, Claude, Copilot, and Cursor have become embedded in our daily workflow across all facets: design, development, product. That certainly wasn't the case a year ago, and it's hard to believe it will look the same a year from now. So as we begin thinking about Version 2, post-MVP, something has changed. Features are no longer the bottleneck. They're cheap to create and just as easy to discard. The effort that once went into producing them has shifted elsewhere. The conversations I find myself in more frequently don't revolve around screens or flows; they revolve around direction. "What system are we building?" "What should this product evolve toward?" "What are the underlying pathways that connect everything together?" And in these conversations, instead of being asked to execute, I'm being asked to participate in defining.



There is something quite disorienting about this shift. The screens and prototypes that once defined our value as designers are no longer ours in the same way. They can be generated and iterated without us, and so the labor becomes less visible. Where design once lived in what we made, it now begins to live in what we decide. We can't pretend this is how we've always worked. A few years ago, I was not a systems-thinking designer. I was handed a Business Requirements Document and asked to follow patterns. The problems were laid out for me, and my role was to solve them cleanly and efficiently. That role is dissolving. We're moving away from solving problems and toward defining them more precisely, because AI is remarkably effective at addressing well-defined challenges. If you know how to ask, it will generate answers endlessly, across varying tones and perspectives. But it cannot (at least not yet) determine which problems actually matter, or which tradeoffs are acceptable. It cannot decide which direction aligns with a long-term vision; that responsibility remains exclusively human. Designers, then, become essential in shaping the questions themselves rather than producing polished answers. They come to define constraints and navigate ambiguity to choose what not to build.



Even the way we understand users is changing. Users aren't any less important; we're just further from them than we used to be. Behavioral data and AI simulations create a sense of understanding at scale, so direct contact becomes much more rare. The user becomes partially abstracted, represented through metrics and modeled behaviors. Designers are now asked to operate across both realities: the abstracted user of the data and the lived experience of actual people.



But there's a deeper shift happening here: the product is no longer defined solely by the interface. The work has moved to deciding what gets automated, what gets brought forward, what stays out of view, and what takes priority. The interface can be generated; the logic behind it cannot. And so design becomes the architecture of how these choices unfold over time, with the interface acting as the expression of those decisions.



In some ways, this actually brings us back to something familiar. Before anything is generated, it must first be decided, and that decision lives in conversation. Conversation that is messy and undeniably human. Because AI can handle execution, alignment becomes the real work, and the quality of what is produced is determined by the quality of what is understood beforehand. Design becomes more verbal and more conceptual: a return, in some sense, to pre-digital design thinking, but now with infinite execution on demand. And when execution becomes effortless, intention becomes everything.
