
Vibe Coding and Vibe Design

By Jakob Nielsen
Summary: AI transforms software development and UX design through natural language intent specification. This shift accelerates prototyping, broadens participation, and redefines roles in product creation. Human expertise remains essential for understanding user needs and ensuring quality outcomes, balancing technological innovation with professional insight.

 

Vibe coding is an AI-assisted approach to programming where developers describe what they want in natural language and let AI generate most of the code. Instead of writing every line of syntax, a vibe coder “fully gives in to the vibes” of an AI assistant, outsourcing the detailed implementation to the AI. [The term “vibe coding” was coined by Andrej Karpathy in February 2025 in the post I linked from my definition. In Karpathy’s words, “It’s not really coding — I just see stuff, say stuff, run stuff, and copy-paste stuff, and it mostly works.”]

In vibe coding, the human focuses on what the software should do, while the AI figures out how to do it in code​. This high-level, conversational style of development has been described as “sorcery” for non-programmers, since it allows even amateurs to produce working software with minimal manual coding.


Software development becomes intent-based outcome specification, like many other AI interactions. Say what you want and the AI figures out how to implement it and hands you the code. (Imagen)


Some analysts feel that vibe coding is simply the next evolution of low-code platforms, which essentially allow creators to build software by describing an idea instead of using visual tools or writing code​. In this view, English (or other human language) becomes the equivalent of a programming language​. A more radical interpretation frames vibe coding as software that adapts to user vibes or emotions: one article describes it as applications leveraging AI and sentiment analysis to adjust their behavior and UI based on user mood or preferences​. In that scenario, vibe coding isn’t about how code is written, but about creating emotion-aware software that personalizes the experience (an unusual angle that diverges from Karpathy’s original meaning).


To me, vibe coding is more than a traditional low-code developer tool, which is still driven by human programming. But it’s not pure emoting either. The person who is “vibing” still has to articulate the goals of the software project: the “what.” What’s new is the level of abstraction (programming by prompt), which is why vibe coding is being hailed as a new era in software development. Thus, I view vibe coding as a way to describe intentions at a high level and delegate the writing of working code to AI tools. In this view, vibe coding is one more example of AI being intent-based outcome specification, as I said as far back as 2023. (Yes, 2023 is the Bronze Age equivalent in AI terms.)


​The Bronze Age in Greece lasted 1900 years. In contrast, the Bronze Age of AI lasted two years, from the release of ChatGPT 4 as the first good AI until today, when we’re getting the next generation of highly capable AI. In my analogy, even Deep Research is but an Iron Age sword, and better is yet to come. (Midjourney)


Vibe coding is already making an immense impact. A recent video by the leadership of Y Combinator (the leading startup investor in Silicon Valley) mentioned that among recently funded startups, the quarter most committed to AI now reports that AI writes 95% of their code.


Vibe Coding Changes Product Development

Vibe coding drastically lowers the barrier to creating software: just having an idea is enough to get a basic tool working, achieving in hours (with AI guidance) what might otherwise have taken days of learning and coding. This makes software creation more about high-level idea expression and less about the mechanics of coding. The likely implications for product development include the following.


Speed and agility: Teams can spin up prototypes or new features with unprecedented speed. Instead of waiting on engineering backlogs, a product manager or designer could prompt an AI to create a functional mock-up of an app in hours. This accelerates the prototyping phase dramatically, encouraging rapid experimentation. Early adopters report being “only a few prompts away from a product” if they have a clear idea​. Faster iteration means more user feedback cycles can be incorporated in the same timeframe, leading to better design.


Broader participation in development: Because natural language is more accessible than code, vibe coding opens the door for non-engineers to contribute directly. Domain experts without coding ability can use it to create simple apps addressing their personal needs​. For product development, this means brainstorming is no longer siloed since anyone with domain knowledge can prototype a solution without first translating their ideas to a developer. In essence, vibe coding acts as a bridge between ideation and implementation, letting product ideas bloom directly into software.


Productivity for engineers: Experienced developers may benefit the most from vibe coding. By offloading boilerplate and rote programming tasks to AI, developers can focus on higher-level architecture, creative problem solving, and fine-tuning the product. As one blog noted, vibe coding lets engineers spend less time wrestling with minutiae (like syntax errors or wiring up basic CRUD operations) and more time imagining “what to build next,” focusing on the creative element of development​.


From MVPs to “software for one”: With easier prototyping, the nature of products might shift. We may see an explosion of niche, highly personalized applications: one-off tools crafted by individuals for their own use (or by small teams for a very specific audience)​. Since it takes much less effort to create a basic app, the threshold for what ideas are worth implementing is lower. Product development could become more experimental and user-driven, as users themselves build what they need. Kevin Roose dubbed these AI-built personal apps “software for one”, and while they might be simple or imperfect, the sheer increase in quantity of software could yield new solutions that a formal product team might never have attempted​. Something that’s useful for that one person might be useful for more people and is probably “minimum viable” by the very fact that the one person did find it useful. Some of these products will presumably be further developed for sale to a broader audience. Much as the advent of spreadsheets in the 1980s enabled non-programmers to build useful tools (models, calculators, etc.), widespread vibe coding could empower professionals from all fields to develop custom apps. The “citizen developer” trend would accelerate. In the future, having a good idea or domain expertise could be enough to launch a software product, even without a dedicated dev team.


Faster product cycles, higher competition: If the time and cost to build a Minimum Viable Product drop dramatically, the pace of product development will increase. Startups can build and iterate more quickly, and incumbents must respond faster. We may see hyper-rapid prototyping become the norm, with products released in “beta” within days of conception, then improved continuously via AI-generated updates. This could intensify competition, as barriers to entry for new software products are lowered. On the flip side, it may become harder for any single product to stand out, since features alone will be easily copied and auto-generated by competitors with AI. The true differentiation might shift to things like brand, data, and user experience when the basic functionality becomes cheap to replicate. Companies that leverage vibe coding effectively could outpace those that don’t, leading to an “AI adoption gap,” with legacy companies falling further behind.


These changes were summed up in the Y Combinator video as changing the job of a software engineer to that of “product engineer,” assuming most of the duties formerly assigned to product managers.



The passing of the digital torch: software engineers become product engineers, taking over the duties of product managers as AI does most of the coding. (Imagen)


Does Experience Still Count?

The Y Combinator investors also mentioned that the most aggressive use of vibe coding among the startups they have funded was seen when the founders were so young that they had never been exposed to traditional computer science education: these youngsters have never known anything except AI-supported software development, so they rely on it fully in their startups.


Traditional research has found that professional abilities peak around the age of 40, since they combine fluid intelligence (which peaks at age 20 and then declines with the biological decay of the human brain) and crystallized intelligence (which only goes up the more you learn). From 20 to 40, the growth in crystallized intelligence outpaces the decline in fluid intelligence, and people can accomplish more every year. From 40 to 50, abilities decline slightly since there’s not that much new to learn (and the biological brain decay proceeds apace).


This model of professional skills has been confirmed empirically over and over again, but that was in the past. What if AI changes the world so profoundly that past learning becomes irrelevant? Then, the 20-year-olds will be at the peak of professional performance, and anybody older will have a decayed brain without the benefit of any useful experience.


I doubt that this revised professional growth model will turn out to be true. (Admittedly, I’m old, so I’m biased, but I have seen for myself how AI use can increase the creativity of older intellectuals by compensating for their declining fluid intelligence.) True, any experience that keeps you from maximum use of AI will reduce your professional abilities. But that’s only a small corner of a product professional’s crystallized intelligence. Most of our hard-won experience will still be needed to answer that all-important “what” question.


Just because you can say what you want doesn’t free you from figuring out what to say. The same is true for vibe coding and vibe design: determining the best thing to ask for becomes more important than ever. (Midjourney)


What Stays the Same?

As just mentioned, experience and seniority will still be valuable as companies turn to vibe coding. Many other aspects of traditional product development will also remain unchanged or only change marginally.


High-level system design, data modeling, and performance planning are still critical. An AI might generate code to meet a prompt, but it won’t automatically produce a sound architecture for a complex system.


The Y Combinator video also stated that current vibe coding tools are poor at debugging. Since software continues to have mistakes, humans are needed for debugging. However, this may change with the next generation of reasoning models (like o3 or the forthcoming GPT-5), which are supposedly better at debugging, even though they have not yet been integrated into software development platforms.


Also, as one of the Y Combinator speakers said, some startups have partly abandoned debugging in favor of simply regenerating any code that misbehaves. When AI can make a thousand lines of code in a minute, it’s faster to make a new program than to find the error in the old one. (This, of course, assumes that it’s easy to test whether the flaw remains and that newly-generated code is right more often than wrong. I wouldn’t abandon debugging yet, though I expect it to mostly be done by reasoning models in a year or two.)
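To make the regenerate-instead-of-debug workflow concrete, here is a minimal sketch. All function names are invented for illustration, and `generate_code` is a local stand-in that simulates an AI producing one buggy draft followed by a correct one; a real setup would call a code-generation model instead.

```python
def generate_code(prompt: str, attempt: int) -> str:
    """Hypothetical stand-in for an AI code generator.
    Simulates a buggy first draft, then a correct one."""
    if attempt == 0:
        return "def add(a, b):\n    return a - b"  # simulated buggy draft
    return "def add(a, b):\n    return a + b"      # simulated good draft

def passes_acceptance_test(source: str) -> bool:
    """Cheap go/no-go check. The regenerate-don't-debug strategy
    only makes sense when correctness is this easy to verify."""
    namespace: dict = {}
    exec(source, namespace)
    return namespace["add"](2, 3) == 5

def regenerate_until_green(prompt: str, max_attempts: int = 5) -> str:
    """Instead of locating the bug, discard the code and ask the
    AI for a fresh version until the acceptance test passes."""
    for attempt in range(max_attempts):
        candidate = generate_code(prompt, attempt)
        if passes_acceptance_test(candidate):
            return candidate
    raise RuntimeError("no passing candidate within the attempt budget")

working_code = regenerate_until_green("write an add(a, b) function")
```

The key precondition, as noted above, is the acceptance test: if checking whether the flaw remains is cheap, regeneration can beat debugging; if it isn’t, this loop just launders bugs.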


Most important for my design readership, building the right product is still a human responsibility, in terms of understanding user needs, prioritizing features, and crafting a great user experience. Vibe coding can execute instructions, but deciding what the software should do and why is not automated. Product managers and designers must still do user research, market analysis, and creative brainstorming. In that sense, vibe coding changes the implementation phase more than the planning phase of the product lifecycle. The need for clear specifications is as important as ever (arguably more important, since an ambiguous request to an AI can lead it astray). So, effective communication and problem decomposition skills remain key for teams using vibe coding. They just communicate in natural language and examples instead of exclusively via technical specs.


AI becomes king of implementation, but humans will still need to set the directions for building the right product. (Leonardo)


Vibe Design

One ironic prospect of making coding easier is that design and UX become even more critical. When anyone can churn out a functional app, simply having working features is no longer a differentiator: the quality of the experience will set products apart​. In a vibe coding future, companies will hopefully invest more in understanding user needs, refining the interface, and polishing details that delight users, because those are harder for AI to get right without guidance. This leads directly into the concept of vibe design.

Vibe design applies similar AI-assisted principles to UX design and user research, by focusing on high-level intent while delegating execution to AI. The following will likely be the main components of vibe design:


Design by feel, not by pixels: Traditional digital design involves painstakingly crafting static screens or wireframes (in tools like Figma), worrying about exact pixel placements and style guides. In contrast, vibe design emphasizes describing the desired feeling or outcome of a design, and letting AI propose the visual or interactive solutions​. Rather than manually drawing every element, a designer might say to an AI tool, “The interface feels a bit too formal; make it more playful and engaging,” and the AI could suggest color changes, typography tweaks, or animation accents to achieve that vibe. This is analogous to vibe coding’s natural language prompts, except the AI’s output is a design mockup or updated UI style instead of code.
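As a toy sketch of how a vibe request might cash out into concrete design changes: the lookup table below stands in for the AI’s suggestion engine, and every token name and value is invented for illustration. A real vibe-design tool would have a model propose the changes from the prompt rather than consult a fixed table.

```python
# Invented design tokens for illustration; a real tool would have
# an AI model propose these changes from the natural-language prompt.
VIBE_TOKENS = {
    "formal":  {"font": "Georgia", "corner_radius": 0,  "accent": "#1a2a44"},
    "playful": {"font": "Nunito",  "corner_radius": 12, "accent": "#ff6f61"},
}

def restyle(current_ui: dict, vibe: str) -> dict:
    """Apply the tokens for the requested vibe, preserving any
    properties the vibe does not mention (here: spacing)."""
    updated = dict(current_ui)
    updated.update(VIBE_TOKENS[vibe])
    return updated

ui = {"font": "Georgia", "corner_radius": 0, "accent": "#1a2a44", "spacing": 8}
playful_ui = restyle(ui, "playful")  # "make it more playful and engaging"
```

The point of the sketch is the interface, not the table: the designer specifies a feeling, and the system translates it into specific, reviewable changes to fonts, radii, and colors.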


Rapid prototyping and interactive flows: Vibe design shifts the focus from static design deliverables to interactive prototypes. Designers leveraging AI can quickly generate working UI components and flows that are easy to communicate to stakeholders and test with users. These prototypes can be hooked into real data or basic code (often produced by the same or a companion AI). The result is that design and development start to overlap, so a vibe designer might produce a mini web app that is the prototype. Kshitij Agrawal summarized this approach as “mockups existed because coding was difficult; now that coding is easy, the future is prototypes.”​ Essentially, vibe coding makes it trivial to create front-end code, so designers can skip static mockups and go straight to building a UI that works. They can then refine the UX by interacting with it, focusing on the overall feel and flow. This makes user testing more realistic because participants can try a functional interface early on. The design process becomes more iterative and dynamic, with AI enabling quick changes: for instance, a designer can ask the AI to adjust spacing, layout, or component behavior on the fly between user testing sessions.


Merging of design and development roles: The line between designer and developer blurs in the vibe design paradigm. Designers become more like design engineers, using AI tools to add not just visual styles but also basic functionality to UI components without programming skills​. It’s akin to using a game engine interface where you drag, drop, and assign behaviors, potentially all assisted by AI prompts​. This empowers designers to directly create UI components that developers can plug into the product. In other words, the handoff between design and development becomes much smoother, with the deliverable changing from a spec document or sketch into actual ready-to-use elements. For user researchers, this blending means they can get functional prototypes to test much faster, and they can iterate on feedback by simply describing new ideas to the AI. The UX researcher might say “users are confused on this screen, can we simplify it?” and the AI can rapidly attempt a simplified layout. Both roles converge on a shared process of trial, feedback, and refinement with AI shortening the loop.


Continuous user research powered by AI: Vibe design also touches user research in how insights are gathered and applied. With AI able to generate variants of designs quickly, teams can A/B test different “vibes” of an interface with real users and immediately tweak based on results. Furthermore, AI can assist in analyzing qualitative feedback or usability test recordings. For example, an AI can summarize insights from days of user testing or the sentiment from hundreds of user interviews in minutes. Designers can modify these insights into prompt requests and get immediate design adjustments. This closed-loop of AI-assisted research and design could greatly accelerate the user-centered design process. In essence, vibe design could make UX research more efficient by automating the grunt work of aggregation and by providing quick implementations to test hypotheses. The human still must ask the right questions and interpret the results, but the path from identifying a UX problem to seeing a redesigned solution can be much faster.
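A minimal sketch of the “grunt work of aggregation” being automated: the function below tallies sentiment across interview comments with a crude keyword list, standing in for the LLM a real pipeline would use. The word lists and sample comments are invented for illustration.

```python
from collections import Counter

# Toy sentiment lexicons; a real pipeline would use an LLM, not word lists.
POSITIVE = {"love", "easy", "clear", "fast"}
NEGATIVE = {"confused", "slow", "cluttered", "lost"}

def summarize_feedback(comments: list[str]) -> dict:
    """Tally sentiment words per comment and surface the most
    frequent complaint term across all comments."""
    tally = Counter()
    complaints = Counter()
    for comment in comments:
        words = {w.strip(".,!?").lower() for w in comment.split()}
        tally["positive"] += len(words & POSITIVE)
        tally["negative"] += len(words & NEGATIVE)
        complaints.update(words & NEGATIVE)
    top = complaints.most_common(1)
    return {"sentiment": dict(tally), "top_issue": top[0][0] if top else None}

report = summarize_feedback([
    "I love how fast the search is",
    "I got confused on the checkout screen",
    "The checkout screen left me confused and lost",
])
```

Even this crude version shows the shape of the loop: feedback goes in, a ranked issue comes out, and the designer can turn “confused” into a redesign prompt for the AI within minutes of the sessions ending.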


Empowering and scaling design expertise: Just as vibe coding democratizes coding, vibe design aims to democratize design. Not everyone is a trained designer, but with AI help, more people can apply a “design mindset” to their products​​. AIverse has floated the concept of “Design Intelligence” to describe AI systems encapsulating good design principles that anyone can leverage​. For instance, a non-designer startup founder could use an AI tool to generate a reasonably well-designed UI by simply specifying the vibe (“I want it to feel friendly and modern, with an intuitive flow”). The AI, having been trained on vast design knowledge, would produce a design that adheres to UX best practices out-of-the-box.


These changes don’t eliminate the role of designers. If anything, vibe design raises the bar for design quality and frees designers to tackle more complex experiential challenges. In a future where vibe design is common, every product team member might incorporate some UX thinking into their work (because the AI makes it easy to do so), and truly egregious design mistakes might become rarer (because the AI can warn against them or auto-correct them). Designers, on the other hand, might focus on crafting the signature aesthetic of a product line and the emotional experience of individual products, using AI as a collaborator to explore a wide range of creative options quickly.


Vibe design extends the ethos of vibe coding into the realm of UX: it’s about leveraging AI to handle the detailed execution (whether that’s drawing UI components or parsing user data) so that humans can focus on the creative and empathetic aspects — understanding the user, crafting the right experience, and refining the product’s “vibes.” Just as vibe coding has shown that having an idea can be enough to start programming, vibe design suggests that having a vision for the user experience can be enough to start designing. Both paradigms point toward a future where cross-functional product teams work in tandem with AI co-creators. The practical result could be a faster, more fluid product development cycle from concept to code to user feedback, with less friction between each step.


Ultimately, the combination of vibe coding and vibe design could make product development more human-centered by letting humans concentrate on the creative vision and user value, while delegating the tedious implementation (whether of code or design) to our increasingly capable AI coworkers.


AI-powered software development, user research, and UX design are revolutionary. Focus on users, not grunt work — let machines handle the details. Stay sharp: human judgment trumps all. Embrace these tools to boost your craft, not bypass it.


Quick Explainers

I made two short videos to explain this article to people who don’t have time to read the full thing (or who want an alternate take).

Both videos were made with Humva, which is extremely easy to use. I don’t like the results as much as my videos made with HeyGen and Kling, but I spent significantly less time producing these videos, and there’s something to be said for that aspect of usability.
