The AI Design System Trap: More Components, Less Coherence

The design system took two years to build. The AI generated 800 components in a week. Three months later, nobody is sure what the system is anymore.
This isn't a hypothetical. It's the pattern showing up across teams that integrated AI generation into mature design systems in 2025. The components look right. The tokens are applied correctly. But the judgment — the accumulated decisions about when to use something, what problem it solves, where it fits in the broader system — that judgment doesn't transfer. And without it, the system loses coherence faster than it gains scale.
What AI Actually Generates (And What It Doesn't)
AI design tools are genuinely good at pattern replication. Given examples of cards, modals, form fields, and navigation components, they can produce new variants that match the visual grammar of your system. Colors, spacing, type ramp — they apply these correctly.
What they can't extract from visual examples is intent.
A card component in a well-designed system carries invisible metadata: it's for displaying an actionable item in a browse context, not for wrapping informational content inside a form. A modal is for interruption-justified confirmation, not for secondary content that belongs in a drawer. These distinctions live in documentation, in design critique sessions, in Slack threads from three years ago. They're not in the Figma file. They're not in the token names.
Brad Frost drew this distinction sharply in his AI and Design Systems piece: AI is good at the "what" of components — the appearance, the variants, the composition — but the "why" requires documentation that most design systems don't have. Not because designers don't know why; they do. It's that writing that knowledge down felt like a lower priority than shipping the next component.
Now it isn't.
The Semantic Gap
Most design systems document components at the level of properties and states. Here's the component. Here are its variants. Here's the API. This is enough for a human designer with context to use it correctly. It's not enough for an AI tool to generate components correctly, because the AI doesn't have the context — it has the documentation.
The gap is what researchers in AI-assisted design are starting to call the semantic layer: the metadata that describes purpose, usage constraints, anti-patterns, and relationships between components. Not "here are the props" but "use this when the user needs to take an action on a discrete item in a list — not when you're wrapping a block of content."
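One way to make that layer machine-readable is a structured intent record per component. A minimal sketch in TypeScript — the interface and field names here are illustrative, not an existing standard or tool:

```typescript
// A hypothetical per-component record for the "semantic layer": purpose,
// usage constraints, anti-patterns, and relationships made explicit enough
// for a generation tool (or a new designer) to consume.
interface ComponentIntent {
  name: string;
  purpose: string;        // the problem this component solves
  useWhen: string[];      // contexts where it is the right choice
  avoidWhen: string[];    // documented anti-patterns
  confusedWith: string[]; // components it is often mistaken for
  relatedTo: string[];    // components it composes with or replaces
}

// Example entry, paraphrasing the card described earlier.
const cardIntent: ComponentIntent = {
  name: "Card",
  purpose: "Display an actionable item in a browse context",
  useWhen: ["user needs to act on a discrete item in a list or grid"],
  avoidWhen: ["wrapping informational content inside a form"],
  confusedWith: ["ListItem", "Panel"],
  relatedTo: ["CardGrid", "ListItem"],
};
```

The value is less in the exact schema than in the fact that "not when you're wrapping a block of content" becomes data a tool can check against, rather than a sentence buried in a doc site.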
Teams building for AI generation are discovering that this layer, which was always useful and always underinvested in, has become load-bearing. Without it, AI-generated components are probabilistically correct — they match patterns they've seen — rather than intentionally correct. At 10 components, the difference is manageable. At 800, it's a design system with a coherence problem.
The Builder.io team has documented this in their AI design systems guide: AI generation requires "documentation that goes beyond visual specs to include purpose, context, and anti-patterns." The teams that see the best AI-generated output are not the teams with the largest component libraries — they're the teams with the most thorough intent documentation.
The Governance Problem Nobody Planned For
Even if your documentation is strong, AI-generated components at scale create a review problem that most design teams aren't staffed for.
A senior designer can meaningfully review 20 to 30 new components in a week. At 800 components, that's roughly a 30-week review backlog before you've shipped a single one. QA engineers aren't trained to evaluate design intent. Developers can check implementation fidelity but not whether a new modal variant belongs in the system at all. The person with the judgment is the designer, and the designer is now the bottleneck to a system that was supposed to move faster.
Teams that have tried to resolve this by skipping review discover the problem downstream: components that duplicate existing ones, components that introduce new visual patterns that weren't approved, components that technically work but violate the system's behavioral logic. The design system becomes a catalogue rather than a system.
The governance models that are actually working involve tiered review: AI-generated components that are direct variants of documented originals (different size, different state, same intent) get lightweight review. Novel patterns — anything that introduces a new composition or a new interaction model — get full design critique. This requires clear criteria for which bucket a component falls into, which requires the semantic layer above.
Without it, every component is a judgment call, and you're back to a 30-week backlog.
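The tiered model can be stated as a mechanical triage rule. A sketch, assuming each generated component carries an intent record like the one described above — the types and flags are hypothetical, not any vendor's API:

```typescript
// Hypothetical triage for AI-generated components: direct variants of a
// documented original get lightweight review; anything introducing a new
// composition or interaction model goes to full design critique.
type ReviewTier = "lightweight" | "full-critique";

interface GeneratedComponent {
  baseComponent: string | null; // documented original it derives from, if any
  sameIntent: boolean;          // matches the original's documented purpose
  newInteractionModel: boolean; // introduces an undocumented interaction
  newComposition: boolean;      // introduces an undocumented composition
}

function triage(c: GeneratedComponent, documented: Set<string>): ReviewTier {
  const isDirectVariant =
    c.baseComponent !== null &&
    documented.has(c.baseComponent) &&
    c.sameIntent &&
    !c.newInteractionModel &&
    !c.newComposition;
  return isDirectVariant ? "lightweight" : "full-critique";
}

const documented = new Set(["Card", "Modal"]);

// A larger Card with the same intent: lightweight review.
const sizeVariant = triage(
  { baseComponent: "Card", sameIntent: true,
    newInteractionModel: false, newComposition: false },
  documented,
); // "lightweight"

// A Modal variant adding a new interaction model: full critique.
const novelPattern = triage(
  { baseComponent: "Modal", sameIntent: true,
    newInteractionModel: true, newComposition: false },
  documented,
); // "full-critique"
```

Every input to this function comes from the semantic layer; without documented intent, `sameIntent` is unanswerable and everything falls through to full critique — which is the 30-week backlog again.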
What Retrofitting Looks Like
For teams with an existing design system, adding semantic documentation retroactively is not a glamorous project. It's a component-by-component documentation pass: for each component, write a short paragraph answering three questions. What problem does this component solve? In what contexts should it be used? In what contexts should it be avoided?
That's the minimum. Better is a structured format that also captures: what it is commonly confused with, what the tradeoffs are between this component and its alternatives, and what the user's mental model should be when they encounter it.
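A filled-in record under that richer format might look like the following — the component and wording are invented for illustration, not taken from any real system:

```typescript
// Hypothetical documentation record answering the three minimum questions
// (problem, use-in, avoid-in) plus the richer fields: confusions,
// tradeoffs, and the user's mental model.
const modalDoc = {
  component: "Modal",
  problem: "Force a decision before the user can continue",
  useIn: ["destructive-action confirmation", "blocking errors"],
  avoidIn: ["secondary content that belongs in a drawer"],
  confusedWith: ["Drawer", "Popover"],
  tradeoffs:
    "Interrupts flow; a Drawer preserves context but is easier to dismiss",
  mentalModel:
    "The app is asking me something it cannot proceed without",
};
```

Each field maps to a question a reviewer (or a generation tool) would otherwise have to guess at.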
This is design archaeology: surfacing knowledge that exists distributed across designers' heads and archived Slack threads and turning it into artifact-level documentation. It takes time. For a design system with 80 components, a serious team can complete this in four to six weeks if it's prioritized. Most teams aren't prioritizing it because the AI tools work well enough without it — until they don't.
The teams that have done this work describe a compounding benefit: the documentation doesn't just improve AI generation. It improves onboarding for new designers, reduces design review cycles, and cuts the number of "is this the right component for this?" conversations that slow down product work. The semantic layer was always worth building. AI just made the cost of not having it visible.
Building New Systems With AI in Mind
For teams starting a design system now, or doing a major revision, the semantic layer isn't a retrofit — it's a design constraint. Document intent alongside appearance from the start. Every component gets a purpose statement before it gets variants. The question "what is this for?" gets answered before "what does this look like?"
This is a different design process than most teams are used to. The traditional flow moves from visual to documentation; the AI-ready flow treats documentation as part of the design. It feels slower at the component level and dramatically faster at scale — because the AI has something real to work from.
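The purpose-before-variants constraint can even be checked mechanically, for example as a CI lint over component definitions. A sketch under the assumption that each component ships a metadata record — the rule and shapes are hypothetical, not an existing linter:

```typescript
// Hypothetical lint rule: a component may not declare variants until it has
// a non-empty purpose statement. Encodes "what is this for?" before
// "what does this look like?".
interface ComponentDef {
  name: string;
  purpose?: string;
  variants: string[];
}

function lintPurposeFirst(c: ComponentDef): string[] {
  const errors: string[] = [];
  if (!c.purpose || c.purpose.trim() === "") {
    errors.push(`${c.name}: missing purpose statement`);
  }
  if (errors.length > 0 && c.variants.length > 0) {
    errors.push(`${c.name}: variants declared before purpose`);
  }
  return errors;
}

// Fails: variants exist but no purpose was written.
lintPurposeFirst({ name: "Tag", variants: ["small", "large"] });

// Passes: purpose stated before variants were added.
lintPurposeFirst({
  name: "Tag",
  purpose: "Label an item with a user-visible category",
  variants: ["small", "large"],
});
```

A check this small won't judge whether a purpose statement is any good — that's still design critique — but it makes skipping the statement impossible rather than merely discouraged.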
The designers who've made this shift describe a secondary benefit: it makes them more intentional about what they build. When you have to write "use this when the user needs to take an action on a discrete item — not when you're presenting information," you think harder about whether you actually need both a card and a list item, or whether one of them is redundant.
The System You're Actually Building
More components isn't more system. A design system is a set of shared decisions — about what problems to solve, how to solve them consistently, and when each solution applies. AI can generate components at a rate that outpaces the rate at which teams can make those decisions. The result is a library that grows and a system that fragments.
The teams navigating this well have stopped thinking about AI generation as a speed tool and started thinking about it as a scale tool that requires the infrastructure to support scale. That infrastructure is documentation: specific, intentional, written to answer the questions a machine can't answer from visual examples alone.
The design system that can generate 800 components in a week is a capability. Whether it remains a system depends entirely on the work that was done before the generation started.
Photo credit: Jakub Zerdzicki via Pexels.