The ROI Engine
EP08 — "Teams seem happier" has never once defended a design system budget. Here's how to measure the value of a design system in language that survives a leadership meeting.
At some point, every design system team faces the same meeting.
A reorganization. A budget review. A new VP of Engineering who wants to understand what exactly the design system team produces and why it requires dedicated headcount. The question is polite. The subtext is not.
In most organizations I have worked in, designers are classified as complementary. A cost center. A support function. That classification has a consequence most design leaders never fully account for: it means designers are structurally excluded from budget meetings. The conversations where the system's value gets questioned happen in rooms the design team is not invited into.
So you go undercover. You find someone who was in the room. You piece together what got said and try to prepare an answer to a question you were never officially asked. This is not a communication problem. It is a classification problem. And it will not be solved by better slide decks.
"Teams seem happier" is not an answer to that question. It is, however, the most common one given. Which is why design system teams lose headcount arguments they should have won.
The Measurement Problem
Design systems are hard to measure because they are infrastructure. And infrastructure is only noticed when it fails.
When the design system works — developers implement faster, designers work with consistent constraints, QA finds fewer UI defects. Nobody attributes this to the design system. They attribute it to a good quarter.
When the design system fails — components drift, implementation takes twice as long, the brand refresh causes three weeks of rework. This gets attributed to the design system immediately.
Invisible when working. Visible when broken. This is the infrastructure paradox. And it is why design system teams never get credit for the problems that didn't happen.
The solution is not to wait for the system to fail and point at the cost. The solution is to build the measurement model before anyone asks for it.
If you cannot measure your design system's value, you are not running a product. You are running a faith-based initiative.
The Four Metrics That Actually Work
Not all metrics are equal. Some are easy to collect and meaningless to leadership. Some are hard to collect and immediately understood. The goal is the second category.
Metric 1 — Implementation Velocity
How long does it take an engineering squad to implement a new UI pattern using the design system versus building it from scratch?
This requires a baseline. Run the measurement before a squad adopts the system. Run it again six months after. The delta is your velocity gain. Express it in engineering hours. Multiply by hourly cost if leadership responds to financial framing.
A 30% reduction in implementation time is not a design metric. It is an engineering efficiency metric. Leadership understands engineering efficiency metrics.
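The arithmetic is simple enough to script. Here is a minimal sketch in Python, assuming hypothetical before-and-after timings pulled from your project tracker; the hours and the hourly rate are placeholders, not benchmarks:

```python
# Hypothetical implementation timings in engineering hours for comparable
# UI patterns: pre-adoption baseline vs. six months post-adoption.
BEFORE_HOURS = [40, 56, 32, 48]
AFTER_HOURS = [28, 36, 24, 30]
HOURLY_COST = 95  # fully loaded engineering cost; a placeholder, not a benchmark

baseline = sum(BEFORE_HOURS) / len(BEFORE_HOURS)   # 44.0 hours per pattern
current = sum(AFTER_HOURS) / len(AFTER_HOURS)      # 29.5 hours per pattern
gain = (baseline - current) / baseline

print(f"Velocity gain: {gain:.0%}")                          # 33%
print(f"Hours saved per pattern: {baseline - current:.1f}")  # 14.5
print(f"Per-pattern financial framing: ${(baseline - current) * HOURLY_COST:,.0f}")
```

The only methodological trap is comparing unlike patterns. Measure the same class of UI work before and after, or the delta measures scope, not the system.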
Metric 2 — UI Defect Rate
Filter your bug tracker for UI-related defects. Compare the rate for squads that use the design system versus squads that don't. Compare the rate for the same squad before and after adoption.
UI defects have a cost: QA time, developer time, release delays. If design system adoption reduces UI defects by 25%, the defects that no longer happen have a measurable cost attached. That avoided cost is your ROI.
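Same pattern, scripted. A sketch assuming defect counts exported from your bug tracker; the counts and the cost-per-defect figure are illustrative, and you should derive your own from QA and developer hours:

```python
# Hypothetical quarterly UI defect counts for one squad, before and after
# design system adoption. COST_PER_DEFECT bundles QA triage, developer fix
# time, and release overhead; the figure here is illustrative only.
DEFECTS_BEFORE = 32
DEFECTS_AFTER = 24
COST_PER_DEFECT = 450

reduction = (DEFECTS_BEFORE - DEFECTS_AFTER) / DEFECTS_BEFORE
avoided = DEFECTS_BEFORE - DEFECTS_AFTER

print(f"UI defect reduction: {reduction:.0%}")                    # 25%
print(f"Quarterly cost avoided: ${avoided * COST_PER_DEFECT:,}")  # $3,600
```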
I once watched a design system get credited retroactively during a framework migration from Angular to React. Nobody had asked the design system to account for the transition. It just turned out that having a shared component architecture made the migration significantly less painful for the squads that had adopted it. The teams that hadn't adopted it rebuilt their UI from scratch.
That value was real. It was never tracked. It appeared in a migration report as a footnote rather than a headline because nobody had built the measurement model in advance to capture it.
Metric 3 — Adoption Rate
What percentage of product squads actively use the design system in production? Not in Figma. In production.
This is the headline metric. Everything else explains why the headline is moving in the direction it is. If adoption is growing, the other metrics tell you why. If adoption is stalling, the other metrics tell you where.
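"In production" is verifiable if the system ships as a package, which makes this metric harder to argue with than self-reported adoption. A sketch of one way to measure it, assuming each squad's frontend repo is checked out under a shared root and consumes the system via package.json; the repo root and package name are placeholders:

```python
import json
from pathlib import Path

# Assumptions: squad repos live under REPOS_ROOT, and production use means
# the design system package appears in package.json dependencies.
# "@acme/design-system" is a placeholder name.
REPOS_ROOT = Path("/srv/repos")
DS_PACKAGE = "@acme/design-system"

def uses_design_system(repo: Path) -> bool:
    manifest = repo / "package.json"
    if not manifest.exists():
        return False
    data = json.loads(manifest.read_text())
    deps = {**data.get("dependencies", {}), **data.get("devDependencies", {})}
    return DS_PACKAGE in deps

repos = [p for p in REPOS_ROOT.iterdir() if p.is_dir()]
adopters = [r.name for r in repos if uses_design_system(r)]
rate = len(adopters) / len(repos) if repos else 0.0
print(f"Adoption rate: {rate:.0%} ({len(adopters)}/{len(repos)} squads)")
```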
Metric 4 — Component Detach Rate
What percentage of design system components get detached from the library in production use?
A high detach rate is not a user preference. It is a product signal. It means the component doesn't fit the use case it was built for. Every detached component is a feature request that was never formally submitted.
The detach rate is the most honest feedback your design system will ever receive. It does not wait for a survey.
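If your tooling can export per-component attach and detach counts, the math is a loop. A sketch over hypothetical audit data; the counts and the flagging threshold are assumptions, not recommendations:

```python
# Hypothetical per-component usage audit, e.g. exported from design tooling
# analytics. "attached" = instances still linked to the library;
# "detached" = instances that were unlinked in production files.
audit = {
    "Button":     {"attached": 840, "detached": 12},
    "DataTable":  {"attached": 95,  "detached": 61},
    "DatePicker": {"attached": 130, "detached": 48},
}
FLAG_THRESHOLD = 0.20  # arbitrary cutoff; tune to your own tolerance

def detach_rate(counts: dict) -> float:
    total = counts["attached"] + counts["detached"]
    return counts["detached"] / total

for name in sorted(audit, key=lambda n: detach_rate(audit[n]), reverse=True):
    rate = detach_rate(audit[name])
    flag = "  <- treat as a feature request" if rate > FLAG_THRESHOLD else ""
    print(f"{name:<12} detach rate {rate:.0%}{flag}")
```

Sorted this way, the top of the list is your backlog.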
How to Present the Numbers
Metrics without a narrative are just numbers. The narrative is: here is what the design system was costing the organization before, here is what it costs now, and here is the delta.
One page. Three numbers. One trend line. This is the design system ROI summary that survives a leadership meeting.
Do not lead with adoption rate. Lead with implementation velocity or defect reduction. These are numbers that mean something to an engineering VP who has never opened your Figma file.
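Once the three numbers exist, the one-pager is a formatting exercise. A sketch that renders it as plain text; every figure here is a placeholder wired to nothing, and the ordering deliberately puts adoption last:

```python
# Placeholder figures; wire these to the metric scripts above.
summary = {
    "Implementation velocity": ("+33%", "faster UI pattern delivery"),
    "UI defect rate":          ("-25%", "fewer UI defects per quarter"),
    "Adoption":                ("68%",  "of squads shipping with the system"),
}
trend = [41, 48, 55, 61, 68]  # adoption by quarter: the one trend line

print("DESIGN SYSTEM ROI SUMMARY\n")
for metric, (value, meaning) in summary.items():
    print(f"  {value:>6}  {metric}: {meaning}")

# A leadership one-pager does not need a charting library; a sparkline will do.
blocks = "▁▂▃▄▅▆▇█"
lo, hi = min(trend), max(trend)
span = (hi - lo) or 1
spark = "".join(blocks[int((v - lo) / span * (len(blocks) - 1))] for v in trend)
print(f"\n  Adoption trend: {spark}  ({trend[0]}% -> {trend[-1]}%)")
```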
I have watched an executive highlight a design system initiative in an annual earnings call as a strategic win he had driven. He had funded it after hearing about it somewhere. The team that built it was not mentioned. The credit followed the money, not the work. That is the default outcome when the team that builds the system does not build the story around it first.
You cannot claim credit for value you never measured. Build the measurement model before someone else builds the narrative.
The Quick Win: Start Measuring Now
Pick one squad that adopted the design system in the last six months. Pull three data points: implementation time for a recent UI pattern, number of UI defects in the last quarter, number of design review cycles for the last feature.
Compare against the six months before adoption.
"If the numbers moved in the right direction — you have your first ROI data point. If they didn't — you have your first product problem to solve."
Both outcomes are useful. Only one of them gets presented in a budget meeting. Make sure you know which one you have before the meeting starts.
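The whole quick win fits in a dozen lines. A sketch with placeholder numbers for the three data points; lower is better for all three:

```python
# Before/after comparison for one squad across the three quick-win data
# points. All numbers are placeholders for your own pulls.
quick_win = {
    "implementation hours (last UI pattern)": (44, 30),
    "UI defects (last quarter)":              (32, 24),
    "design review cycles (last feature)":    (5, 3),
}

for metric, (before, after) in quick_win.items():
    delta = (after - before) / before
    verdict = "ROI data point" if delta < 0 else "product problem to solve"
    print(f"{metric:<42} {before:>3} -> {after:<3} ({delta:+.0%}, {verdict})")
```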
The Argument
The design system that cannot prove its value will eventually be asked to prove its existence. Build the measurement model before that conversation arrives. Because by the time it arrives, it is usually too late to start collecting data.
Disagree loudly. Inspire boldly.
And remind me — falsely if needed — that somewhere a design system ROI report exists and leadership actually read it before the budget meeting.
If your design system disappeared tomorrow, could you quantify what the organization would lose?
Next in Season 2: Internal Customers →