Operational efficiency and operational efficacy: defining DesignOps’ metrics

As an efficiency-focused transformational function, DesignOps needs to be able to quantify and measure impact and results. Identifying the right goals and defining the right KPIs are therefore essential steps for every DesignOps strategy and roadmap to ensure impact and progress can be objectively assessed.

Patrizia Bertini
UX Collective

--

DesignOps metrics are divided into two main types: Operational Efficiency (processes and resources) and Operational Efficacy (behaviours). (P/Bertini 2021)

A DesignOps strategy always begins with an assessment of the status quo, which includes a measurement of all operational dimensions to allow a clear definition of the actions needed. Coherent, quantifiable metrics are a key part of this initial assessment: being able to understand and quantify design teams’ operational capacity is essential to developing a roadmap and a successful execution plan.

For this reason, Peter Drucker’s quote “If you can’t measure it, you can’t improve it” could be considered DesignOps’ natural motto: to have a positive impact on Design’s operational and spending results, it is important to have an in-depth understanding of which dimensions are underperforming, the systemic impact of these inefficiencies, and a definition of credible and achievable goals.

Metrics and KPIs are DesignOps’ operational tools: if inefficiencies cannot be quantified, no action or intervention can deliver improvement, because it means the problem and the inefficiencies have not been properly thought through.

How to get started with Metrics? Start with clearly defined goals!

Improving efficiencies means knowing how much effort is invested into a specific task/action/step to deliver a specific output.

Efficiency is not a qualitative, subjective estimate, and it is not measured by adjectives or verbal descriptors. Efficiency is measured by numbers, percentages, and ratios that provide an objective measurement of the status quo, where impact and results are measured by percentage variations and relative deltas.
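Concretely, the basic unit of measurement is the relative delta between a baseline and a current value. A minimal sketch in Python, with hypothetical figures:

```python
# The basic unit of efficiency measurement: the relative delta between a
# baseline and a current value. Figures are hypothetical.

def relative_delta(baseline: float, current: float) -> float:
    """Relative change from baseline, e.g. -0.20 means a 20% reduction."""
    return (current - baseline) / baseline

# Example: research lead time drops from 20 working days to 15.
delta = relative_delta(baseline=20, current=15)
print(f"Lead time change: {delta:+.0%}")  # -> Lead time change: -25%
```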

This means that a strategy built on a broad, vague statement such as
“Improving teams’ workflows” is meaningless for two reasons:

  • it does not identify the inefficiency and does not describe which aspects need to be improved: it is not possible to define which elements of the workflow are currently underperforming, and therefore it is not possible to define an intervention;
  • it lacks any quantifiable metric that explicitly defines what success looks like. Without a categorical, measurable impact, success cannot be objectively evaluated.

A well-formed strategic DesignOps goal reads more like: “Reduce research lead time by 20% by streamlining the participants’ recruitment process.”

The statement above provides clarity on what success looks like and which areas of intervention need to be tackled. In detail, this statement is effective for the following reasons:

  • it clearly defines the goal (Reduce lead time)
  • it specifies the target area (Research)
  • it names the aspect that is currently causing the biggest inefficiencies (Participants’ recruitment process)
  • it provides an unambiguous and quantifiable measurement of what success looks like (20% variation).

The statement deliberately does not mention actions and plans: the goal defines the impact, while the actions belong to the impact-based roadmap formulation that follows from it.

The goal statement needs to outline an unambiguous hypothesis about the operational aspects that are showing inefficiencies, and it needs to set credible improvements in a way that can be objectively measured and quantified.
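To make the four properties listed above operational, a goal can be captured as structured data with a built-in success check. A minimal sketch, with hypothetical names and values:

```python
from dataclasses import dataclass

# Each field maps to one of the four properties of a well-formed goal.
# All names and values are illustrative, not a prescribed schema.

@dataclass
class OpsGoal:
    metric: str          # what is measured (e.g. research lead time)
    target_area: str     # where the inefficiency sits (e.g. Research)
    intervention: str    # the aspect being streamlined
    target_delta: float  # quantified success, e.g. -0.20 = reduce by 20%

    def achieved(self, baseline: float, current: float) -> bool:
        delta = (current - baseline) / baseline
        # Reduction goals (negative target) are met at or below the target;
        # growth goals (positive target) are met at or above it.
        return delta <= self.target_delta if self.target_delta < 0 else delta >= self.target_delta

goal = OpsGoal(
    metric="research lead time",
    target_area="Research",
    intervention="participants' recruitment process",
    target_delta=-0.20,
)
print(goal.achieved(baseline=20, current=15))  # True: -25% beats the -20% target
```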

How to identify Efficiencies: the beneficiaries and the dimensions

If DesignOps is a transformational function that operates to maximise the value of Design by generating quantifiable value for all of Design’s stakeholders, then it is essential to identify the beneficiaries and stakeholders who will be affected the most.

DesignOps serves and delivers value to three main stakeholders and beneficiaries:

DesignOps’ three key beneficiaries: the Business, the Design Lead, and the Design Team. (P/Bertini 2020)
  • The Design Teams: they benefit from improved E2E design processes, harmonised and integrated tools and processes, and increased operational efficiencies that make their day-to-day work seamless and more focused.
  • The Design Leaders: they can focus on the what and on the product/service strategy, while being able to maximise results and impact on outward-looking metrics (see here).
  • The Business and the Organisation in general, which can increase its competitive advantage and ROI.

To define the metrics and the expected impact, it is important to begin by identifying the beneficiaries of the specific intervention: who is suffering the most from the current operational model? What inefficiencies are caused by those pain points? How does this pain propagate to other stakeholders? These questions look at the status quo from a user-impact perspective and help assess the beneficiaries and the impact.

The other aspect to consider is DesignOps’ efficiency dimensions:

DesignOps efficiency dimensions: Time, Cost/Resources, Quality/Scope. (P/Bertini 2020)

The image above shows the so-called iron triangle widely used in project management: these three dimensions — Time, Quality/Scope, Cost/Resources — define key measurable aspects of each initiative and process.
Which of these three dimensions is underperforming? For example:

  • Is delivery slow?
  • Is the quality of deliverables below expectations, requiring multiple iterations and reviews?
  • Are the tools the company has invested in incapable of fully supporting the E2E design process? And what is the ROI of those tools?
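These questions translate into threshold checks on measurable metrics. A minimal sketch, where all metric values and thresholds are hypothetical:

```python
# Flagging which iron-triangle dimension underperforms.
# All metric values and thresholds are hypothetical.

current = {
    "time":    {"e2e_delivery_days": 30},
    "quality": {"review_iterations": 5},
    "cost":    {"tool_roi": 0.4},  # value delivered per unit of tool spend
}

# Each dimension passes if its metric meets the (hypothetical) threshold.
thresholds = {
    "time":    lambda m: m["e2e_delivery_days"] <= 20,
    "quality": lambda m: m["review_iterations"] <= 2,
    "cost":    lambda m: m["tool_roi"] >= 1.0,
}

underperforming = [dim for dim, check in thresholds.items() if not check(current[dim])]
print(underperforming)  # -> ['time', 'quality', 'cost']
```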

Merging the triangles to define the impact

Combining dimensions and beneficiaries: each beneficiary can experience benefits in each dimension. (P/Bertini 2020)

Once DesignOps has identified the beneficiaries and the efficiency dimensions, the two triangles can be combined to assess the areas and aspects that can be improved.

Each beneficiary can in fact be impacted in each dimension: Design Teams (DT in the figure) can increase the quality of outcomes, which reduces the number of reviews or iterations, increasing speed to delivery (Time).

Or the Design Lead (DL) can gain time for innovation thanks to the ability to execute ordinary, business-as-usual (BAU) tasks at speed, the result of a more mature ambidexterity, which can generate competitive advantage and better user experiences (Quality).
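One way to picture the merged triangles is a beneficiary-by-dimension map, where each populated cell names the expected benefit. A minimal sketch; the entries are hypothetical examples, not a fixed taxonomy:

```python
# A beneficiary-by-dimension impact map: each populated cell names the
# expected benefit. Entries are hypothetical examples, not a fixed taxonomy.

beneficiaries = ["Design Team", "Design Lead", "Business"]
dimensions = ["Time", "Quality", "Cost"]

impact = {
    ("Design Team", "Quality"): "fewer review iterations",
    ("Design Team", "Time"):    "faster delivery",
    ("Design Lead", "Time"):    "more time for innovation (BAU executed at speed)",
    ("Business", "Cost"):       "higher ROI and competitive advantage",
}

for b in beneficiaries:
    for d in dimensions:
        if (b, d) in impact:
            print(f"{b} x {d}: {impact[(b, d)]}")
```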

Yet DesignOps, due to its transformational nature and its focus on change management, requires not one but two sets of complementary, mutually influencing metrics: Operational Efficiency and Operational Efficacy.

Operational Efficiency and Operational Efficacy: two sides of the same coin

Operational Efficiency focuses on how resources — time, money, people — are utilised, and it defines the opportunities to maximise the organisation’s return on investment. Operational Efficiency focuses on processes and involves accomplishing a task with minimum expenditure of time and effort. In a sentence, Operational Efficiency can be defined as doing things in the right way.

Operational Efficiency metrics therefore revolve around process-related aspects, and some core KPIs include:

  • Tools’ ROI (Cost)
  • Testing and prototyping lead time (Time)
  • Number and type of quality reviews (Quality)
  • Team productivity (Resources)
  • E2E delivery time (Time)
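Two of these KPIs can be computed directly from raw inputs. A minimal sketch, with hypothetical figures:

```python
from datetime import date

# Computing two of the KPIs above from raw inputs; all figures are hypothetical.

def tool_roi(value_generated: float, total_cost: float) -> float:
    """Classic ROI: net return over cost."""
    return (value_generated - total_cost) / total_cost

def e2e_delivery_days(kickoff: date, shipped: date) -> int:
    """E2E delivery time as elapsed calendar days."""
    return (shipped - kickoff).days

print(f"Tool ROI: {tool_roi(150_000, 100_000):.0%}")           # -> Tool ROI: 50%
print(e2e_delivery_days(date(2021, 3, 1), date(2021, 4, 12)))  # -> 42
```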

Operational Efficacy focuses on soft metrics that assess the inner qualities of both the teams and the execution of a design thinking approach.
These metrics weigh behaviours that have a direct impact on operational efficiency. In a sentence, Operational Efficacy can be defined as doing the right things: promoting actions and processes that add value to the team, the customer, and the organisation.

Some Operational Efficacy metrics include:

  • Empathy and ongoing user engagement
  • Tools’ engagement and utilisation metrics
  • Ideation and experimentation cycle times
  • Composition of teams’ skills and skills’ distribution
  • Perceived value of design teams by cross-functional partners
  • Employee satisfaction and retention
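As with efficiency, efficacy metrics still reduce to numbers. A minimal sketch of the tools’ engagement metric, with hypothetical user counts and seat numbers:

```python
# Quantifying one soft metric: tools' engagement and utilisation.
# User counts and seat numbers are hypothetical.

def engagement_rate(weekly_active_users: int, licensed_seats: int) -> float:
    return weekly_active_users / licensed_seats

before = engagement_rate(12, 60)  # before the intervention
after = engagement_rate(25, 60)   # after the intervention
uplift = (after - before) / before
print(f"Engagement: {before:.0%} -> {after:.0%} ({uplift:+.0%})")
# -> Engagement: 20% -> 42% (+108%)
```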

These two types of metrics mutually influence each other in a loop, where poor operational efficiency affects efficacy (behaviours) and vice versa.

Operational Efficiency: an example

When it became clear that designers were wasting one week per month on non-design tasks, such as sourcing research participants to run rapid testing sessions, it was evident that the lack of a streamlined process was causing major issues, both from an operational and a behavioural point of view.

Those extra five days a month meant that designers were either working longer hours, with an impact on job satisfaction and work/life balance, or working with less accuracy, with an impact on the quality of deliverables or the number of iterations required.

It also meant that designers looked for shortcuts that went against UX best practices (talking to the same friendly participants) or against local regulations on data protection, exposing the whole organisation to possible risks.

The Operational Inefficiencies included:

  • COST: high recruitment costs ($100–$360 per recruit, excluding incentives)
  • TIME: slow lead time (2–4 weeks)
  • RESOURCES: Teams were working 1 extra week per month
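A back-of-the-envelope sketch shows how the RESOURCES line can be turned into an annual figure; the team size and day rate below are hypothetical assumptions:

```python
# Turning the RESOURCES inefficiency into an annual figure.
# Team size and loaded day rate are hypothetical assumptions.

designers = 10
lost_days_per_month = 5  # the "1 extra week per month" above
loaded_day_rate = 500    # fully loaded cost per designer-day, in $

annual_cost = designers * lost_days_per_month * 12 * loaded_day_rate
print(f"Annual cost of non-design work: ${annual_cost:,}")  # -> $300,000
```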

But these operational inefficiencies had an impact on behaviours (Operational Efficacy), such as:

  • Finding shortcuts that can impact the quality of the outcome
  • Reducing the number of sessions or testers, with an impact on the quality of the outcome or an increase in the number of iterations required, causing delays

The goals:

  • Reduce research participants’ sourcing lead time by 30% by streamlining the process
  • Increase the number of research participants by 35%
  • Increase the frequency and number of tests by 25% by making the process effortless
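These goals can be checked against measured results with the same relative-delta logic used throughout. A minimal sketch, with hypothetical baselines and measurements (the participants figures are chosen to be consistent with the +300% reported in the results below):

```python
# Checking measured results against the stated targets.
# Baselines and measurements are hypothetical; the participants figures are
# chosen to be consistent with the +300% reported in the results below.

targets = {
    "sourcing_lead_time": -0.30,  # reduce by 30%
    "participants":       +0.35,  # increase by 35%
    "test_frequency":     +0.25,  # increase by 25%
}

measured = {                          # (baseline, after)
    "sourcing_lead_time": (20, 8),    # days
    "participants":       (50, 200),  # per quarter
    "test_frequency":     (4, 9),     # sessions per month
}

for metric, (baseline, after) in measured.items():
    delta = (after - baseline) / baseline
    target = targets[metric]
    met = delta <= target if target < 0 else delta >= target
    print(f"{metric}: {delta:+.0%} (target {target:+.0%}) -> {'met' if met else 'missed'}")
```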

The Results:

By solving those operational inefficiencies, there was an impact on efficacy and behaviours: the reduced effort and lead time generated a +300% increase in the number of research participants, which improved the overall quality of outcomes, improved work/life balance, and reduced churn. The reduced costs (45% on average) freed up budget for tools and improvements that streamlined the process and empowered designers to work at speed with quality.

Operational Efficacy: an example

Design teams were feeling uncomfortable with analytics and felt that their ability to influence decisions was limited (more on this case here): designers’ focus on empathy and design thinking was not consistently complemented by a data-driven approach.

The analysis of the teams’ skills distribution and confidence showed a clear gap in the data and analytics area: the large majority of designers were aware of this gap and asked for support, as they could see how their limited confidence with data and analytics tools was affecting the relationship with cross-functional partners and Design’s role.

These operational inefficacies included:

  • Delays in getting data, as data sourcing was outsourced, with an impact on project delivery times (Operational Efficiency) and on the quality of decisions
  • Access to data sets was provided by a third party, hence data were implicitly selected and shared by the provider, influencing data interpretation and decision making (quality of product), with potentially more efficiency-related issues due to poor decisions
  • The lack of confidence in analytics tools caused low engagement with the tools, generating an extremely low ROI for analytics tools.

How behaviours generate and compound operational inefficiencies is therefore clear.

The Goals:

  • Increase analytics tools’ engagement by 20% through training
  • Increase the number of analytics projects by 35%
  • Increase confidence in data and analytics by 20%

The results:

A +110% increase in engagement with data analytics tools also impacted engagement and relationships with cross-functional partners. Designers showed a +29% increase in confidence in using analytics tools and a +10% increase in confidence in data analysis and analytics within 3 months. This increased the ROI of the tools, with a positive impact on spending metrics and overall efficiency.

Final remarks

DesignOps’ value lies in its ability to deliver value at scale by fostering behaviours that impact overall efficiency.

This article is an extended version of the keynote talk for UXIstanbul 2021 — slides available here.
