In the report “The Business Value of Design,” recently published by McKinsey & Company, there are four themes that are key for design leaders to focus on if they are committed to maximizing the benefits of design in their organizations:

  • Cross-functional talent
  • Continuous iteration
  • User experience
  • Analytical leadership

In many different engagements, I have interviewed designers who struggle because they have no data to work with, or because they are drowning in it. Some hesitate to bring these questions to their teams. Other designers have been exploring ways to measure the impact of design — whether we focus on the impact of our work as a UX team, or on the work we do as part of a product team and the organization. As designers, in our journey towards an analytical leadership mindset, we encounter some myths and we make some mistakes that we have to overcome.

Common myths about data-informed design

Here are some of the most common myths designers encounter when working with data.

Data means numbers

Rolling the behavior of hundreds, thousands, or millions of people into a single number is not always useful or reliable. Many times we are dealing with qualitative insights, and teams struggle to accept non-numeric information. Our product teams, business leaders, and even we ourselves often do not consider qualitative data measurable, so we ignore it.

As designers, we need to work with our teams to articulate the need to supplement the hard data we collect, which indicates what happened, with the soft data that answers the why question: What was the customer thinking? What motivated their behavior? We need to understand the user’s motivations and the needs behind their behavior.

Data is the objective truth

Quantitative data compiled by software rather than humans makes it seem like hard fact. However, the algorithms that collect data sets are created by humans, who interpret the results and assign meaning, and there is an inherent bias in that meaning. Big or small, no data is perfect. There are limitations and biases present in every type of data analysis. Design practitioners and leaders have the responsibility to minimize that bias as much as possible, or at least to identify it, describe it, and provide context.

Bigger is always better

‘Big Data’ does not have the power to reveal or predict behaviors about the people that use our products. We need to collaborate with Data Scientists to understand the data and draw connections between the behaviors and the quantitative information we receive from our customers. Data analytics will not tell us everything we need to know about user behavior. We have to define meaningful categories of metrics for our products that help us evaluate, understand, and keep track of actionable outcomes.

Data is for others (Managers, Developers, or Data Scientists), not for designers

It is tempting to look for data to prove or disprove decisions we make as individuals, teams, or organizations. However, we should not use data to pit designers against leadership or their teams. We should not treat data as the arbiter of who is right or wrong. We should leverage the data to learn, make improvements, and discover new possibilities together, as a team. We need to use data to help us tell the story of the people using the solutions and services we are putting out in the marketplace.

Data undermines our ability to innovate

In a recent interview for DesignImpact — Establishing a bond of trust to expand Design’s influence and impact — Sami Niemelä says that data-driven organizations “have their eyes in the rear-view mirror.” For some, data is seen as the antithesis of innovation because it is backward looking, it is tactical rather than strategic, and it skims the surface. While hard metrics are important, Niemelä considers that design practitioners need to bring back the soft metrics like design quality, empathy, customer feedback, and ethics.

The problem is not in the data itself, but in how we are using it. If we want to use data effectively to help us inform design decisions, we need to embrace the complexity of both quantitative and qualitative perspectives.

The myth of the right way to use data to inform design

Designers need to understand that there is no magic bullet. There is no single process or unique approach to working with data. Teams and organizations have to find their own approach in a manner that makes sense to them. A few keys to consider include:

  • Use data from a variety of sources to inform your design
  • Include numbers and context
  • Use data to track changes over time, explore new patterns, and dig deeper into problems
  • Decide on meaningful categories and metrics that help your team tell a story about the customer experience
  • Develop a way to share and discuss data in your organization, and start by defining the basics together with your team and your peers.

Common mistakes designers and organizations make when using data

Here are some of the most common mistakes designers, product teams and organizations make when working with data.

Using data to drive decisions, rather than inform decisions

Data alone may force a team to throw away a good idea or experiment at the first sign of trouble. Instead, being data-informed is about using the data we have and combining it with qualitative feedback, our own design intuition, our practices, and our experience. From that point, we experiment, learn, iterate, and validate the products and services we design, and we continuously deliver value to our customers and to the organization.

“The best teams understand which discovery tools to use, and when, and, most importantly, how to leverage the data they collect. That is the ultimate measure of the value of discovery work but it’s much harder to quantify.”

— Jeff Gothelf, Optimizing your team’s velocity (of learning)

Mistaking a vanity metric for a meaningful metric

Vanity metrics are often focused on the number of releases, product features, acquisition, and adoption, but they do not tell us anything about product quality or whether we are meeting customer experience needs. Vanity metrics measure activities that make us feel good but do not actually tell us if we are making progress. We need to be relentless in defining metrics that truly capture the value we are creating for people, and in measuring the impact design has on the organization.

Drowning in the data we collect

Some teams track and collect data that they do not use. Designers and product managers feel they are drowning in data; they feel overwhelmed and unprepared to understand it. Instead, product teams should clearly articulate what metrics may be useful to them based on the nature of the application or service and the business goals. Metrics should be based on and aligned with those specific product goals.

Exploring data before formulating our hypotheses

We have to resist the temptation of retrofitting our hypotheses to match the data we collect. As a team, we have to have a clear understanding of the riskiest hypotheses first, and then devise the experiments that will help us identify meaningful ways to validate and measure them. We cannot leave it to chance and determine whether or not we were successful only after our product is already out in the market. Our teams and our organization should have a clear picture of what the goal is, and of how we will measure whether our work and our products achieved it.

Embracing the opportunities to create business impact with design

We have the responsibility to elevate the UX practice in our organizations and measure the design impact. As design leaders, we need to overcome the myths and watch out for the mistakes we could make when working with data. We need to foster the cultural environment in our teams to involve customers in the design process, review and analyze the data we have and be prepared to act based on what we learn.




