Don’t only deliver a long report

No one is going to read the whole thing if that’s the only deliverable you produce. In addition to describing your detailed protocol and results, create a “one-sheeter” that pulls everything together. This way people can start with the high-level results and, if they want to drill down into the details, they have the other resource to do so.

Bonus points if you include clear recommendations derived from these findings, and even more if you include a way to implement them. This makes your research actionable.

Don’t forget visuals

Business, technical, and managerial roles all speak a similar language. They are fluent in tracking metrics (productivity, revenue, and change) and do so in standardized ways called “KPIs,” or Key Performance Indicators. The people in these roles tend to be well versed in deciphering trend lines, graphs, and charts. Translate your research findings, whether they are quantitative ratings from a survey or qualitative themes from interviews, into the visual formats your audience already knows. Humans are visual creatures, so when communicating abstract trends or insights, pull your findings into a clear visual format.
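As a minimal sketch of that translation step (the survey questions and ratings here are entirely hypothetical), you might aggregate raw responses into per-question averages, the kind of summary a KPI dashboard expects, before handing the result to whatever charting tool your stakeholders already use:

```python
from statistics import mean

# Hypothetical survey responses: question -> list of 1-5 ratings.
responses = {
    "Ease of navigation": [4, 5, 3, 4, 4],
    "Clarity of labels":  [2, 3, 2, 3, 2],
    "Overall value":      [5, 4, 5, 5, 4],
}

# Aggregate raw ratings into one average per question.
summary = {question: round(mean(scores), 1)
           for question, scores in responses.items()}

# A plain-text bar per question, highest-rated first; swap this loop
# for matplotlib, Excel, or your BI tool in a real report-out.
for question, avg in sorted(summary.items(), key=lambda kv: -kv[1]):
    print(f"{question:20s} {'#' * int(avg * 2)} {avg}")
```

The point is not the tooling; it is that stakeholders see one familiar number per question instead of a spreadsheet of raw responses.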

Don’t just email your findings out

Set up a “report out” meeting, either in person or using your favorite teleconferencing software. Share your screen, put up your research summary and visuals, and review the highlights of the study. Q&A can happen in real time as you walk through the research, so your audience won’t be left alone to interpret any part of your process or the results.

Get the most important stuff out there first: 1) why you ran the study, 2) what the results were, and 3) how you did it. Those are the top-priority items to go over with stakeholders, in that order, to get everyone on the same page.

Don’t undermine your qualitative findings

Especially when presenting to quant-heavy thinkers, such as business folks, scientists, developers, and leadership, I defend qualitative insights. When a survey comes back with 50 responses containing open-ended feedback comments, the common counter is that those comments are not statistically significant. That may be true from a data-analytics perspective, where sample size relative to population size determines the confidence you can place in a statistical decision, but qualitative research is not meant to fill that need.

If you run an optimization test and see that one design wins over another with a high level of significance, you are often still missing the “why.” Let’s say thousands of customers interacted with a new feature. Hooray! If we do not ask them why they did so, however, we won’t be able to repeat that success or scale it to other features. It is important to support the quantitative metrics by understanding the users’ thinking behind the interaction. Use a mixed method of quantitative and qualitative reporting, each complementing the other, when you communicate your results to audiences accustomed to reviewing quantitative data.

Don’t report only hyped-up metrics

You do yourself and your work a disservice if you only cater to the request of a stakeholder who isolates one metric from the context it lives within. To put it into perspective, let’s say that someone in leadership comes to you and wants to know the usability rating for a new design (how easy it is to navigate through something, or “use” it). If you do not also ask users how valuable the design is, the usability metric won’t matter: an easy-to-use design is worthless if users see no value in it.

As a representative championing the needs of customers, you must take on the role of a gatekeeper of how data is perceived. This means that you must include context for the research that you do. You are the expert tasked with getting the full picture of how people interact with experiences. Don’t leave out parts of the story simply because they weren’t asked of you.


