Apathy, ignorance, and obviousness can pose problems for well-meaning research findings
A few days ago, I was reading an insightful blog post about product detail pages written by a researcher at a well-known e-commerce research company. The post was a comprehensive lineup of results and conclusions suggesting that e-commerce sites would be more likely to convert shoppers into customers if they included both graphics and text on their product detail pages, formatted in particular ways for maximum readability and comprehension.
The concept was simple: if shoppers can better grasp the item they are trying to evaluate, then the right combination of pictures and descriptive words can answer questions buyers didn't even know they had about the product, thereby alleviating the stress of purchasing it. If consumers are confident that any major concerns have been addressed and are no longer blocking them, they will be more likely to act on a purchase.
This is a valuable finding and one that deserves a top spot in the e-commerce research domain. Right? I mean, wow, what a great tidbit to know that the particular way your website illustrates a product using a combination of pictures and text influences a shopper’s likelihood to buy! Researchers would rejoice at such findings. The big shocker, however, was not from this blog post itself, but from a lonely comment at the end of the post, from an anonymous person, that read: “OMG this is so dumb, you don’t need research to know this.”
Having a research-minded heart, I felt a twinge of sadness reading that comment. But I also identified with it in a very human way: sometimes it does seem like you don't need research to know something. And to the layperson, who hasn't done the research, it can appear as if the researcher has just regurgitated some common-sense ("oh, duh") fact to the world that is not earth-shattering, progressive, or even very interesting. Unfortunately, it is hard to overcome the apparent obviousness of some research findings and conclusions, especially when they cover topics that laypersons are very familiar with, such as e-commerce sites.
“OMG this is so dumb, you don’t need research to know this.”
So here's the thing: there's a disconnect between what non-researchers expect from a research effort and what researchers put forth to the world. When researchers learn how to do good science, we are taught to design and implement rigorous studies and control variables, to recruit participants and collect data without bias, and then to analyze that data and discuss our findings as if we were speaking to fellow academics or researchers.
Rarely do we get the opportunity to speak as if we're talking to someone unaffiliated with the project. Instead, we use big words, turn them into acronyms, and get lots of praise for telling people we are pushing the boundaries of what has been done before. All of that is fine and dandy when your audience is academic; however, bridging the scientist-practitioner gap is not so straightforward when you are presenting research to the public.