Designers employ research methods to understand and improve experiences on the web. Researchers look for evidence of frustration in their users so their teams can better target and clear obstacles in the journey. With current methodologies, doing this well takes a lot of time: a 60-minute moderated usability study can take another 120 minutes to analyze and code. Given the sample sizes needed for statistical significance, robust research becomes cost-prohibitive for many teams. Instead, they adopt guerrilla techniques that are useful for uncovering only the most obvious of users’ problems.

Data installs the emotion chip to further his growth as an artificial lifeform. Affective Computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affective states. Image from Star Trek: The Next Generation.

We are wired to express our emotions with body language. Our posture, micro-expressions, and gestures all change depending on how we are feeling. Affective Computing uses sensors that understand body language to simulate empathy. Spend enough time watching people use web interfaces, and patterns of behavior indicating frustration begin to emerge. Some of these patterns, if properly understood and defined, could be detected by sensors that track user input. By using such sensors to automatically find patterns of frustration, we could better prioritize our research efforts on existing services.

How might we detect patterns of frustration in real-time?

To develop a sensor that can detect signs of frustration, we need to understand how strong emotions change the way people interact with devices. Attentional Control Theory can help us define the relationship between frustration and user input, and explore the connection between attention and action. From this body of work, we know that our minds pre-render a path from where our cursor currently is to where it should go, for the sake of efficiency.

Typically, we can ignore distracting stimuli and only compute paths to essential elements on a screen.

We lose our ability to control our attention as we dip into a negative emotional state. We shift from a goal-oriented approach to a stimulus-driven one. Suddenly, our minds begin to compute paths to unimportant elements; we become distracted and imprecise, and our cursors begin to dawdle. Dawdling isn’t always an indicator of frustration, because sometimes a stimulus-driven approach is exactly what we want. For example, dawdling might be expected when a user browses a feed with no immediate goal except to find something interesting to click on.

Dawdling happens when we shift from a goal-oriented approach to a stimulus-driven method. Movements are slower and travel a longer path.

To learn how mouse movements change when we’re frustrated, researchers Hibbeln, Schneider, Jenkins, and Valacich ran an experiment. They built an e-commerce site and deliberately made it slow in order to frustrate one group of participants. Compared to the control group, the frustrated group’s mouse distance increased by 30% and mouse speed decreased by 17%. The researchers concluded that these metrics could infer negative emotion with more than 80% accuracy.

Dawdle.js — open source code developed to detect frustration

Existing research indicates that cursor dawdling during goal-directed tasks is correlated with negative emotion. To spot dawdling, we need to track changes in mouse distance and speed, looking for a decrease in a user’s relative mouse speed of at least 17% and an increase in their relative distance traveled of at least 30%. By comparing patterns across users, we can build a heat map of potential problem areas and prioritize our research efforts.

I’ve created a GitHub repository with my approach to detecting dawdling with JavaScript.
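The repository is the authoritative version; what follows is only a rough sketch of the idea. It samples mousemove events, computes speed and path length over a sliding window, and compares them against the user’s own running baseline, using the 17% and 30% thresholds from the Hibbeln et al. study. The window size, baseline logic, and `onDawdle` callback name are my assumptions, not Dawdle.js’s actual API.

```javascript
// Rough sketch of dawdle detection (not the repository's actual API).
// Flags windows where the user's speed drops by at least 17% and path
// length grows by at least 30% relative to their own baseline.
function createDawdleDetector({ windowMs = 2000, onDawdle } = {}) {
  let points = [];     // recent cursor samples: { x, y, t }
  let baseline = null; // running baseline: { speed, distance }

  function stats() {
    let distance = 0;
    for (let i = 1; i < points.length; i++) {
      distance += Math.hypot(
        points[i].x - points[i - 1].x,
        points[i].y - points[i - 1].y
      );
    }
    const elapsed = points[points.length - 1].t - points[0].t;
    return { distance, speed: elapsed > 0 ? distance / elapsed : 0 };
  }

  document.addEventListener('mousemove', (e) => {
    const now = performance.now();
    points.push({ x: e.pageX, y: e.pageY, t: now });
    points = points.filter((p) => now - p.t <= windowMs);
    if (points.length < 10) return; // not enough samples yet

    const current = stats();
    if (baseline === null || baseline.speed === 0) {
      baseline = current; // first usable window seeds the baseline
      return;
    }

    // Dawdling: slower AND longer-travelled than this user's baseline.
    const speedDrop = 1 - current.speed / baseline.speed;
    const distanceGain = current.distance / baseline.distance - 1;
    if (speedDrop >= 0.17 && distanceGain >= 0.30) {
      onDawdle?.({ speedDrop, distanceGain, at: now });
    } else {
      // Blend calm windows into the baseline so it tracks the user.
      baseline.speed = 0.9 * baseline.speed + 0.1 * current.speed;
      baseline.distance = 0.9 * baseline.distance + 0.1 * current.distance;
    }
  });
}
```

Wiring it up could be as simple as `createDawdleDetector({ onDawdle: (e) => console.log('possible frustration', e) })`, with the events forwarded to whatever analytics pipeline feeds the heat map.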

Using active listening techniques to alleviate frustration

We’ve all used active listening to help ease another person’s negative emotional state. Some researchers believe active listening could be a way for computers to undo the frustration they cause their users. Triggering a message right as a user is feeling frustrated could be a practical way to alleviate negative emotion and prevent them from leaving. To illustrate, here’s a story about an MIT student named ‘Bob’.

Bob uses MATLAB to complete complex mathematical calculations.

Its command-line interface was hard to use, and in a bout of frustration, Bob keyed in a four-letter expletive that began with the letter ‘F’ and ended with ‘K’.

He hit enter.

“Your place or mine?” responded the terminal.

Bob laughed; his mood immediately brightened.

Researchers Klein, Moon, and Picard explored the idea of creating interfaces that employ active listening. They created an intentionally difficult web-based game and put out a call for play-testers. Participants played several rounds of the game and were then asked a few questions about their experience. They were then directed back to the game, now with an option to quit and end the play-test session. The researchers examined whether the contents of the questionnaire changed how long participants kept playing after being allowed to quit.

Rendition of the questionnaire employing active listening techniques, from the Klein, Moon, and Picard study

For one of the groups, the researchers implemented active listening techniques: the form would process a participant’s input and summarize it back to them. The result? Participants who were given the active listening questionnaire played longer than those who were not. This finding is supported by the work of other researchers, like Jeng-Yi Tzeng, whose data shows that slightly apologetic responses go a long way toward creating more desirable psychological experiences for users.
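For illustration, the summarize-back behavior might be approximated with a form handler like the one below. The element IDs, the 1–7 scale, and the reflective copy are my assumptions, not the study’s actual implementation.

```javascript
// Hypothetical active-listening handler: reflect a participant's own
// answers back in a short, mildly apologetic summary. The IDs, the
// 1-7 scale, and the wording are assumptions for illustration.
const form = document.querySelector('#feedback-form');

form.addEventListener('submit', (event) => {
  event.preventDefault();
  const level = form.elements['frustration-level'].value;
  const details = form.elements['details'].value.trim();

  // Summarize the input back to the participant before continuing.
  document.querySelector('#reply').textContent =
    `It sounds like that round was about ${level} out of 7 on the ` +
    `frustration scale, and that "${details}" got in your way. ` +
    `Sorry about that, and thanks for telling us.`;
});
```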

What would user interfaces capable of active listening even look like? Emotion researchers already use software tools to gather data on emotional states. The Self-Assessment Manikin is a scale of pictographs illustrating categories of emotion. The Product Emotion Measurement Instrument (PrEmo) was developed as a self-assessment tool to measure emotions elicited by product design. Instruments like these are designed to capture how users are feeling so that an interface can respond accordingly.
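As a sketch of what a lightweight, SAM-style prompt could look like in a web interface: the emoji below are stand-ins for the instrument’s validated pictographs, and the scores and container ID are assumptions for illustration.

```javascript
// Hypothetical SAM-style valence scale rendered as buttons. Emoji
// stand in for validated pictographs; the 1-9 scores and container
// ID are assumptions for illustration.
const VALENCE = [
  { score: 1, glyph: '😠', label: 'very unhappy' },
  { score: 3, glyph: '🙁', label: 'unhappy' },
  { score: 5, glyph: '😐', label: 'neutral' },
  { score: 7, glyph: '🙂', label: 'happy' },
  { score: 9, glyph: '😄', label: 'very happy' },
];

function renderValenceScale(container, onSelect) {
  for (const { score, glyph, label } of VALENCE) {
    const button = document.createElement('button');
    button.textContent = glyph;
    button.setAttribute('aria-label', label);
    button.addEventListener('click', () => onSelect({ score, label }));
    container.appendChild(button);
  }
}

// Usage: record the self-report alongside any detected dawdle events.
renderValenceScale(document.querySelector('#emotion-scale'), (rating) => {
  console.log('self-reported valence:', rating);
});
```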

How might we design experiences that sense and respond to emotion? Perhaps we can alleviate negative emotion simply by acknowledging when the experience has failed to meet a user’s expectations. Affective Computing has the potential to automate the most tedious parts of user research, while also undoing the stress created when we inevitably fail our users.
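Tying the two threads together, one could imagine the dawdle detector sketched earlier triggering a single, brief acknowledgment. The message copy and the once-per-session rule below are assumptions.

```javascript
// Hypothetical: respond to detected dawdling with one brief
// acknowledgment rather than repeated interruptions. Reuses the
// sketched createDawdleDetector from above.
let acknowledged = false;

createDawdleDetector({
  onDawdle: () => {
    if (acknowledged) return; // interrupt at most once per session
    acknowledged = true;

    const note = document.createElement('div');
    note.setAttribute('role', 'status'); // announced politely by screen readers
    note.textContent =
      'This page seems to be giving you trouble. Sorry about that. ' +
      'Is there anything we can help with?';
    document.body.appendChild(note);
  },
});
```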


