“And the next one is a game project,” the middle-aged lecturer said, opening a new slide on the screen. The slide showed an overview of a usability test for an online game. I snapped out of my daydreaming, as I am a gamer and interested in game development. But the next thing the lecturer said was not about designing a great game.

“We recruited the test participants from avid gamers, and it was painful,” he said, making a face. “You know, working with those nerds. So hard to communicate, speaking gibberish — oh, I don’t want to do that again!”

It may have been meant as a joke, as a few of the students laughed. But I didn’t understand. What was he, a visiting lecturer from some usability company, talking about? Was he trying to ridicule the test participants, who had bothered to come to the usability lab to give valuable feedback? What kind of message was he communicating to the young designers?

I know it is sometimes hard to communicate with people from fields you are not familiar with. Learning jargon takes time and effort. But what if the test participants had been middle-aged managers who wore suits and worked for a big company in an industry that uses lots of technical terms? Would the lecturer still have laughed at them?

Social categorization and othering

We tend to prefer people who are similar to us, and sometimes hate people who are different from us. In social psychology, this process is called othering. Othering happens when we categorize people based on features such as gender or ethnicity, and distinguish between the groups we belong to (ingroups) and those we don’t (outgroups). We see ingroup members positively and outgroup members negatively, because we want to maintain a positive self-image.

Distinguishing ingroup from outgroup is a very common psychological process. Categorization is an essential cognitive skill for human beings, and favoring ingroup members doesn’t mean you are a bad person. Everyone does it. But you must be careful if you are a designer, because such categorization affects your understanding of others. People not only favor ingroup members, but also process information about them more deeply than information about outgroup members. Ingroup members seem more diverse to us, while outgroup members all seem alike (Boldry et al., 2007). We can remember detailed information about individual ingroup members, while remembering less detailed and less positive information about outgroup members (Howard & Rothbart, 1980). In other words, our understanding of outgroup people is often shallow and stereotypical.

Are tools neutral?

People think technologies are neutral. C’mon, your devices and apps are just tools, how can tools be biased? But it is a little naive to see any tool as neutral. Designers project their views of the world into the design of things, whether intentionally or not. For example, Kat Ely describes how everyday things are designed by men and for men, marginalizing women and minorities. Seat belts, medicine, work environments, power tools — they have all traditionally been created by men and for men, making them harder for other people to use. This happens because of the imbalance in the technology and design industry, Ely points out, as most tech and design positions are still held by men. Gender is a basic social category and, if you are a man, you tend to understand men better than people of other genders.

Designers have traditionally used the so-called I-methodology, projecting their own preferences and skills onto the target users. This can easily reproduce the designers’ own biases. Human-centered design was born as an alternative to this methodology; it incorporates people’s perspectives instead of the designers’, putting “human” needs at the center of the design.

But even human-centered design can be biased unless it is used with care and knowledge. First, you need to choose the sample population for user research and user testing carefully. Even if you recruit appropriate and diverse participants, you could misunderstand the needs of people who are different from you. In processes such as interviews and focus groups, you might fail to let them speak up. You might see your own projection instead of the real people. And in the persona method, which is very popular among designers, you might lose meaningful details of the user research findings and fall into stereotyping. Actually, personas are meant to be stereotypical; Alan Cooper explicitly stated that “stereotypical personas are more effective,” and emphasized the importance of “shooting for believability, not diversity.”

How technology gets biased

Researchers have begun to explore issues of social bias in interaction design. These are often discussed in the context of gender, as it is one of the most salient biases found in the technology field. The gender-based digital divide seems to persist, even among young people with higher education (I recently conducted a survey with university students, which showed surprisingly stereotypical gender differences). Besides, looking at gender issues is a good starting point for reconsidering traditional design practice, as the philosopher Martha Nussbaum says in her discussion of social justice:

Looking at women’s lives helps us see the inadequacy of traditional approaches; and the urgency of women’s problems gives us a very strong motivation to prefer a non-traditional approach. — Women’s Capabilities and Social Justice

Breslin and Wadhwa introduce four main “gendering” problems in human-computer interaction.

  • Disparate numbers: most tech positions are filled by white or Asian men.
  • The I-methodology: designers tend to reproduce the norms of the groups they belong to.
  • Design stereotypes: designers tend to rely on stereotypes when they design for people who belong to groups other than their own.
  • Focusing on difference: researchers reinforce traditional binary categorization by focusing on differences between groups.

These four factors are intertwined with each other, encoding specific norms and values into the technology of our everyday life. Sometimes this is visible (e.g. the stereotypical pink colors of apps for girls), but often it is embedded deeply enough that it is not easily noticed. For example, Oudshoorn et al. (2004) revealed how the designers of the Digital City Amsterdam created products that were only usable for people with a specific learning style, one found more often in men than in women. Such design issues often remain unnoticed, steering people with different characteristics away from using the products and thus depriving them of opportunities they deserve (which, in turn, might reinforce stereotypes against those people and create more disparity).

How can we reduce bias in design?

The following are a few suggestions for reducing bias and stereotypes in design. (For more details, see Breslin & Wadhwa, 2018 and Bath, 2009.)

1. User involvement

Communicating with diverse people helps prevent unintended biases and stereotypes. The ISO guidelines for human-centered design recommend involving users throughout the design and development process. A design project should start with a thorough requirements analysis of the intended users, using direct methods such as interviews, observation and focus groups. User testing should also be conducted with diverse users if your product is targeted at a diverse population. But as we have discussed, user involvement does not automatically guarantee bias-free design.

2. Recognizing scripts and metaphors

When we design things, we use certain scripts, explicitly or implicitly. An obvious example is when we use personas and user scenarios: we create imaginary characters and write scripts about what they think, do and feel. Such scripts can reflect our biases and wrong assumptions, even when they are based on user research.

Narrative Transformation and Mind Scripting have been proposed as tools for designers to notice and understand their own values and assumptions in design activities. Critical Technical Practice can also help make the hidden metaphors of everyday design practice visible. In addition, Value Sensitive Design has been proposed as a way to actively inscribe desirable values into technology and products.

3. Designing accountably

Design cannot include everything. Some user needs are always cut out during the design process. In a complex digital society, design success can be evaluated only within the specific situations for which the product is designed. What designers can do is be aware of their design decisions and be accountable for the “cuts” they made in the design process. Did your design decision exclude a certain group of people? Would it harm or hinder particular groups? Can you justify your decisions?

Design is never neutral

Human-centered design took power back from systems and gave it to humans. Most designers today recognize the importance of human needs as well as system requirements. The next step is re-examining the scope and assumptions of the “human” in human-centered approaches, so that the benefits of technology are distributed to a broader population.

The wants and needs of young, healthy, middle-class people with connections and a reasonable amount of spare cash are overrepresented among Start-up City’s priorities…Structural social injustice and systemic racism are harder to tackle — and that’s where the tech sector has, until recently, thrown up its hands. (Laurie Penny on a tale of two cities)

Design is never neutral. Each of your design decisions enables or restricts the ways in which people interact with technology and information. It shapes people’s performances in everyday life. It creates meanings, reinforcing or altering existing norms about who we are and how we live.


