Photo by Christin Hume on Unsplash

What are instances where users might need specialized control over their visibility?

My favorite example: on OKCupid, you have the option to select “I don’t want to see or be seen by straight people.” At first glance, this might seem strange. Why would someone specifically want this option?

Giving users this choice cuts down on a few different problems. For one, users looking to connect romantically can skip a load of matches that aren’t right for them, and vice versa.

Outside of that context, allowing users to customize their experience can create community. Perhaps users would like to connect with folks who have similar experiences of gender and/or sexuality. The internet makes these connections accessible to people who might not have local groups to attend, or who aren’t out to their families. The more control you give users in these instances, the more freedom they gain in very tangible ways — someone who can’t be out as queer in Mozambique can connect with a friend in Ohio.
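To make the visibility option concrete, here is a minimal sketch of a two-way visibility check. Everything in it (the User fields, the orientation labels, the function names) is my own illustration, not OKCupid’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    # Illustrative fields only; not any real platform's schema.
    name: str
    orientation: str
    hidden_from: set = field(default_factory=set)  # orientations this user opted out of

def mutually_visible(a: User, b: User) -> bool:
    # "See or be seen" is symmetric: either user's opt-out hides both directions.
    return b.orientation not in a.hidden_from and a.orientation not in b.hidden_from

def visible_matches(viewer: User, candidates: list) -> list:
    # Only surface candidates that pass the two-way visibility check.
    return [c for c in candidates if mutually_visible(viewer, c)]

alex = User("alex", "queer", hidden_from={"straight"})
sam = User("sam", "straight")
kit = User("kit", "bi")

print([u.name for u in visible_matches(alex, [sam, kit])])  # ['kit']
print([u.name for u in visible_matches(sam, [alex, kit])])  # ['kit']; alex is hidden from sam too
```

The key design choice is that the check is symmetric: opting out hides both directions, so neither user surfaces in the other’s matches.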

“A meshwork of green ropes against a red background” by Clint Adair on Unsplash

Systems like these are also a first step toward creating a moderation system that is sensitive to marginalized users.

What makes a good report system?

A good report system is user-first, conscious of structural bias, and has clear, actionable ways to protect marginalized people. At a previous job of mine, a code of conduct was posted near the front of the store. The code was written for employees, but it also gave customers a clear guide to what to expect in the space.

Photo by Maria Freyenbacher on Unsplash

How would someone create a solid report system?

Allow users to contest bans — this acts as a built-in safeguard against reports filed in error, or for reasons that fall outside the guidelines of what’s appropriate to report. It also shows users that the site’s administrators are attentive to their users — both those who file reports and those on the receiving end of them.
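As a rough sketch of how contestable bans might be modeled (the statuses and function names below are hypothetical, not any real platform’s API):

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class BanStatus(Enum):
    ACTIVE = auto()
    UNDER_APPEAL = auto()
    UPHELD = auto()
    OVERTURNED = auto()

@dataclass
class Ban:
    user_id: str
    reason: str
    status: BanStatus = BanStatus.ACTIVE
    appeal_statement: Optional[str] = None

def file_appeal(ban: Ban, statement: str) -> None:
    # Any active ban can be contested; the appeal is queued for human review.
    if ban.status is not BanStatus.ACTIVE:
        raise ValueError("only active bans can be appealed")
    ban.status = BanStatus.UNDER_APPEAL
    ban.appeal_statement = statement

def resolve_appeal(ban: Ban, upheld: bool) -> None:
    # A moderator (ideally not the one who issued the ban) makes the final call.
    if ban.status is not BanStatus.UNDER_APPEAL:
        raise ValueError("no appeal pending")
    ban.status = BanStatus.UPHELD if upheld else BanStatus.OVERTURNED

ban = Ban(user_id="user-123", reason="reported for harassment")
file_appeal(ban, "The screenshots were taken out of context.")
resolve_appeal(ban, upheld=False)
print(ban.status)  # BanStatus.OVERTURNED
```

Keeping appeals as an explicit state, rather than quietly deleting or keeping the ban, also gives administrators a record of how often reports are filed in error.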

Have clear guidelines about what constitutes a bannable offense, and communicate them to users multiple times. This is important for two reasons:

  • Users need to know what they should report. This is a way of establishing norms in your online community — if the community is explicit about not allowing transphobia, homophobia, misogyny, etc., it’s clear to members what’s expected of them. It should be clear to marginalized users that their concerns are part of how reports are handled.
  • Users need to know what shouldn’t be reported. If you have an opinion about a user’s gender presentation on Tinder, that opinion should be kept to yourself! (One way to encode both sides of these guidelines is sketched after this list.)
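One lightweight way to make those guidelines concrete is to encode them directly in the report flow, so the form itself only offers legitimate categories. The categories and wording below are purely illustrative:

```python
# Hypothetical guideline categories, shown to users at signup
# and again inside the report flow itself.
REPORTABLE = {
    "transphobia": "Misgendering, deadnaming, or anti-trans harassment.",
    "homophobia": "Slurs or harassment targeting sexual orientation.",
    "misogyny": "Demeaning or harassing content targeting women.",
}

NOT_REPORTABLE = {
    "gender_presentation": "Disliking how someone presents themselves is not a violation.",
}

def report_form_choices() -> list:
    # The report form offers only valid categories, steering users
    # away from reports that fall outside the guidelines.
    return sorted(REPORTABLE)

print(report_form_choices())  # ['homophobia', 'misogyny', 'transphobia']
```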
Have multiple people review evidence — a moderator can and will have internal biases as a result of being human. Having multiple people decide what’s fair and what should happen in a given situation makes for a better system. One person might look at a message a woman-identified user received and see no issue with it, while another sees clear misogyny.
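Here is a minimal sketch of that kind of panel review: a strict-majority vote across several reviewers, with the threshold and labels being my own assumptions:

```python
from collections import Counter

def panel_verdict(votes: dict, min_reviewers: int = 3) -> str:
    # `votes` maps reviewer id -> "violation" or "no_violation".
    if len(votes) < min_reviewers:
        return "needs_more_reviewers"
    tally = Counter(votes.values())
    verdict, count = tally.most_common(1)[0]
    # Require a strict majority so a single reviewer's blind spot can't decide alone.
    return verdict if count > len(votes) / 2 else "escalate"

# One reviewer misses the misogyny in a reported message; two others catch it.
print(panel_verdict({"mod_a": "no_violation", "mod_b": "violation", "mod_c": "violation"}))
# -> 'violation'
```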


