As popular online platforms grow, facilitating more communication between strangers, they have been at the center of conversations about racial bias and efforts to confront it. In November, Twitter updated its hateful conduct policy and gave users more tools to report and limit their exposure to tweets they deem objectionable. And after a study conducted by Harvard Business School researchers found that guests with distinctly African American names are 16 percent less likely to be accepted for Airbnb bookings, the short-term lodging company created a nondiscrimination policy.
Nextdoor, a social network that connects neighborhood residents, credits the research of Partnership member Jennifer Eberhardt with helping the company redesign a section of its site to reduce reports that some users and observers saw as racial profiling, posts in which suspects were identified only by their race.
In an interview with the online magazine Backchannel, Nextdoor CEO Nirav Tolia said Eberhardt’s work taught him about the importance of “decision points,” key moments when a system or process can help someone make a more informed, less biased choice. Nextdoor redesigned the crime and safety section of the site so that before users can mention race, they must first describe details such as the person’s hair and clothing and the characteristics of any vehicles involved. The new flow primes users to think concretely about what happened and who was involved rather than leading with race.
The number of posts to that section of the site fell by 25 percent. Some of the drop may reflect users deterred by the extra steps, but Tolia says he expects the overall quality and relevance of the reports to increase under the new format.
You can read about Nextdoor’s efforts here, and find a profile of Eberhardt and her work on Stanford Magazine’s website.