Inman

How Nextdoor reduced racist posts by 75%

lucky-business / Shutterstock.com

For many of us, social media is an integral part of our lives — it’s how we connect, share ideas, gather the information that helps us to make everyday decisions and sometimes escape from the woes of life with a cat video.

On the other hand, social media has also become an outlet for racists, bigots, bullies and the like to target people of color, members of the LGBTQ community and those who fall outside of the American “mainstream.”

Activists have called for Facebook and Twitter to take swifter action against discriminatory posts and tweets, but both platforms have struggled with how to confront and control these narratives. Some users assert that any sort of censorship violates their freedom of speech.

One way the two social media giants might address the issue is to follow the lead of Nextdoor, a hyperlocal social network that has decided to tackle discrimination and racial profiling head-on with a new system that makes users think before they post.

A pattern of racial profiling emerges

Last year, Fusion writer Pendarvis Harshaw called attention to Nextdoor’s burgeoning racial profiling problem with his article “Nextdoor, the social network for neighbors, is becoming a home for racial profiling.”

Shaw recounted the story of Nextdoor user Meredith Alhberg, who was hosting a party in her East Oakland Ivy Hill neighborhood.

As the partygoers made their way to Alhberg’s home, she noticed a flood of notifications from her Nextdoor mobile app that warned of “sketchy men,” one of whom was described as a “thin, youngish African American guy wearing a black beanie, white t-shirt with dark opened button down shirt over it, dark pants, tan shoes, gold chain.”

The Nextdoor post about Alhberg’s friends. (Photo credit: Fusion)

Alhberg immediately realized the “sketchy men” were friends of hers who simply had a hard time finding her home. She responded to the worried neighbor and told them to stand down — the men were her friends, and they had gotten lost.

Harshaw signed up for Nextdoor and combed through the posts for the North Oakland neighborhood where he grew up. He found more of the same.

“It’s a racially-mixed community of retired black people, younger white artists, and quite a few teachers — not a perfectly safe neighborhood, exactly, but another rapidly gentrifying Oakland enclave,” Harshaw wrote.

“The posters there also seemed to see skin color as a reason for suspicion.”

Nextdoor’s former approach

At the time of the article, Nextdoor’s member guidelines asked users to “refrain from using profanity or posting messages that will be perceived as discriminatory.”

Fellow users could flag a comment as inappropriate, and a moderator would review it. If the comment was found to be inappropriate, the user could have their account suspended.

Nextdoor seemed to hope that requiring members to use their real names and addresses — a requirement that still stands — would deter discriminatory behavior. Furthermore, the platform said that it stopped “short of proactively censoring discussions to avoid diluting the authenticity of its communities.”

A change of heart

“I’m a person of color, so it really cut deep,” Nextdoor CEO Nirav Tolia said in a follow-up Fusion article.

“We hated the idea that something we built would be viewed as racist … I hadn’t seen it in my own neighborhood’s Nextdoor and so didn’t realize it was an issue for us.

“Once I got past that, I was powered by the challenge to do something about it.”

On August 24, Tolia wrote a blog post on Nextdoor’s site that announced a new approach to discrimination and racial profiling on the site.


“Racism is one of the worst problems facing society today,” he wrote.

“As Nextdoor has become one of the places where neighbors talk about how to make their local communities better, it is natural for the issue of race to be discussed and debated. But it’s not acceptable when mentions of race take the form of racial profiling.”

From there, Tolia outlined the platform’s updated Crime and Safety guidelines, including:

  • Racial profiling tag on posts
  • Updates to member guidelines
  • Mandatory warning screen before posting in Crime and Safety

Nextdoor’s new mandatory warning.

Of the three updates, the largest change seems to be the mandatory warning that each user sees before making a post.


The mandatory warning asks, “What details can I add that will help distinguish this person from other similar people?” and requires users to fill out specific details before they’re allowed to leave a post in the Crime and Safety area of the site. The required fields are:

  • Hair (Hat, hair, color, style)
  • Top (Shirt, jacket, color, style)
  • Bottom (Pants, skirt, color, style)
  • Shoes (Shoe, brand, color, style)

Users are also able to provide the approximate age, build and race of the person they saw.
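In effect, the warning acts as a gate: a Crime and Safety post describing a person cannot be published until the descriptive fields are filled in. Here is a minimal sketch of that kind of gate — the field names and logic are my own illustrative assumptions, not Nextdoor’s actual implementation:

```python
# Hypothetical sketch of the gating behavior described above: a post that
# describes a person is held back until every required appearance field
# contains some text. Field names are assumptions, not Nextdoor's code.

REQUIRED_FIELDS = ("hair", "top", "bottom", "shoes")

def can_publish(description: dict) -> bool:
    """Return True only if every required appearance field is non-empty."""
    return all(description.get(field, "").strip() for field in REQUIRED_FIELDS)

# A post with the level of detail quoted earlier in the article passes:
detailed = {"hair": "black beanie", "top": "white t-shirt, dark open button-down",
            "bottom": "dark pants", "shoes": "tan shoes"}

# A vague description with no distinguishing detail would be blocked:
vague = {"hair": "", "top": "", "bottom": "", "shoes": ""}

print(can_publish(detailed))  # True
print(can_publish(vague))     # False
```

The point of such a design is less about validation than friction: forcing the poster to articulate specifics creates the “decision points” Tolia describes below.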

How Nextdoor developed these requirements

Tolia notes the mandatory warning was built around the work of Stanford psychologist Jennifer Eberhardt, who studies how race plays into the judicial system and trains police officers to recognize and push past implicit biases.

“We tried to create decision points,” Tolia told Fusion. “To get people to stop and think as they’re observing people to cut down on implicit bias.”

To create and test the requirements, Tolia partnered with community organizations such as Neighbors for Racial Justice and 100 Black Men, police departments from across the country, and representatives from the City of Oakland, including Vice Mayor Annie Campbell Washington, council member Desley Brooks and others.

During the test pilot, which started in April, Nextdoor called upon people within and outside of the company to read posts and rate them on a scale of 1 to 4, with 1 being “not profiling” and 4 being “definitely profiling.”

The reviewers didn’t know who wrote the post or whether the post was published before or after the updated guidelines were in effect.

By the end of the pilot in August, the number of discriminatory or racial profiling posts had dropped by 75 percent.
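For readers who want the arithmetic, the 75 percent figure is a relative reduction: the share of posts judged to be profiling after the change, compared with the share before. The sketch below uses made-up ratings and an assumed cutoff (a rating of 3 or 4 counts as profiling); only the formula, not the data, reflects anything reported in the article:

```python
# Illustrative arithmetic only: how a relative reduction is computed from
# blinded reviewer ratings on the 1-4 scale the article describes.
# The ratings and the >= 3 "profiling" cutoff are made-up assumptions.

def profiling_rate(ratings, threshold=3):
    """Fraction of posts rated at or above the profiling threshold."""
    return sum(1 for r in ratings if r >= threshold) / len(ratings)

before = [4, 3, 1, 4, 2, 1, 3, 2]  # hypothetical pre-change ratings
after = [1, 2, 1, 3, 1, 1, 2, 1]   # hypothetical post-change ratings

reduction = 1 - profiling_rate(after) / profiling_rate(before)
print(f"{reduction:.0%}")  # 75%
```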


Building a better community; creating a trend

Tolia notes that some posts still slip through the cracks, but he’s hoping that the Nextdoor community will stand up and use the racial profiling flag to catch those posts.

Tolia adds that while he doesn’t expect to completely rid the platform of racism, he felt that Nextdoor had to take action.

“We don’t think Nextdoor can stamp out racism,” said Tolia, “but we feel a moral and business obligation to be part of the solution.”

Following in Nextdoor’s footsteps is Airbnb, which announced a racial bias policy to address the complaints of users of color who report being denied lodging due to their race.

After recognizing a pattern of racial discrimination, Airbnb conducted a study. According to the company, the study “generally confirmed public reports that minorities struggle more than others to book a listing.”

By November 1, Airbnb will experiment with reducing the visibility of users’ photos on booking pages and instead promoting “reputation information,” such as reviews of their conduct at previous homes.

Furthermore, the company will introduce a feature that automatically blocks dates on the host’s calendar when the host rejects a potential guest. Airbnb hopes this will make hosts think twice before rejecting a guest based on race.

Also, if a user is discriminated against, Airbnb has promised to find them an alternative place to stay — whether that’s another Airbnb listing or a hotel room.

The company is starting an initiative to hire more employees of color by recruiting at historically black colleges and universities. By diversifying its workforce, Airbnb says it’ll be able to better address claims of discrimination.

Email Marian McPherson