Community app Nextdoor launched its latest feature in a series of anti-racist initiatives, this time prompting users to think before they post "offensive or hurtful" language in their neighborhood forums. The new "anti-racism notification" specifically alerts users if they are trying to post discriminatory or overtly racist language, including posts with the phrases "All Lives Matter" and "Blue Lives Matter."
Nextdoor, founded in 2008, is a community networking app that connects users with others living in their area to share news and resources and to engage with one another in community forums. Though designed to encourage discussion, the app has drawn ample criticism in past years for failing to protect its users from offensive language, prompting the company to make large-scale commitments to creating safer, kinder forums.
In 2019, Nextdoor unveiled the Kindness Reminder — a feature that prompted commenters to review Community Guidelines and edit their posts before they went live if the posts were found to contain phrases frequently reported by other users in the past.
Earlier this year, the company committed to supporting the Black Lives Matter movement and its Black users through updated Community Guidelines, as well as an anti-racism resources hub. Nextdoor also pledged to circulate resources with community leaders on the app to facilitate more inclusive dialogue.
The new "anti-racism notification" is an expansion of the Kindness Reminder technology and the app's anti-racism initiatives. The company explained in an email to Mashable that the tool was created with oversight from "activists, academics, and experts to help understand how to combat incivility in neighborhood conversations."
If a user's comment is flagged for offensive language, a pop-up notification pauses the post and prompts the author to reconsider; the user can edit the comment immediately or continue posting if they feel it doesn't violate community guidelines. Alongside the notification's release, the app also published a series of blog posts explaining how to talk to your neighbors about race and expanded its anti-racism hub with more information on unconscious bias, white privilege, and other forms of implicit racism, as well as guidance on engaging in conversations about the Derek Chauvin trial.
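Nextdoor hasn't published how its detection works, but the basic flow it describes (scan a draft for flagged language, pause the post, and let the author edit or proceed) can be illustrated with a minimal sketch. The phrase list, function names, and simple keyword matching below are hypothetical stand-ins, not Nextdoor's actual system, which the company describes as AI-driven.

```python
# Hypothetical sketch of the "pause before posting" flow described above.
# The phrase list and keyword matching are illustrative only; Nextdoor's
# real detection is AI-driven and has not been made public.

FLAGGED_PHRASES = {
    "all lives matter",
    "blue lives matter",
}

def contains_flagged_language(draft: str) -> bool:
    """Return True if the draft contains a phrase on the flag list."""
    text = draft.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

def submit_post(draft: str) -> str:
    """Pause a flagged draft and let the author edit it or post anyway."""
    while contains_flagged_language(draft):
        choice = input("This post may be offensive or hurtful. Edit it, or post as is? [edit/post] ")
        if choice.strip().lower() != "edit":
            break  # the author can still post if they feel no guideline is violated
        draft = input("Edited post: ")
    return draft  # the (possibly edited) post goes live
```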
According to Nextdoor, the Kindness Reminder led to a "30% reduction in uncivil content" posted on the app after its launch in 2019. The company hopes the new anti-racism tool will produce a similar, steady decline in racist and discriminatory posts.
But, fundamentally, the new feature is a test of Nextdoor's commitment to changing its long, problematic history of enabling spaces for "casual racial profiling" and other forms of covert racism, a history that includes collaborating with local police departments on crime reporting despite user concerns. In 2020, Nextdoor's CEO said racism was no longer tolerated on the app, but users and critics alike contended the company wasn't living up to its big commitments. Since then, the company has continued expanding on its pledge to weed out racism in its forums.
The anti-racism notification attempts to address the issue of underreported (and then frequently unaddressed) racism by preventing such comments in the first place. And it could be a step in the right direction. Or it could be a continuation of the trend users saw last year — commitments that didn't fundamentally change the way the app is used or address the ways racism is perpetuated on a systemic level.
With AI providing only a prompt to discourage posting, there's concern that posts will be missed, that offending users will simply ignore the prompts and post anyway, or that they'll just take their racism elsewhere. And, as in previous years, community leaders will be a determining factor in how these features effect change. Known as "Neighborhood Leads and Community Reviewers" in the app, these users facilitate conversations and respond to posts far more quickly than any AI. As Nextdoor explains, they ensure neighborhood guidelines are being followed: Leads welcome new members, moderate conversations, vote to remove comments, and promote other users to leadership roles. Only Nextdoor itself can remove offending members, however.
Ahead of the notification's release, Nextdoor launched new trainings for Leads in collaboration with the consulting group The New Quo. The training focuses on "inclusive moderation" strategies and includes implicit bias training. Initiatives like this get closer to the heart of the issue: community accountability led by users and supported, at every step, by the company.
Only time will tell whether these prompts truly discourage offensive posts.