Nextdoor's neighborhood networks became microcosms of national tensions. Then the pandemic hit, followed by George Floyd's murder. Political divisions, racial profiling and criticism of its content moderation practices seeped in.

For example, last summer the social network came under fire when some moderators removed Black Lives Matter posts and allowed racist comments to remain on the site. Nextdoor pulled its controversial "Forward to Police" feature, which let users send posts directly to local police, amid concerns that the tool aided racial profiling. The site also pledged to recruit more Black content moderators and provide bias training to all moderators.

There are now 120,000 community reviewers, who volunteer to monitor their neighborhood's discussions. The platform also immediately removes certain phrases, such as "white lives matter," that users have flagged as offensive.

Nextdoor CEO Sarah Friar says the social network is fundamentally different from Facebook and others because it laid out clear conversation guidelines and has had content moderators from the beginning. "Our army already exists, and it scales as the platform scales," she told Axios. "We don't have to go build an army of people who have no context for the conversation."

Friar said she's also making "tough choices" when it comes to increasing engagement on the platform: "We've made a long-term decision that we'd rather not have the spikes on engagement from toxic-type content, so we're going to moderate that, because we think in the long run it creates a much healthier platform that you don't have to go back and fix as we scale."