YouTube has released a new feature aimed at curbing hostile commenters: before an offensive remark is published, a prompt asks the commenter to reconsider and edit it. The plan is outlined in a blog post published today.
In addition, YouTube is adding a filter for comments held for review, so creators can hide, approve, or report them without being forced to read through offensive or threatening content.
We’re releasing a new feature that lets commenters rethink potentially offensive comments before posting, and a filter that allows creators to review negative comments that are automatically held for review.
— Ryan Wyatt (@Fwiz) December 3, 2020
Users trying to post offensive content will see a pop-up before their comment is published, suggesting they pause and edit it. They can still submit by pressing “Post Anyway,” but the idea is to give them a few seconds to reconsider. This is consistent with recent steps taken by other social media platforms, though it’s unclear how effective such a mild deterrent will be.
English-speaking Android users will see the new prompt starting today. It is unknown when, or whether, the feature will roll out to other platforms and languages.
YouTube says it has stepped up its internal efforts against hate speech, increasing the number of daily hate speech comment removals by a factor of 46 since early 2019. YouTube also terminated 1.8 million channels in the last quarter, more than 54,000 of them for hate speech.
The company will also release a new survey in 2021 asking creators to voluntarily share information about their race, ethnicity, sexual orientation, and gender. The data is meant to reveal how content from different communities is treated by YouTube’s internal systems.