YouTube has introduced a feature that warns users when their comments may be offensive.
To encourage respectful conversations, YouTube is launching a new feature that will warn users when their comment may be offensive to others, giving them the option to reflect before posting.
From the reminder, the commenter can move forward with posting the comment as is, or take a few extra moments to edit the comment before posting it.
Creators will also get new tools to manage comments on their channels.
The notification will appear before a comment is posted if YouTube’s AI-based systems deem it potentially offensive.
Johanna Wright, Vice President of Product Management at YouTube, said that in order to help creators better manage comments and connect with their audience, the company will test a new filter on YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review.
Johanna Wright also said: “So that creators don’t ever need to read them if they don’t want to. We’ll also be streamlining the comment moderation tools to make this process even easier for creators.”
At the beginning of the year, YouTube will ask creators, on a voluntary basis, to provide YouTube with their gender, sexual orientation, race and ethnicity.
“We’ll then look closely at how content from different communities is treated in our search and discovery and monetisation systems. We’ll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others.”
YouTube revealed that since early 2019, it has increased the number of daily hate speech comment removals by 46 times.
“In the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech,” the company added.
This is a welcome step in the right direction, considering how harsh the online community can be and how that hostility can discourage people from creating meaningful content.