YouTube has long kept a close watch on abusive comments on its platform and has been making continued efforts to curtail such behaviour.
Earlier, the platform began sending a nudge alert to individuals when they attempted to post hateful or toxic comments.
Now, YouTube has taken this a step further with a new feature that more aggressively warns users about their abusive comments.
YouTube, the Google-owned streaming service, said it will send a notification alert to people whose abusive comments have been removed from the platform.
If a user continues to post abusive comments despite receiving the notification, the platform will block them from posting any more comments for the next 24 hours. In a statement, YouTube confirmed that the feature has been successful.
At present, hate speech detection is available only for English, but the company is planning to add more languages so such comments become easier to detect.
YouTube also said it has been working to improve its AI-powered detection systems. In the first half of 2022, the platform took down 1.1 billion “spam comments”.