YouTube intends to take another step toward creating a less toxic environment in the comments sections of its videos.
The company announced today, through its forums, that its Android app will include a filter based on a quick analysis of what you are typing.
Simply put, if the algorithm suspects you are about to post something offensive, it will ask whether you really want to publish what you are writing.
YouTube’s hope is that you think twice
With this measure, the platform hopes to prompt users to examine their conscience and, at least on a case-by-case basis, recognize that what they are about to say may be inappropriate.
Even if it does not produce a permanent, platform-wide change, it at least has the potential to break down negative behavioral patterns.
Reprehensible behavior keeps growing
YouTube says that in the past quarter more than 50,000 channels were banned for hate speech, three times as many as in the same period last year.
However, if users feel no such qualms and decide to publish a malicious comment anyway, nothing will stop them from doing so.