Hate Videos: YouTube Has Done Almost Nothing to Prevent Hate Speech from Showing Up
Despite a claim in 2017, after a massive exodus of advertisers, that it would do more to crack down on hate speech on its platform, YouTube is still the main source of hate videos worldwide, and it has done almost nothing to actually prevent them from showing up on its site.
Videos of the shooting in New Zealand kept showing up yesterday and today, even though a simple keyword filter (e.g., “Mosque Shooting”) added to their systems would have caught them.
In fact, their own policy is not to remove most of them but, if they are found, to put them in a “digital purgatory” where viewers have to “agree” before seeing the often racist and violent videos.
Their claim is that those videos aren’t “monetized,” but many of them still drive viewers, and thus revenue, to other videos that are. Why not remove them completely?
In fact, well-known hate peddlers, including the BraveHeart Show, are still being shown despite articles dating back to 2010 calling out this little hate channel, whose pastor is known to burn Korans.
I’ve personally brought up this one channel to YouTube, over and over, as an example of its inability to police its own content, and over and over they’ve told me they would look into it. It’s almost as if they are keeping it up to “make a point” that they are okay with hate speech and will just “take care of it” in their own way.
YouTube has created algorithms for finding stolen music and videos, and has even given music companies special access to help flag and take them down. YouTube relies on good relationships with the companies that own TV shows, movies, and music because those companies SUED it years ago to force it to cooperate.
So the site spent a lot of time and money developing its Content ID system. YouTube now lets anyone register their content with Content ID and hunt down infringers, yet it does absolutely nothing similar with hate speech.
Why is that?
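For context, the matching idea at the core of a Content ID-style system is straightforward to sketch. The toy Python below is illustrative only, not YouTube’s actual code: the real system uses perceptual audio and video fingerprints robust to re-encoding, whereas this sketch uses exact hashes of fixed-size chunks, and every name in it is hypothetical.

```python
# Toy sketch of the matching idea behind a Content ID-style system.
# Real fingerprinting is perceptual and survives re-encoding; this
# illustrative version hashes fixed-size chunks of the raw stream.

import hashlib

CHUNK = 4096  # bytes per fingerprint window (arbitrary for the sketch)

def fingerprints(data: bytes) -> set[str]:
    """Hash each fixed-size chunk of a media stream."""
    return {
        hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        for i in range(0, len(data), CHUNK)
    }

# Rights holders register fingerprints of their catalog...
registry: dict[str, set[str]] = {}

def register(owner: str, media: bytes) -> None:
    registry.setdefault(owner, set()).update(fingerprints(media))

# ...and every new upload is checked against the registry.
def find_matches(upload: bytes, threshold: float = 0.5) -> list[str]:
    """Return owners whose registered content overlaps the upload."""
    prints = fingerprints(upload)
    return [
        owner for owner, known in registry.items()
        if prints and len(prints & known) / len(prints) >= threshold
    ]
```

The point is that the machinery for registering material and scanning every upload against it already exists; only the will to point it at hate content is missing.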
I understand that some 600,000 hours of new video are added to YouTube each day. However, creating a system in which these videos can be flagged by experts, paired with an algorithm that looks for keywords (like “Mosque Shooting”), isn’t actually that hard, as the sketch below shows.
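To illustrate how modest that ask is, here is a minimal sketch in Python of such a keyword screen. None of these names come from any real YouTube API; a production system would also need multilingual matching, context awareness (news coverage of the same event is not a hate video), and human review.

```python
# Minimal sketch of an expert-curated keyword screen over new uploads.
# All names here (Video, BLOCKLIST, screen_upload) are hypothetical.

from dataclasses import dataclass, field

# Phrase list curated by human experts; "mosque shooting" is the
# example used in this article.
BLOCKLIST = {"mosque shooting"}

@dataclass
class Video:
    video_id: str
    title: str
    description: str = ""
    tags: list[str] = field(default_factory=list)

def matches_blocklist(video: Video) -> bool:
    """True if any blocklisted phrase appears in the video's metadata."""
    haystack = " ".join([video.title, video.description, *video.tags]).lower()
    return any(phrase in haystack for phrase in BLOCKLIST)

def screen_upload(video: Video) -> None:
    """Hold a matching upload for expert review instead of publishing."""
    if matches_blocklist(video):
        print(f"flagged for review: {video.video_id}")  # stand-in for a review queue

screen_upload(Video("abc123", "Mosque Shooting full video"))
```

A filter this crude would obviously over- and under-match on its own; the argument is simply that routing keyword hits to human experts is trivial engineering for a company of YouTube’s scale.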
In fact, “The problem cannot be solved by humans and it shouldn’t be solved by humans,” Google’s chief business officer Philipp Schindler told Bloomberg News. They seem to agree.
YouTube has also been the top source for misinformation about mass shootings.
The Guardian, for example, found that survivors and relatives of victims of numerous shootings have been subjected to a wide range of online abuse and threats, some tied to popular conspiracy theories featured prominently on YouTube.
As I mentioned on Twitter, quite a few YouTube employees read my newsletters and articles. They are aware of these issues, and I’ve personally spoken and written to them about it for years. They only decided to take action in 2017 after we shamed them, and they seem to be making the same policy decision now.
I've had meetings with @YouTube for over a decade about putting filters up for hate videos. Hundreds of executives read my newsletters. They don't care. They've developed technology for the movie and music companies to detect stolen tracks. Cc @RedTapeChron @Digiday
— Pesach 'Pace' Lattin (@pacelattin) March 15, 2019
Part of the reason is people like Candace Owens pushing the theory that removing anti-Jewish and anti-Islam videos is an attack on “conservative speech.” YouTube has started to treat this far-right ideology as “mainstream” because of pushback from talking heads who insist everyone has a right to spew their hatred.
YouTube and Google have a responsibility to their advertisers to make sure this doesn’t continue, even though it likely will.
They’ve shown a lack of proactivity and have only ever adjusted course when caught lying or otherwise exposed.