In June, we announced four steps we’re taking to combat terrorist content on YouTube:
We shared our progress across these steps in August and wanted to update you again on where things are today.
Better detection and faster removal
We’ve always used a mix of human flagging and human review, together with technology, to address controversial content on YouTube. In June, we introduced machine learning to flag violent extremist content and escalate it for human review. We continue to get faster here:
Inevitably, both humans and machines make mistakes, and as we have increased the volume of videos for review by our teams, we have made some errors. We know we can get better and we are committed to making sure our teams are taking action on the right content. We are working on ways to educate those who share video meant to document or expose violence on how to add necessary context.
More experts
Outside experts are essential to advising us on our policies and flagging content, providing additional inputs that better train our systems. Our partner NGOs bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.
We have added 35 NGOs to our Trusted Flagger program, which is 70 percent of the way towards our goal. These new partner NGOs represent 20 different countries and include NGOs like the International Center for the Study of Radicalization at King’s College London and The Wahid Institute in Indonesia, which is dedicated to promoting religious freedom and tolerance.
Tougher standards
We started applying tougher treatment to videos that aren’t illegal and don’t violate our Guidelines, but contain controversial religious or supremacist content. These videos remain on YouTube, but they sit behind a warning interstitial, aren’t recommended or monetized, and lack key features including comments, suggested videos, and likes. This is working as intended and helping us strike a balance between upholding free expression, by providing a historical record of content in the public interest, and keeping these videos from being widely spread or recommended to others.
Amplify voices speaking out against hate and extremism
We continue to support programs that counter extremist messages. We are exploring how to expand Jigsaw’s Redirect Method to new languages and search terms. We’re heavily investing in our YouTube Creators for Change program to support Creators who are using YouTube to tackle social issues and promote awareness, tolerance, and empathy. Every month these Creators release exciting and engaging new videos and campaigns to counter hate and social divisiveness:
In addition to this work supporting voices that counter hate and extremism, last month Google.org announced a $5 million innovation fund to counter hate and extremism. This funding will support technology-driven solutions, as well as grassroots efforts like community youth projects that help build communities and promote resistance to radicalization.
Terrorist and violent extremist material should not be spread online. We will continue to invest heavily in fighting the spread of this content, provide updates to governments, and collaborate with other companies through the Global Internet Forum to Counter Terrorism. There remains more to do, so we look forward to continuing to share our progress with you.
The YouTube Team