Posts Tagged: commitment

[Opinion] Despite recent progress, Android OEMs still lag behind Apple’s commitment to software updates

There are two major smartphone platforms in 2020, Apple's iOS and Google's Android, and they borrow features from one another in a never-ending arms race. As an Android user, I roll my eyes when Apple invents a feature Android users have enjoyed for years and most of the tech world goes crazy for it. I […]


Nokia 2.3 announced with dual cameras, two-day battery, and a commitment to software updates

HMD Global and Nokia have announced the Nokia 2.3, the latest smartphone in Nokia's line of budget-friendly devices. There's a lot of familiarity with Nokia's other phones, but a few key improvements make this a really compelling mid-range offering at a great price. The Nokia 2.3 features a large 6.2-inch HD+ display, which […]


An update on our commitment to fight violent extremist content online

In June, we announced four steps we’re taking to combat terrorist content on YouTube:

  1. Better detection and faster removal powered by machine learning;
  2. More expert partners to help identify violative content;
  3. Tougher standards for videos that are controversial but do not violate our policies; and
  4. Amplified voices speaking out against hate and extremism.

We shared our progress across these steps in August and wanted to update you again on where things are today.

Better detection and faster removal

We’ve always used a mix of human flagging and human review together with technology to address controversial content on YouTube. In June, we introduced machine learning to flag violent extremist content and escalate it for human review. We continue to get faster here:

  • Over 83 percent of the videos we removed for violent extremism in the last month were taken down before receiving a single human flag, up 8 percentage points since August.
  • Our teams have manually reviewed over a million videos to improve this flagging technology by providing large volumes of training examples.

Inevitably, both humans and machines make mistakes, and as we have increased the volume of videos for review by our teams, we have made some errors. We know we can get better, and we are committed to making sure our teams are taking action on the right content. We are working on ways to educate those who share videos meant to document or expose violence on how to add necessary context.
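To make the flag-and-escalate flow above concrete, here is a minimal, purely illustrative Python sketch. Nothing in it reflects YouTube's actual system: the classifier, the ESCALATION_THRESHOLD value, and all class and function names are our own assumptions. The one property it does mirror from the post is that the machine only queues content for human review; removal decisions stay with reviewers.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical score above which a video is queued for human review;
# a real system would tune this against precision/recall targets.
ESCALATION_THRESHOLD = 0.7

@dataclass
class Video:
    video_id: str
    extremism_score: float  # output of a (hypothetical) ML classifier, in [0, 1]
    human_flags: int = 0    # number of user flags received so far

@dataclass
class ReviewQueue:
    pending: List[Video] = field(default_factory=list)

    def escalate(self, video: Video) -> None:
        """Queue a video for human review; the machine never removes content itself."""
        self.pending.append(video)

def triage(video: Video, queue: ReviewQueue) -> None:
    # Machine-flagged videos reach reviewers before any user flag arrives,
    # which is what the "removed before a single human flag" statistic measures.
    if video.extremism_score >= ESCALATION_THRESHOLD or video.human_flags > 0:
        queue.escalate(video)

queue = ReviewQueue()
triage(Video("abc123", extremism_score=0.91), queue)
print([v.video_id for v in queue.pending])  # -> ['abc123']
```

The manually reviewed videos mentioned above would, in a setup like this, double as labeled training examples that push the classifier's scores closer to reviewer judgments over time.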

More experts

Outside experts are essential to advising us on our policies and flagging content, providing additional inputs that better train our systems. Our partner NGOs bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.

We have added 35 NGOs to our Trusted Flagger program, which puts us 70 percent of the way toward our goal. These new partners represent 20 different countries and include organizations like the International Centre for the Study of Radicalisation at King’s College London and The Wahid Institute in Indonesia, which is dedicated to promoting religious freedom and tolerance.

Tougher standards

We started applying tougher treatment to videos that aren’t illegal and don’t violate our Guidelines, but contain controversial religious or supremacist content. These videos remain on YouTube, but they sit behind a warning interstitial, aren’t recommended or monetized, and don’t have key features including comments, suggested videos, and likes. This is working as intended and helping us strike a balance: upholding free expression by providing a historical record of content in the public interest, while keeping these videos from being widely spread or recommended to others.
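For readers wondering what this "limited state" amounts to in practice, here is one hypothetical way to model it as data. The VideoFeatures type and its field names are our invention for illustration; the post itself only specifies which features are switched off.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VideoFeatures:
    """Per-video distribution and engagement switches (names are illustrative)."""
    behind_interstitial: bool  # warning shown before playback
    recommendable: bool        # eligible for recommendations/suggested videos
    monetizable: bool          # eligible to run ads
    comments_enabled: bool
    likes_enabled: bool

# The limited state described above: the video stays up, but every
# distribution and engagement surface around it is turned off.
LIMITED_STATE = VideoFeatures(
    behind_interstitial=True,
    recommendable=False,
    monetizable=False,
    comments_enabled=False,
    likes_enabled=False,
)
```

Modeling the restrictions as a single immutable state, rather than toggling features one by one, makes it easy to apply the same treatment consistently across every affected video.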

Amplifying voices speaking out against hate and extremism

We continue to support programs that counter extremist messages. We are researching how to expand Jigsaw’s Redirect Method to new languages and search terms. We’re investing heavily in our YouTube Creators for Change program to support Creators who are using YouTube to tackle social issues and promote awareness, tolerance, and empathy. Every month these Creators release engaging new videos and campaigns to counter hate and social divisiveness:

  • In September, three of our fellows, from Australia, the U.K., and the U.S., debuted their videos on the big screen at the Tribeca TV Festival, tackling topics like racism, xenophobia, and the experiences of first-generation immigrants.
  • Local YouTube Creators in Indonesia partnered with the MAARIF Institute and YouTube Creators for Change Ambassador, Cameo Project, to visit ten different cities and train thousands of high school students on promoting tolerance and speaking out against hate speech and extremism.
  • We’re adding two new local Creators for Change chapters, in Israel and Spain, to the network of chapters around the world.

In addition to this work supporting voices speaking out against hate and extremism, last month Google.org announced a $5 million innovation fund to counter hate and extremism. This funding will support technology-driven solutions as well as grassroots efforts like community youth projects that help build communities and promote resistance to radicalization.

Terrorist and violent extremist material should not be spread online. We will continue to invest heavily in fighting the spread of this content, provide updates to governments, and collaborate with other companies through the Global Internet Forum to Counter Terrorism. There remains more to do, and we look forward to continuing to share our progress with you.

The YouTube Team



An update on our commitment to fight terror content online

A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.

We wanted to give you an update on these commitments:

Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:

  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we’ve removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we’ve removed for violent extremism and the rate at which we’ve taken this kind of content down.

We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.

More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.

Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.

Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages (a rough sketch of this mechanic follows below). We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13- to 18-year-olds to help them find a positive sense of belonging online and learn how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.

And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.
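The Redirect Method described above boils down to a mapping from sensitive search terms to curated counter-narrative playlists. A minimal sketch of that mechanic follows; the term list, playlist IDs, and function name are all placeholders rather than real data.

```python
from typing import Optional

# Placeholder mapping of sensitive query terms to curated playlist IDs.
# Real deployments rely on expert-curated keyword lists in many languages.
REDIRECT_PLAYLISTS = {
    "example_sensitive_term": "PL_counter_narrative_01",
    "another_sensitive_term": "PL_counter_narrative_02",
}

def redirect_for_query(query: str) -> Optional[str]:
    """Return a curated playlist ID if the query contains a sensitive term."""
    normalized = query.strip().lower()
    for term, playlist_id in REDIRECT_PLAYLISTS.items():
        if term in normalized:
            return playlist_id
    return None  # no match: show normal search results

print(redirect_for_query("EXAMPLE_SENSITIVE_TERM video"))  # -> PL_counter_narrative_01
```

Expanding the method to "new languages and search terms," as the later post puts it, would mean growing and localizing this mapping rather than changing the mechanism itself.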

Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum to Counter Terrorism, and our vigilant community, we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.

The YouTube Team

