We are committed to tackling the challenge of quickly removing content that violates our Community Guidelines and reporting on our progress. That’s why in April we launched a quarterly YouTube Community Guidelines Enforcement Report. As part of this ongoing commitment to transparency, today we’re expanding the report to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.
We previously shared how technology is helping our human review teams remove content at a speed and volume that could not be achieved with people alone. Finding all violative content on YouTube is an immense challenge, but we see this as one of our core responsibilities and are focused on continuously working to remove this content before it is widely viewed.
When we detect a video that violates our Guidelines, we remove the video and apply a strike to the channel. We terminate entire channels if they are dedicated to posting content prohibited by our Community Guidelines or contain a single egregious violation, like child sexual exploitation. The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos that we removed in September 2018 were removed for violating our policies on spam or adult content.
Looking specifically at the most egregious, but low-volume areas, like violent extremism and child safety, our significant investment in fighting this type of content is having an impact: Well over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views.
Each quarter we may see these numbers fluctuate, especially when our teams tighten our policies or enforcement on a certain category to remove more content. For example, over the last year we’ve strengthened our child safety enforcement, regularly consulting with experts to make sure our policies capture a broad range of content that may be harmful to children, including things like minors fighting or engaging in potentially dangerous dares. Accordingly, we saw that 10.2% of video removals were for child safety, while Child Sexual Abuse Material (CSAM) represents a fraction of a percent of the content we remove.
As with videos, we use a combination of smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.
We’ve also built tools that allow creators to moderate comments on their videos. For example, creators can choose to hold all comments for review, or to automatically hold comments that have links or may contain offensive content. Over one million creators now use these tools to moderate their channel’s comments.1
We’ve also been increasing our enforcement against violative comments.
We are committed to making sure that YouTube remains a vibrant community, where creativity flourishes, independent creators make their living, and people connect worldwide over shared passions and interests. That means we will be unwavering in our fight against bad actors on our platform and our efforts to remove egregious content before it is viewed. We know there is more work to do and we are continuing to invest in people and technology to remove violative content quickly. We look forward to providing you with more updates.
In December we shared how we’re expanding our work to remove content that violates our policies. Today, we’re providing an update and giving you additional insight into our work, including the release of the first YouTube Community Guidelines Enforcement Report.
Providing More Information
We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add more data, including data on comments, speed of removal, and policy removal reasons.
We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.
Machines Helping to Address Violative Content
Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).
Highlights from the report — reflecting data from October–December 2017 — show:
For example, at the beginning of 2017, 8% of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.3 We introduced machine learning flagging in June 2017, and now more than half of the videos we remove for violent extremism have fewer than 10 views.
The Value of People + Machines
Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies. You can learn more about our flagging and human review process in this video:
Last year we committed to bringing the total number of people working to address violative content across Google to 10,000 by the end of 2018. At YouTube, we’ve filled the majority of the additional roles needed to meet our contribution to that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.
We continue to invest in the network of over 150 academics, government partners, and NGOs who bring valuable expertise to our enforcement systems, like the International Center for the Study of Radicalization at King’s College London, Anti-Defamation League, and Family Online Safety Institute. This includes adding more child safety focused partners from around the globe, like Childline South Africa, ECPAT Indonesia, and South Korea’s Parents’ Union on Net.
We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content and we look forward to providing you with more information on how those systems are performing and improving over time.
— The YouTube Team
1 This number does not include videos that were removed when an entire channel was removed. Most channel-level removals are due to spam violations and we believe that the percentage of violative content for spam is even higher.
2 Not only do these 8 million videos represent a fraction of a percent of YouTube’s overall views, but that fraction of a percent has been steadily decreasing over the last five quarters.
3 This excludes videos that were automatically matched as known violent extremist content at the point of upload, which would all have zero views.
YouTube is where you come to watch your favorite creators — whether that means jamming with Alex Aiono, gaming with Strawburry17, or hanging out with Logan Paul. That’s why we’re working on a redesign of the desktop experience that highlights your favorite videos and creators while making YouTube easier and more fun to use.
Starting today, we’re opening up a preview of the new design to a small group of people from all around the world so we can get feedback. While we hope you’ll love what we’ve been working on, we’re also really excited to involve the YouTube community so we can make the site even better before sharing it more broadly.
We’re applying Material Design to YouTube to deliver a beautiful, delightful, and intuitive user experience. The key principles of this new design are:
The site design is built on a new, faster framework named Polymer, which enables quicker feature development from here on out. And today, we are introducing one of the first new features developed on Polymer: Dark Theme. Developed to cut down on glare and let you take in the true colors of the videos you watch, Dark Theme turns your background dark throughout your entire YouTube experience. This is only the beginning — you can look forward to more powerful new features coming soon!
If you want to try out YouTube’s latest look, you can opt-in to preview the new design at youtube.com/new. You can return to the current design by selecting “Restore classic YouTube” from the Account Menu. And don’t forget to send us feedback from the Account Menu.
We’re still working on the new site, so we hope you’ll try it out now and let us know what you think!
Brian Marquardt, Product Manager, recently watched “Pen-Pineapple-Apple-Pen/PIKO-TARO.”
At YouTube, we believe in giving everyone a voice. So this U.S. elections season, we’re committed to making sure that people, especially young people, use their voice by voting.
With November just around the corner, election-related content is exploding. Over 200,000 election videos have been uploaded to YouTube every day since the July Conventions and you’ve watched more than 110 million hours of candidate and issues-related content on YouTube.
But while people are clearly engaged with the election online, we want to make sure they get involved “in real life,” too. Today, we’re announcing YouTube’s get out the vote campaign, #voteIRL, where together with the YouTube creator community, we’re helping get young people to the polls. Check out our new #voteIRL anthem video featuring some of YouTube’s top talent, including Bethany Mota, Hannah Hart, Kingsley, Hank Green and more.
Did you know it only takes 1:34 to register to vote?1 With voter registration deadlines looming in October, it’s fast and easy to register to vote using registration tools built by Google. Starting today, look out for familiar faces making 1:34 videos where they do anything from hosting their radio show (hey Ryan Seacrest) to doing their eyebrows, while encouraging their fans to go register. We teamed up with AwesomenessTV, Fullscreen, Machinima, and Maker Studios, so watch for more 1:34 videos every day until National Voter Registration Day.
You can also find voter registration tools directly on YouTube. Watch for registration reminders on the homepage, watch page, and search results page on September 27, National Voter Registration Day.
Voting requires getting educated with the latest and greatest from the candidates. That’s why we’re excited to announce that we’re live streaming the presidential debates from more news organizations than ever before, including NBC News, PBS, Fox News, The Washington Post, Univision, and Telemundo. You can also follow your favorite YouTube creators, including The Young Turks and Complex News, who will be on the ground reporting from the debates using YouTube Live directly from their phones.
Stay tuned to youtube.com/youtube and our social media channels as we release new videos, report from the presidential debates, and bring you closer to the election (and the polls) this November. And make sure you’re registered to vote!
Claire Stapleton, YouTube Elections team, recently watched “Maymo the Dog Runs for President: Maymo 2016.”
1 We got a group together at YouTube to register in every state, and the average time was just 1:34.