Posts Tagged: families

An update on our efforts to protect minors and families

Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families. Over the years, we’ve heavily invested in a number of technologies and efforts to protect young people on our platform, such as our CSAI Match technology. And in 2015, because YouTube has never been for kids under 13, we created YouTube Kids as a way for kids to safely explore their interests and for parents to have more control. Accounts belonging to people under 13 are terminated when discovered. In fact, we terminate thousands of accounts per week as part of this process.

We also enforce a strong set of policies to protect minors on our platform, including those that prohibit exploiting minors, encouraging dangerous or inappropriate behaviors, and aggregating videos of minors in potentially exploitative ways. In the first quarter of 2019 alone, we removed more than 800,000 videos for violations of our child safety policies, the majority of these before they had ten views.

The vast majority of videos featuring minors on YouTube, including those referenced in recent news reports, do not violate our policies and are innocently posted, such as a family creator providing educational tips or a parent sharing a proud moment. But when it comes to kids, we take an extra cautious approach to enforcement, and we’re always making improvements to our protections. Here are a few updates we’ve made over the past several months:

  • Restricting live features: We updated enforcement of our live streaming policy to specifically disallow younger minors from live streaming unless they are clearly accompanied by an adult. Channels not in compliance with this policy may lose their ability to live stream. We also launched new classifiers (machine learning tools that help us identify specific types of content) on our live products to find and remove more of this content.
  • Disabling comments on videos featuring minors: We disabled comments on tens of millions of videos featuring minors across the platform, to limit the risk of exploitation. Additionally, we implemented a classifier that helped us remove 2x the number of violative comments. We recognize that comments are a core part of the YouTube experience and creators have told us they feel we removed a valuable way for them to connect with and grow audiences. But we strongly believe this is an important step to keeping young people safe on YouTube.
  • Reducing recommendations: We expanded our efforts from earlier this year around limiting recommendations of borderline content to include videos featuring minors in risky situations. While this content does not itself violate our policies, we recognize that the minors in it could be at risk of online or offline exploitation. We’ve already applied these changes to tens of millions of videos across YouTube.

Over the last 2+ years, we’ve been making regular improvements to the machine learning classifier that helps us protect minors and families. We rolled out our most recent improvement earlier this month. With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections, including those described above, across even more videos.
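
As a purely illustrative sketch (not YouTube’s actual system; the stand-in model, feature names, thresholds, and protection labels below are all hypothetical), the general shape of such a pipeline is a classifier score that gates which protections are applied automatically and which videos are escalated for human review:

    # Hypothetical sketch of a classifier-gated protection pipeline.
    # Nothing here reflects YouTube's real models, features, or thresholds.
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        video_id: str
        features: dict                                  # illustrative metadata/content signals
        protections: set = field(default_factory=set)

    def risk_score(video: Video) -> float:
        """Stand-in for a trained classifier's prediction: a score in [0, 1]."""
        return min(1.0, 0.2 * video.features.get("minor_risk_signals", 0))

    review_queue: list[str] = []

    def apply_protections(video: Video, auto_threshold: float = 0.8,
                          review_threshold: float = 0.4) -> None:
        score = risk_score(video)
        if score >= auto_threshold:
            # Apply protections like those described above automatically.
            video.protections.update({"comments_disabled", "recommendations_limited"})
        elif score >= review_threshold:
            # Borderline videos are escalated to human review instead of auto-actioned.
            review_queue.append(video.video_id)

    # Example: a video with several risk signals gets protections applied automatically.
    video = Video("example_id", {"minor_risk_signals": 5})
    apply_protections(video)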

To stay informed of the latest research and advances in child safety, we work with civil society and law enforcement. In the last two years, we’ve shared tens of thousands of reports with NCMEC, leading to numerous law enforcement investigations.1 Additionally, we share our technologies and expertise with the industry, and consult with outside experts to complement our team of in-house experts.

YouTube is a company made up of parents and families, and we’ll always do everything we can to prevent any use of our platform that attempts to exploit or endanger minors. Kids and families deserve the best protection we have to offer, and we’re committed to investing in the teams and technology to make sure they get it.

The YouTube Team


1 Updated stats on June 3


YouTube Blog

Samsung might eventually combine the Galaxy S and Galaxy Note families

A new rumor suggests Samsung might eventually merge its Galaxy S and Galaxy Note lines, creating a single line of flagship phones instead of two alternating lineups every year. With the Galaxy Note 8 sticking pretty close to the Galaxy S8+, it was a tough sell for anyone looking to upgrade, considering the […]

TalkAndroid

California Rep. requests 23andMe to help reunite children with families

California Representative Jackie Speier reportedly asked DNA-testing company 23andMe to help reunite children separated from their parents at the US-Mexico border due to Trump's 'zero tolerance' immigration policies. She told Buzzfeed that she was co…
Engadget RSS Feed

5 ways we’re toughening our approach to protect families on YouTube and YouTube Kids

In recent months, we’ve noticed a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not. While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them from YouTube. Here’s what we’re doing:

  1. Tougher application of our Community Guidelines and faster enforcement through technology: We have always had strict policies against child endangerment, and we partner closely with regional authorities and experts to help us enforce these policies and report to law enforcement through NCMEC. In the last couple of weeks, we expanded our enforcement guidelines around removing content featuring minors that may be endangering a child, even if that was not the uploader’s intent. In the last week, we terminated over 50 channels and removed thousands of videos under these guidelines, and we will continue to work quickly to remove more every day. We also implemented policies to age-restrict (available only to people who are over 18 and logged in) content with family entertainment characters but containing mature themes or adult humor. To help surface potentially violative content, we are applying machine learning technology and automated tools to quickly find it and escalate it for human review.
  2. Removing ads from inappropriate videos targeting families: Back in June, we posted an update to our advertiser-friendly guidelines making it clear that we will remove ads from any content depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behavior, even if done for comedic or satirical purposes. Since June, we’ve removed ads from 3M videos under this policy and we’ve further strengthened the application of that policy to remove ads from another 500K violative videos.
  3. Blocking inappropriate comments on videos featuring minors: We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors. Comments of this nature are abhorrent and we work with NCMEC to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.
  4. Providing guidance for creators who make family-friendly content: We’ve created a platform for people to view family-friendly content — YouTube Kids. We want to help creators produce quality content for the YouTube Kids app, so in the coming weeks we will release a comprehensive guide on how creators can make enriching family content for the app.
  5. Engaging and learning from experts: While there is some content that clearly doesn’t belong on YouTube, other content is more nuanced and harder to make a clear decision on. For example, many cartoons in mainstream entertainment today are targeted at adults and feature characters doing things we wouldn’t necessarily want children to see. Those may be OK for YouTube.com, or acceptable when we require the viewer to be over 18, but not for someone younger. Similarly, a video of an adult dressed as a popular family character could be questionable content for some audiences, but could also be clearly intended for adults, such as footage recorded at a comic book convention. To help us better understand how to treat this content, we will be growing the number of experts we work with, and doubling the number of Trusted Flaggers we partner with in this area.

Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge. We’re wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I’m determined that we do.

Johanna Wright, Vice President of Product Management at YouTube


YouTube Blog