Posts Tagged: guidelines

[Sponsored] BLUETTI Provides Hurricane Preparedness Guidelines Amid Recent Florida Storm

On August 30th, Hurricane Idalia struck Florida with destructive force, causing significant damage: the storm split trees, tore roofs off hotels, and left cars submerged. Proactive preparation is paramount to mitigate potential damage and ensure safety during such natural disasters. BLUETTI, a pioneering leader in innovative energy […]


TalkAndroid

Amazon workers sue over alleged failure to follow COVID-19 guidelines

Amazon is facing more scrutiny over its handling of COVID-19 at its warehouses. Workers at the internet retailer’s Staten Island warehouse have filed a lawsuit (via CNBC) accusing the company of failing to follow CDC and New York state public health…
Engadget RSS Feed

Faster removals and tackling comments — an update on what we’re doing to enforce YouTube’s Community Guidelines

We’ve always used a mix of human reviewers and technology to address violative content on our platform, and in 2017 we started applying more advanced machine learning technology to flag content for review by our teams. This combination of smart detection technology and highly-trained human reviewers has enabled us to consistently enforce our policies with increasing speed.

We are committed to tackling the challenge of quickly removing content that violates our Community Guidelines and reporting on our progress. That’s why in April we launched a quarterly YouTube Community Guidelines Enforcement Report. As part of this ongoing commitment to transparency, today we’re expanding the report to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.

Focus on removing violative content before it is viewed

We previously shared how technology is helping our human review teams remove content at a speed and volume that could not be achieved with people alone. Finding all violative content on YouTube is an immense challenge, but we see this as one of our core responsibilities and are focused on continuously working towards removing this content before it is widely viewed.

  • From July to September 2018, we removed 7.8 million videos
  • And 81% of these videos were first detected by machines
  • Of those detected by machines, 74.5% had never received a single view

When we detect a video that violates our Guidelines, we remove the video and apply a strike to the channel. We terminate entire channels if they are dedicated to posting content prohibited by our Community Guidelines or contain a single egregious violation, like child sexual exploitation. The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos that we removed in September 2018 were removed for violating our policies on spam or adult content.

Looking specifically at the most egregious but low-volume areas, like violent extremism and child safety, our significant investment in fighting this type of content is having an impact: well over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views.

Each quarter we may see these numbers fluctuate, especially when our teams tighten our policies or enforcement on a certain category to remove more content. For example, over the last year we’ve strengthened our child safety enforcement, regularly consulting with experts to make sure our policies capture a broad range of content that may be harmful to children, including things like minors fighting or engaging in potentially dangerous dares. Accordingly, we saw that 10.2% of video removals were for child safety, while Child Sexual Abuse Material (CSAM) represents a fraction of a percent of the content we remove.

Making comments safer

As with videos, we use a combination of smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.

We’ve also built tools that allow creators to moderate comments on their videos. For example, creators can choose to hold all comments for review, or to automatically hold comments that have links or may contain offensive content. Over one million creators now use these tools to moderate their channel’s comments.1

We’ve also been increasing our enforcement against violative comments:

  • From July to September of 2018, our teams removed over 224 million comments for violating our Community Guidelines.
  • The majority of removals were for spam, and the total number of removals represents a fraction of the billions of comments posted on YouTube each quarter.
  • As we have removed more comments, we’ve seen our comment ecosystem actually grow, not shrink. Daily users are 11% more likely to be commenters than they were last year.

We are committed to making sure that YouTube remains a vibrant community, where creativity flourishes, independent creators make their living, and people connect worldwide over shared passions and interests. That means we will be unwavering in our fight against bad actors on our platform and our efforts to remove egregious content before it is viewed. We know there is more work to do and we are continuing to invest in people and technology to remove violative content quickly. We look forward to providing you with more updates.

YouTube Team


1 Creator comment removals on their own channels are not included in our reporting because they are based on opt-in creator tools rather than a review by our teams to determine a Community Guidelines violation.


YouTube Blog

More information, faster removals, more people – an update on what we’re doing to enforce YouTube’s Community Guidelines

In December we shared how we’re expanding our work to remove content that violates our policies. Today, we’re providing an update and giving you additional insight into our work, including the release of the first YouTube Community Guidelines Enforcement Report.

Providing More Information
We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.

We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.

Machines Helping to Address Violative Content
Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).

Highlights from the report — reflecting data from October – December 2017 — show:

  • We removed over 8 million videos from YouTube during these months.1 The majority of these 8 million videos were spam or attempts to upload adult content – and they represent a fraction of a percent of YouTube’s total views during this time period.2
  • 6.7 million were first flagged for review by machines rather than humans
  • Of those 6.7 million videos, 76 percent were removed before they received a single view.

For example, at the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.3 We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views.

The Value of People + Machines
Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies. You can learn more about our flagging and human review process in the video below.

[Embedded video: YouTube’s flagging and human review process]
Last year we committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018. At YouTube, we’ve staffed the majority of the additional roles needed to meet our contribution to that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.

We continue to invest in the network of over 150 academics, government partners, and NGOs who bring valuable expertise to our enforcement systems, like the International Center for the Study of Radicalization at King’s College London, Anti-Defamation League, and Family Online Safety Institute. This includes adding more child safety focused partners from around the globe, like Childline South Africa, ECPAT Indonesia, and South Korea’s Parents’ Union on Net.

We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content and we look forward to providing you with more information on how those systems are performing and improving over time.

— The YouTube Team

1 This number does not include videos that were removed when an entire channel was removed. Most channel-level removals are due to spam violations and we believe that the percentage of violative content for spam is even higher.
2 Not only do these 8 million videos represent a fraction of a percent of YouTube’s overall views, but that fraction of a percent has been steadily decreasing over the last five quarters.
3 This excludes videos that were automatically matched as known violent extremist content at point of upload – which would all have zero views.


YouTube Blog

A nested tab design is hitting the Google Play Store, violating Google’s own design guidelines

A new design element spotted several weeks ago in the Google Play Store violates Google’s own Material Design guidelines: a nested navigation bar now appears under a main tab of categories. As an example, the article’s screenshot shows Pop, Alternative, Rock, etc. under the main categories, Genre, Artist, Album, […]

TalkAndroid