Posts Tagged: we’re

What we’re watching: ‘Lucifer,’ ‘Wu-Tang Clan: Of Mics and Men’

This week Lucifer returns from the dead for a fourth season on Netflix, while the streaming service also offers up its first season of The Society, with a description that includes the text "a modern take on Lord of the Flies." It also has Wine Count…
Engadget RSS Feed

What we’re buying: A terrible replacement baby monitor

This week's IRL tale has nothing to do with New Year's resolutions. Thankfully. Instead, Senior Editor Dan Cooper tries to replace his decent (but broken) baby monitor, and finds that cheaper models no longer cut it.
Engadget RSS Feed

Faster removals and tackling comments — an update on what we’re doing to enforce YouTube’s Community Guidelines

We’ve always used a mix of human reviewers and technology to address violative content on our platform, and in 2017 we started applying more advanced machine learning technology to flag content for review by our teams. This combination of smart detection technology and highly-trained human reviewers has enabled us to consistently enforce our policies with increasing speed.

We are committed to tackling the challenge of quickly removing content that violates our Community Guidelines and reporting on our progress. That’s why in April we launched a quarterly YouTube Community Guidelines Enforcement Report. As part of this ongoing commitment to transparency, today we’re expanding the report to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.

Focus on removing violative content before it is viewed

We previously shared how technology is helping our human review teams remove content with speed and volume that could not be achieved with people alone. Finding all violative content on YouTube is an immense challenge, but we see this as one of our core responsibilities and are focused on continuously working towards removing this content before it is widely viewed.

  • From July to September 2018, we removed 7.8 million videos.
  • 81% of these videos were first detected by machines.
  • Of those detected by machines, 74.5% had never received a single view.

When we detect a video that violates our Guidelines, we remove the video and apply a strike to the channel. We terminate entire channels if they are dedicated to posting content prohibited by our Community Guidelines or contain a single egregious violation, like child sexual exploitation. The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos that we removed in September 2018 were removed for violating our policies on spam or adult content.
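To make the enforcement flow above concrete, here is a minimal, purely illustrative sketch of a strike-and-termination rule. The category labels, field names, and the three-strike threshold are assumptions made for this example only; they do not describe YouTube's actual systems.

    from dataclasses import dataclass

    # Illustrative sketch only: the category set, threshold, and channel fields
    # are assumptions, not YouTube's real policy engine.
    EGREGIOUS_CATEGORIES = {"child_sexual_exploitation"}
    STRIKE_THRESHOLD = 3  # assumed for this sketch; not stated in the post above

    @dataclass
    class Channel:
        strikes: int = 0
        dedicated_to_violations: bool = False  # channel exists mainly to post prohibited content

    def enforce(channel: Channel, violation_category: str) -> str:
        """Decide the action for a video found to violate the guidelines."""
        # A single egregious violation, or a channel dedicated to prohibited
        # content, leads to channel termination.
        if violation_category in EGREGIOUS_CATEGORIES or channel.dedicated_to_violations:
            return "terminate_channel"
        # Otherwise, remove the video and apply a strike to the channel.
        channel.strikes += 1
        if channel.strikes >= STRIKE_THRESHOLD:
            return "terminate_channel"
        return "remove_video_and_apply_strike"

    # Example: a spam upload on a channel with no prior strikes
    action = enforce(Channel(), "spam")  # -> "remove_video_and_apply_strike"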

Looking specifically at the most egregious, but low-volume areas, like violent extremism and child safety, our significant investment in fighting this type of content is having an impact: Well over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views.

Each quarter we may see these numbers fluctuate, especially when our teams tighten our policies or enforcement on a certain category to remove more content. For example, over the last year we’ve strengthened our child safety enforcement, regularly consulting with experts to make sure our policies capture a broad range of content that may be harmful to children, including things like minors fighting or engaging in potentially dangerous dares. Accordingly, we saw that 10.2% of video removals were for child safety, while Child Sexual Abuse Material (CSAM) represents a fraction of a percent of the content we remove.

Making comments safer

As with videos, we use a combination of smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.

We’ve also built tools that allow creators to moderate comments on their videos. For example, creators can choose to hold all comments for review, or to automatically hold comments that have links or may contain offensive content. Over one million creators now use these tools to moderate their channel’s comments.1

We’ve also been increasing our enforcement against violative comments:

  • From July to September of 2018, our teams removed over 224 million comments for violating our Community Guidelines.
  • The majority of removals were for spam, and the total number of removals represents a fraction of the billions of comments posted on YouTube each quarter.
  • As we have removed more comments, we’ve seen our comment ecosystem actually grow, not shrink. Daily users are 11% more likely to be commenters than they were last year.

We are committed to making sure that YouTube remains a vibrant community, where creativity flourishes, independent creators make their living, and people connect worldwide over shared passions and interests. That means we will be unwavering in our fight against bad actors on our platform and our efforts to remove egregious content before it is viewed. We know there is more work to do and we are continuing to invest in people and technology to remove violative content quickly. We look forward to providing you with more updates.

YouTube Team


1 Comment removals made by creators on their own channels are not included in our reporting, as they are based on opt-in creator tools rather than a review by our teams to determine a Community Guidelines violation.


YouTube Blog

Want to join the Talk Android team? We’re hiring!

Have an interest in Android and Google news? Want to use that interest and write for a team of fellow smartphone enthusiasts? Talk Android might just be the destination for you! We're looking for part-time writers to join our site and cover the daily news cycle, review gadgets, and offer your opinion on one of the […]

TalkAndroid

WSJ: Facebook believes spammers were behind its massive data breach

More than two weeks after Facebook revealed a massive data breach, we still don't know who was using the flaw in its site to access information on tens of millions of users. Now the Wall Street Journal reports, based on anonymous sources, that the co…
Engadget RSS Feed

Here’s how to see if you were affected by Facebook’s breach

Today, Facebook provided additional information on the data breach it disclosed last month. Whereas it initially said up to 50 million users might have been affected, it now reports that 30 million were impacted by the breach. By exploiting a system…
Engadget RSS Feed

[TA Deals] We’re giving away a SNES Classic Edition through Talk Android Deals!

Did you want a SNES Classic but couldn’t find any in stock over the holidays? Don’t worry, we’ve got your back. We’re giving away one of Nintendo’s retro consoles through Talk Android Deals, and it’s incredibly simple to enter the contest. This console includes a ton of classic games, including hits like The Legend of […]

TalkAndroid

[Giveaway] We’re giving away Evo Max Tech21 cases for the Galaxy S9 and Galaxy S9+!

Need a case for your Galaxy S9? We’ve teamed up with Tech21 to help get you covered, literally. We’re giving away 10 black Evo Max cases from Tech21, and you’ll be able to pick a case for either the Galaxy S9 or Galaxy S9+, whichever size you have. All you have to do is […]

TalkAndroid

More information, faster removals, more people – an update on what we’re doing to enforce YouTube’s Community Guidelines

In December we shared how we’re expanding our work to remove content that violates our policies. Today, we’re providing an update and giving you additional insight into our work, including the release of the first YouTube Community Guidelines Enforcement Report.

Providing More Information
We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.

We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.

Machines Helping to Address Violative Content
Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).

Highlights from the report — reflecting data from October – December 2017 — show:

  • We removed over 8 million videos from YouTube during these months.1 The majority of these 8 million videos were spam or people attempting to upload adult content, and they represent a fraction of a percent of YouTube’s total views during this time period.2
  • 6.7 million of these videos were first flagged for review by machines rather than humans.
  • Of those 6.7 million videos, 76 percent were removed before they received a single view.

For example, at the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.3 We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views.

The Value of People + Machines
Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies. You can learn more about our flagging and human review process in this video:


Last year we committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018. At YouTube, we’ve staffed the majority of additional roles needed to reach our contribution to meeting that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.

We continue to invest in the network of over 150 academics, government partners, and NGOs who bring valuable expertise to our enforcement systems, like the International Center for the Study of Radicalization at King’s College London, the Anti-Defamation League, and the Family Online Safety Institute. This includes adding more child safety focused partners from around the globe, like Childline South Africa, ECPAT Indonesia, and South Korea’s Parents’ Union on Net.

We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content and we look forward to providing you with more information on how those systems are performing and improving over time.

— The YouTube Team

1 This number does not include videos that were removed when an entire channel was removed. Most channel-level removals are due to spam violations and we believe that the percentage of violative content for spam is even higher.
2 Not only do these 8 million videos represent a fraction of a percent of YouTube’s overall views, but that fraction of a percent has been steadily decreasing over the last five quarters.
3 This excludes videos that were automatically matched as known violent extremist content at point of upload – which would all have zero views.


YouTube Blog

What we’re buying: Dyson’s Supersonic hair dryer

This month, Associate Editor Swapna Krishna is singing the praises of Dyson's advanced but pricey hair dryer. Compared with her old model, it's like night and day.
Engadget RSS Feed

5 ways we’re toughening our approach to protect families on YouTube and YouTube Kids

In recent months, we’ve noticed a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not. While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them from YouTube. Here’s what we’re doing:

  1. Tougher application of our Community Guidelines and faster enforcement through technology: We have always had strict policies against child endangerment, and we partner closely with regional authorities and experts to help us enforce these policies and report to law enforcement through NCMEC. In the last couple of weeks, we expanded our enforcement guidelines around removing content featuring minors that may be endangering a child, even if that was not the uploader’s intent. In the last week, we terminated over 50 channels and removed thousands of videos under these guidelines, and we will continue to work quickly to remove more every day. We also implemented policies to age-restrict (only available to people over 18 and logged in) content with family entertainment characters but containing mature themes or adult humor. To help surface potentially violative content, we are applying machine learning technology and automated tools to quickly find such content and escalate it for human review.
  2. Removing ads from inappropriate videos targeting families: Back in June, we posted an update to our advertiser-friendly guidelines making it clear that we will remove ads from any content depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behavior, even if done for comedic or satirical purposes. Since June, we’ve removed ads from 3M videos under this policy and we’ve further strengthened the application of that policy to remove ads from another 500K violative videos.
  3. Blocking inappropriate comments on videos featuring minors: We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors. Comments of this nature are abhorrent and we work with NCMEC to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.
  4. Providing guidance for creators who make family-friendly content: We’ve created a platform for people to view family-friendly content — YouTube Kids. We want to help creators produce quality content for the YouTube Kids app, so in the coming weeks we will release a comprehensive guide on how creators can make enriching family content for the app.
  5. Engaging and learning from experts: While there is some content that clearly doesn’t belong on YouTube, there is other content that is more nuanced or challenging to make a clear decision on. For example, today there are many cartoons in mainstream entertainment that are targeted towards adults and feature characters doing things we wouldn’t necessarily want children to see. Those may be OK for YouTube.com, or if we require the viewer to be over 18, but not for someone younger. Similarly, an adult dressed as a popular family character could be questionable content for some audiences, but could also be content meant for adults, recorded at a comic book convention. To help us better understand how to treat this content, we will be growing the number of experts we work with, and doubling the number of Trusted Flaggers we partner with in this area.

Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge. We’re wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I’m determined that we do.

Johanna Wright, Vice President of Product Management at YouTube


YouTube Blog

What we’re buying: Lightroom on a new iPhone, Google’s Pixel 2 cases

This month, we're making the most of our devices, whether that's by testing mobile photo-editing apps, trying out an iPad keyboard that matches its surroundings, or simply laying down a little too much cash for a pretty-looking Pixel 2 phone cas…
Engadget RSS Feed

What we’re buying: Instant Pot Ultra, Huel and Sleep Cycle

This month is a (mostly) foodie edition of IRL. Nicole Lee sings the praises of the Instant Pot, while Daniel Cooper doesn't last long on the meal-replacement system, Huel. Tim Seppala, however, is just trying to get a good night's rest.
Engadget RSS Feed

Google’s Pixel 2 event is tomorrow and here’s what we’re expecting

Google seems to like being last when it comes to major product announcements, but maybe saving the best for last is the way to go. Starting tomorrow at 12 PM Eastern Time, its Pixel 2 event will be action-packed. Along with the highly anticipated new Pixel phones, Google will announce a host of new products, […]

TalkAndroid

We’re live from SXSW 2017!

The past few weeks have been intense for the tech world, what with MWC and GDC both taking place. Now it's time for SXSW 2017. We're on the ground in Austin, Texas, to check out what the festival has to offer with its interactive, m…
Engadget RSS Feed

We’re liveblogging Apple’s ‘Hello Again’ MacBook launch!

Hello again, indeed! If it feels like we were just doing this, it's because… we were. Apple held an event last month to unveil the iPhone 7 and Apple Watch Series 2. There was much fanfare and we had quite a bit to say about it all. Now, just a few…
Engadget RSS Feed

These were our favorite games, hardware and toys from E3 2016

Another year, another massive, exciting E3 showcase. The biggest names in the video game industry brought out their newest games and hardware, including two console announcements (and controllers) from Xbox and a ton of fresh games from PlayStation w…
Engadget RSS Feed

YouTube’s first live 360-degree videos were little more than tech demos

Last week, YouTube started supporting live 360-degree video streams in a bid for more-immersive video content. Though users have been able to upload and watch 360-degree video for over a year, it's only now that Google is introducing the option to be…
Engadget RSS Feed

The best tech toys for kids will make you wish you were 10 again

Are you hunting for the perfect tech toy or gadget gift for your child? It can be tricky to find great tech for kids. There’s a lot to choose from, but what will go the distance? And what will end up at the bottom of a toy box?

Cool Tech » Digital Trends

After Math: That’s it, we’re calling security

It's been a heck of a week. With the world still reeling from the Paris attacks, more people than ever are concerned with their personal security. That's why we're featuring five of this week's best posts about stuff that keeps us safe — and one a…
Engadget RSS Feed