Twitter's spam bot issue isn't new, but it came to a head when it was revealed that thousands of Russian troll accounts used the platform to influence the 2016 presidential election. Now, the company has announced a set of changes and new developer…
Engadget RSS Feed
In mid-November, chipmaker Qualcomm announced it was rejecting a $130 billion takeover bid from Broadcom. At the time, Broadcom indicated it was not planning to simply walk away from the takeover attempt, leading many to expect some aggressive moves. We now see the fruits of that, as Broadcom is moving to appeal directly […]
Come comment on this article: Broadcom steps up pressure in fight to take over Qualcomm
In June, we announced four steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.
We shared our progress across these steps in August and wanted to update you again on where things are today.
Better detection and faster removal
We’ve always used a mix of human flagging and human review together with technology to address controversial content on YouTube. In June, we introduced machine learning to flag violent extremism content and escalate it for human review. We continue to get faster here.
Inevitably, both humans and machines make mistakes, and as we have increased the volume of videos for review by our teams, we have made some errors. We know we can get better and we are committed to making sure our teams are taking action on the right content. We are working on ways to educate those who share video meant to document or expose violence on how to add necessary context.
Outside experts are essential in advising us on our policies and flagging content for additional review; their input helps better train our systems. Our partner NGOs bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.
We have added 35 NGOs to our Trusted Flagger program, which is 70 percent of the way towards our goal. These new partner NGOs represent 20 different countries and include NGOs like the International Center for the Study of Radicalization at King’s College London and The Wahid Institute in Indonesia, which is dedicated to promoting religious freedom and tolerance.
We started applying tougher treatment to videos that aren’t illegal and don’t violate our Guidelines but contain controversial religious or supremacist content. These videos remain on YouTube, but behind a warning interstitial; they aren’t recommended or monetized, and they lack key features including comments, suggested videos, and likes. This is working as intended, helping us strike a balance between upholding free expression, by preserving a historical record of content in the public interest, and keeping these videos from being widely spread or recommended to others.
Amplify voices speaking out against hate and extremism
We continue to support programs that counter extremist messages. We are researching how to expand Jigsaw’s Redirect Method to new languages and search terms. We’re heavily investing in our YouTube Creators for Change program to support Creators who are using YouTube to tackle social issues and promote awareness, tolerance, and empathy. Every month these Creators release exciting and engaging new videos and campaigns to counter hate and social divisiveness.
In addition to this work supporting voices to counter hate and extremism, last month Google.org announced a $5 million innovation fund to counter hate and extremism. This funding will support technology-driven solutions, as well as grassroots efforts like community youth projects that help build communities and promote resistance to radicalization.
Terrorist and violent extremist material should not be spread online. We will continue to invest heavily in fighting the spread of this content, provide updates to governments, and collaborate with other companies through the Global Internet Forum to Counter Terrorism. There remains more to do, so we look forward to continuing to share our progress with you.
The YouTube Team
Now that Facebook has given Russia-linked ads to Congress, it's outlining what it'll do to prevent such a suspicious ad campaign from happening in the future. To begin with, it's promising to make ads more transparent — it's writing tools that will…
Did you pay for an expensive pay-per-view or streaming pass to watch the hyped-up boxing match between Floyd Mayweather and Conor McGregor, only to boil with rage as your access went down? You're far from alone. Numerous reports have revealed that…
A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.
We wanted to give you an update on these commitments:
Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress.
We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.
More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.
Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13- to 18-year-olds to help them find a positive sense of belonging online and learn how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.
And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.
Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community, we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.
The YouTube Team
Google is already flagging fake news, but it knows that isn't always enough. People need to recognize what fake news is, too. To that end, its YouTube wing just launched an Internet Citizens program that will teach UK teens to spot fake news throug…
Sci-fi taught us that ultraviolet light was a weapon to use against the scary undead stalking the night. Actual science has discovered that it may be an extremely effective treatment for cancer.
Stepping into the ring to fight over China and its prime smartphone market are the country’s own smartphone makers, as they begin to acquire patents by way of licensing deals, acquisitions, and a whole lot of money.
The post China smartphone makers snap up patents in fight for market dominance appeared first on Digital Trends.
YouTube has taken the next step in its battle against false copyright infringement alerts by creating a team dedicated to minimizing mistakes.
The move follows cries for help from content creators using the video-sharing website who have had their videos removed in the past due to baseless legal claims.
Addressing YouTube’s latest plan to tackle the issue, Spencer from YouTube’s Policy Team has released a statement on the website’s help forum.
According to the website spokesperson, YouTube has been monitoring false video removals very closely and is striving to do even better going forward.
“The good news is that the feedback you’ve raised in comments and videos on YouTube and beyond is having an impact. It’s caused us to look closely at our policies and helped us identify areas where we can get better.
“It’s led us to create a team dedicated to minimizing mistakes and improving the quality of our actions.”
The latest post also reveals that YouTube will soon roll out a number of new initiatives to help content creators, strengthening communication between YouTubers and the website’s support team.
Meanwhile, users are being told that YouTube will be increasing transparency into the status of monetization claims, with the website’s makers hoping to get things rolling as quickly as possible.
It’s clear that the YouTube community is growing tired of false copyright claims, so YouTube’s Policy Team will be hoping its latest plan of action turns out to be a brilliant one.
Source: Google Product Forums
Come comment on this article: YouTube forms team to fight false copyright claims
The Washington, D.C., neighborhood of Georgetown has had a shoplifting problem lately, so local citizens turned to group-messaging app GroupMe for help. What they got instead was widespread racial profiling.
The post ‘Operation GroupMe’ was meant to fight shoplifting, enables racial profiling instead appeared first on Digital Trends.