
YouTube lays out plan to fight extremist content on its platform

CIOL Writers

In recent months, social media platforms have come under growing pressure to do more against content that promotes extremist and terrorist propaganda. Now, YouTube has outlined four steps it is taking to curb extremist activity on its platform.


"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," Kent Walker, Google's general counsel, said in an op-ed column in the UK-based Financial Times that was later posted on the Google blog.

The first of the four steps is to expand the use of automated systems to better identify terror-related videos. The company is also expanding its pool of Trusted Flagger users, a group of experts with special privileges to review flagged content that violates the site's community guidelines. The expanded effort will allow the company to draw on specialist groups to target specific types of videos, such as those involving self-harm and terrorism.

The third step is to take a harder line on videos that contain inflammatory religious or supremacist content but do not violate community guidelines: such videos will be placed behind a warning.


Finally, the company will step up counter-radicalization efforts by building on its Creators for Change program, which will redirect users targeted by extremist groups such as ISIS toward counter-extremist content. "This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining," Walker wrote.

Walker added that YouTube is working alongside companies such as Facebook and Twitter to develop tools and techniques to support broader anti-terror efforts online.
