Facebook head of counterterrorism: We need everyone's help

Monika Bickert might have one of the toughest jobs at Facebook right now.

As Facebook’s (FB) head of counterterrorism efforts and global product policy, she sets community standards and monitors content that its 1.86 billion users share and consume every day.

It’s only through public-private partnerships, particularly with students, that Facebook can really crack down on prejudice, online hate and extremism, Bickert told Yahoo Finance in an interview at South by Southwest in Austin, Texas.

Beyond managing her own team of counterterrorism researchers and analysts who review content 24 hours a day, 7 days a week, Bickert is relying heavily on community policing.

Through a joint venture with the US Department of Homeland Security and marketing firm EdVenture Partners, Facebook has doubled down on “Peer to Peer: Challenging Extremism,” its attempt to crowdsource and aggregate insights from students across the world.

Source: EdVenture Partners

Bickert says student-created and student-driven initiatives have been one of the most effective ways to combat hate speech and terrorist activity; collaborations like Peer to Peer help ensure such speech doesn't find a home on Facebook.

Take, for instance, the University of Rochester’s “It’s Time” campaign, which raises awareness about violent extremism and teaches students how to stay safe online through videos, info booths, lectures and art exhibits. Through the hashtag #ExOut, it encourages people to avoid social media containing extremist messaging.

“We’re taking these local voices at more than 200 universities around the world and asking students: ‘What are the issues in your community? Where is violent extremism coming from? And how can we best counter that?’” Bickert explained.

Universities participating in Peer to Peer

Source: EdVenture Partners

How do you scale these efforts?

But given Facebook’s incredible reach, the question remains whether grassroots movements and human-focused initiatives can stop extremist activity from finding a home on the platform.

Bickert believes local communities are, in fact, the best way to keep extremist activity away from Facebook.

“We want campaigns that are global. We want to reach a big audience, but we’ve seen — both from the research that we’ve done and also from talking to civil society groups around the world — that in order to really succeed with counter-speech, it has to be local,” she said.

University students, in particular, devise solutions that may seem specific to their communities but can actually be extrapolated beyond the college campus.

“Students are the most credible voices and they know the issues better than anybody. At the same time, the insights are shared among all the universities…that allow the program to reach tens of millions of people,” Bickert said.

A former federal prosecutor who previously served as Facebook’s lead security counsel, Bickert acknowledged that these cooperative efforts are vital because around-the-clock policing alone will never be enough to rid the platform of terrorist content.

“We want to make sure we’re removing any content from our site if it glorifies terrorism, if it promotes a terrorist group or if it’s from a terror group,” she said.

“But we know that even if Facebook could perfectly keep any terrorist activity from hitting our community and even if every other internet company or social media platform could do the same, we know that that’s still not ultimately the solution to countering violent extremism. Instead we need to make sure that we are allowing people to have a dialogue in a constructive way and help amplify voices from the community that are pushing back on these violent ideologies,” she added.

Of course, the proliferation of terrorist content is not unique to Facebook’s platform.

Twitter (TWTR) said on Tuesday that it has suspended 636,000 accounts over the past two years in an effort to tackle “violent extremism.” It suspended 376,890 of those accounts in the last six months of 2016 alone, according to its latest transparency report.

Even with humans and algorithms constantly mining user content, it’s become increasingly clear that social media giants rely heavily on their users to keep their platforms free of extremist speech.

Melody Hahm is a writer at Yahoo Finance, covering entrepreneurship, technology and real estate. Follow her on Twitter @melodyhahm.

