KENT: How are the "bad actors" now in 2018 different than they were in 2015, '16, and '17?
CHAKRABARTI: We're looking all around the world to make sure that this same kind of playbook that bad actors used in 2016 is not used again. We are staying ever vigilant, looking everywhere for this kind of activity as a means of stopping it. And I think a lot of the work that we're doing, for example, around political ad transparency, helps prevent this kind of behavior. With political ad transparency, we're making it so that anybody who sees a political ad on Facebook can see who is behind that ad and who paid for that ad. I think that helps create a much more trustworthy environment for political discourse on our platforms. …It is an arms race. And we're always working to try to stay one step ahead. As an example, the bad actors out there have gotten more sophisticated. They're better at hiding their location... So we also have gotten better at finding when people are trying to hide their location.
KENT: President Donald Trump has come out criticizing big technology companies, Facebook included. He said, "Facebook, they're really treading on very, very troubled territory. And they have to be careful." The president went on to say, "It's not fair to large portions of the population." How do you interpret a statement like that from the president of the United States criticizing Facebook?
CHAKRABARTI: The way that I think about it is I go back to what is the mission of Facebook? You know, we really do want to give everybody a voice and bring the world closer together. And so what that means to us is that we want to make sure our platform is a great place for people to express themselves across the entire political spectrum. And we work to make sure that our platforms can support that, because if they don't, then we're not actually going to be able to realize our mission of bringing the world closer together. We want all voices to be represented.
KENT: Is Facebook discriminating against conservatives? There's a lot of concern on that front right now.
CHAKRABARTI: I think we just try to make sure that the platform is a place that's agnostic of people's political views. And really there is a thriving conversation taking place on Facebook. Any part of the political spectrum that you look at, there are thriving conversations taking place on Facebook.
KENT: How did you detect that activity coming out of Russia and Iran? And how quickly were you able to bring it down?
CHAKRABARTI: I think one important thing to understand is that we're not doing this alone. We're not doing this by ourselves. We're just one small part of a much bigger puzzle. We've been working with governments around the world, with security experts around the world, with civil society around the world to share information about threats that we see. And we bring those together and we put our best intelligence investigators on it to find that kind of activity on our platform and take it down. So, it's only by working with other people that we can solve these kinds of problems.
KENT: What is it like inside headquarters as you're detecting this activity? You're deploying so much effort, yet forces continue to try to violate the community standards.
CHAKRABARTI: I think it's just the reality of working in this space. This is an area where there are determined and sophisticated adversaries, who are always going to try to circumvent measures that we put in place. And that's precisely why we've made such a massive investment in this space. You know, we've really grown our safety and security team from 10,000 people a year ago to 20,000 people today. And so that is the kind of commitment that we're showing to this. And it's really to the point that we've even said before, that it's going to impact our profitability, because we take our responsibility so seriously that we're willing to make that level of investment.
KENT: When you add 10,000 people to the security team, what exactly are their roles? What are they doing?
CHAKRABARTI: They're doing a lot of things. This is a huge cross-company effort that requires people of many different disciplines coming together to solve these kinds of problems. And so we have people who are trained intelligence investigators. We also have people who are some of the best computational data scientists in the world, who can find needles in a haystack using advanced artificial intelligence. And so those are the kinds of roles that we all put together in order to do this, because, really, that's what it takes.
KENT: How much do you work with your counterparts at Twitter and Google and other platforms to coordinate a response to fight bad actors?
CHAKRABARTI: As an example, with the takedowns that we did just a few weeks ago, we've been working with our industry partners on this, exchanging information. And that has really yielded a lot of benefits. The benefit that we see is we are able to get more information about particular bad actors and then we're able to take them off of the platform. And we can similarly, reciprocally, provide that kind of help to others in the industry.
KENT: Facebook is building a war room, a "situation room," a rapid response team of sorts in the final weeks leading up to the election. What is that going to look like? Why are you doing that?
CHAKRABARTI: We have many measures that we've put in place to try to prevent problems: the political ad transparency, blocking fake accounts, combating foreign interference, and preventing the spread of misinformation. But we know we have to be ready for anything that happens. And so that's why we've been building this war room, a physical war room [with] people across the company, of all different disciplines, who are there. So, as we discover problems that may come up in the hours leading up to the election, we can take quick and decisive action.
KENT: Is Facebook a safer platform now in 2018 compared to the lead-up to the 2016 election?
CHAKRABARTI: I believe we've been making very rapid progress in all of these areas: Combating foreign interference, making sure that we can block and remove fake accounts, stopping the spread of misinformation and fake news on the platform, and then also bringing more transparency to political ads. I think we are in a much better place than we were in 2016. But it is an arms race. And so that's why we're remaining ever vigilant, laser focused to make sure that we can stay ahead of new problems that emerge. This is going to be a never-ending process and that's exactly why we're investing so much in both people and technology -- to be as prepared as possible for the midterms.
KENT: There has been a very vocal critic out there [former Google design ethicist and co-founder of the Center for Humane Technology, Tristan Harris] who said that 2016 at Facebook left behind a living, breathing crime scene. What's your response to that for 2018? Will it not be the case?
CHAKRABARTI: In 2016, we saw new kinds of threats that we hadn't seen before. And that's why we've been mobilizing this huge effort across the company. Every single corner of this company is just remaining laser focused and taking our role really seriously, to make sure that this can be a safe place for political discourse.
KENT: Do you sense that the efforts now that Facebook is deploying to fight disinformation and fake news are working?
CHAKRABARTI: I think we're much more effective than we used to be, in this regard.