Facebook lost close to $50 billion in market value in the days after reports emerged that user data had been compromised by Cambridge Analytica. The company is under scrutiny by government authorities in the US and UK looking to establish whether Facebook, or the data it generates, played a role in the Brexit referendum or the 2016 US election that resulted in a win for Donald Trump.
Facebook's CEO Mark Zuckerberg finally spoke with CNN's Laurie Segall yesterday about how he plans to fix the situation that has caused a deficit of public trust in the world's largest social media platform.
*Transcript provided by CNN and edited for clarity
LAURIE SEGALL: I want to start with just a basic question, Mark. What happened? What went wrong?
MARK ZUCKERBERG: So this was a major breach of trust, and I’m really sorry that this happened. You know, we have a basic responsibility to protect people's data. And if we can't do that, then we don't deserve to have the opportunity to serve people.
So, our responsibility now is to make sure that this doesn't happen again. And there are a few basic things that I think we need to do to ensure that.
One is making sure that developers like Aleksandr Kogan, who got access to a lot of information and then improperly used it, just don't get access to as much information going forward. So, we are doing a set of things to restrict the amount of access that developers can get going forward.
But the other is we need to make sure there aren't any other Cambridge Analyticas out there, right, or folks who have improperly accessed data. So, we're going to go now and investigate every app that has access to a large amount of information from before we locked down our platform. And if we detect any suspicious activity, we're going to do a full forensic audit.
SEGALL: Facebook has asked us to share our data, to share our lives on its platform and wanted us to be transparent. And people don't feel like they've received that same amount of transparency. They're wondering what's happening to their data. Can they trust Facebook?
ZUCKERBERG: Yes. So one of the most important things that I think we need to do here is make sure that we tell everyone whose data was affected by one of these rogue apps, right? And we're going to do that. We're going to build a tool where anyone can go and see if their data was a part of this. But --
SEGALL: So the 50 million people that were impacted, they will be able to tell if they were impacted by this?
ZUCKERBERG: Yes. We're going to be even conservative on that. So, you know, we may not have all of the data in our system today. So, anyone whose data might have been affected by this, we're going to make sure that we tell.
And going forward when we identify apps that are similarly doing sketchy things, we're going to make sure that we tell people then too, right? That's definitely something that looking back on this, you know, I regret that we didn't do at the time. And I think we got that wrong. And we're committed to getting that right going forward.
SEGALL: I want to ask about that because when this came to light, you guys knew this a long time ago, that this data was out there. Why didn't you tell users? Don't you think users have the right to know that their data is being used for different purposes?
ZUCKERBERG: So yes. And let me tell you what actions we took.
So, in 2015, some journalists from "The Guardian" told us that they had seen or had some evidence that data -- that this app developer, Aleksandr Kogan, who built this personality quiz app and a bunch of people used it and shared data with it, had sold that data to Cambridge Analytica and a few other firms. And when we heard that, and that's against the policies, right? You can't share data in a way that people don't know or don't consent to.
We immediately banned Kogan's app. And, further, we made it so that Kogan and Cambridge Analytica and the other folks with whom we shared the data -- we asked for a formal certification that they had none of the data from anyone in the Facebook community, that they deleted it if they had it, and that they weren't using it. And they all provided that certification.
So, as far as we understood around the time of that episode, there was no data out there.
SEGALL: So, why didn't Facebook follow up? You know, you say you certified it. I think why wasn't there more of a follow-up? Why wasn't there an audit then? Why does it take a big media report to get that proactive approach?
ZUCKERBERG: Well, I mean, I don't know about you, but I was -- I’m used to when people legally certify that they're going to do something, that they do it. But I think that this was clearly a mistake in retrospect.
SEGALL: Was it putting too much trust in developers?
ZUCKERBERG: I -- I mean, I think it did. That's why we need to make sure we don't make that mistake ever again, which is why one of the things that I announced today is that we're going to do a full investigation into every app that had access to a large amount of data from around this time, before we locked down the platform.
And we're now not just going to take people's word for it and -- when they give us a legal certification, but if we see anything suspicious, which I think there probably were signs in this case that we could have looked into, we're going to do a full forensic audit.
SEGALL: How do you know there aren't hundreds more companies like Cambridge Analytica that are also keeping data that violates Facebook’s policies?
ZUCKERBERG: Well, I think the question here is do -- are our app developers, who people have given access to their data, are they doing something that people don't want? Are they selling the data in a way that people don't want, or are they giving it to someone that they don't have authorization to do?
And this is something that I think we now need to go figure out, right? So for all these apps --
SEGALL: That's got to be -- I got to say, that's got to be a really challenging ordeal. How do you actually do that? Because you talk about it being years ago, and then you guys have made it a bit stricter for that kind of information to be shared. But backtracking on it, it's got to be difficult to find out where that data has gone and what other companies have shady access.
ZUCKERBERG: Yes. I mean, as you say, I mean, the good news here is we already changed the platform policies in 2014. Before that, we know what the apps were that had access to data. We know how much -- how many people were using those services, and we can look at the patterns of their data requests.
And based on that, we think we'll have a pretty clear sense of whether anyone was doing anything abnormal, and we'll be able to do a full audit of anyone who is questionable.
SEGALL: Do you expect -- do you have any scale or any scope of what you expect to find, anything in the scope of what happened with Cambridge Analytica where you had 50 million users?
ZUCKERBERG: Well, it's hard to know what we'll find, but we're going to review thousands of apps. So, this is going to be an intensive process, but this is important. I mean, this is something that in retrospect we clearly should have done up front with Cambridge Analytica. We should not have trusted the certification that they gave us, and we're not going to make that mistake again.
If you told me in 2004 when I was getting started with Facebook that a big part of my responsibility today would be to help protect the integrity of elections against interference by other governments, you know, I wouldn't have really believed that was going to be something that I would have to work on 14 years later.
But we're here now. And we’re going to make sure that we do a good job at it.
SEGALL: Have you done a good enough job yet?
ZUCKERBERG: Well, I think we will see. But, you know, I think what's clear is that in 2016, we were not as on top of a number of issues as we should have been, whether it was Russian interference or fake news.
But what we have seen since then is, you know, a number of months later, there was a major French election, and there we deployed some A.I. tools that did a much better job of identifying Russian bots and basically Russian potential interference and weeding that out of the platform ahead of the election. And we were much happier with how that went.
In 2017, last year, during a special election for the Senate seat in Alabama, we deployed some new A.I. tools that we built to detect fake accounts that were trying to spread false news, and we found a lot of different accounts coming from Macedonia.
So, you know, I think the reality here is that this isn't rocket science, right? I mean, there's a lot of hard work that we need to do to make it harder for nation states like Russia to do election interference, to make it so that trolls and other folks can't spread fake news.
But we can get in front of this, and we have a responsibility to do this not only for the 2018 midterms in the U.S., which are going to be a huge deal this year, and that's just a huge focus for us. But there's a big election in India this year. There's a big election in Brazil. There are big elections around the world, and you can bet that we are really committed to doing everything that we need to, to make sure that the integrity of those elections on Facebook is secured.
SEGALL: I can hear the commitment. But since I got you here, do you think that bad actors are using Facebook at this moment to meddle with the U.S. midterm elections?
ZUCKERBERG: I’m sure someone's trying, right? I’m sure there's, you know, V2 of all -- a version two of whatever the Russian effort was in 2016. I’m sure they're working on that, and there are going to be some new tactics that we need to make sure that we observe and get in front of.
SEGALL: Speaking of getting in front of them, do you know what they are? Do you have any idea?
ZUCKERBERG: Yes. And I think we have some sense of the different things that we need to get in front of.
SEGALL: Are you seeing anything new or interesting?
ZUCKERBERG: Well, what we see -- what we see are a lot of folks trying to sow division, right? So, that was a major tactic that we saw Russia try to use in the 2016 election. Actually most of what they did was not directly, as far as we can tell from the data that we've seen. It was not directly about the election but was more about just dividing people.
And, you know, so they'd run a group, you know, for pro-immigration reform, and then they'd run another group against immigration reform and just try to pit people against each other. And a lot of this was done with fake accounts that we can do a better job of tracing and using A.I. tools to be able to scan and observe a lot of what is going on. And I’m confident that we're going to do a much better job.
SEGALL: Lawmakers in the United States and the U.K. are asking you to testify. Everybody wants you to show up. Will you testify before Congress?
ZUCKERBERG: So, the short answer is I’m happy to if it's the right thing to do. You know, Facebook testifies in Congress regularly, on a number of topics, some high-profile, and some not. And our objective is always to provide Congress, which has this extremely important job, with the most information that we can.
We see a small slice of activity on Facebook, but Congress gets to, you know, have access to the information across Facebook and all other companies and the intelligence community and everything. So, what we try to do is send the person at Facebook who will have the most knowledge about what Congress is trying to learn. So, if that's me, then I am happy to go.
What I think we found so far is that typically, there are people whose whole job is focused on an area. But I would imagine at some point that there will be a topic where I am the sole authority on, and it will make sense for me to do it. And I’d be happy to do it at that point.
SEGALL: You are the brand of Facebook. You are the name of Facebook. People want to hear from you.
ZUCKERBERG: That's why I’m doing this interview. But, you know, I think there is -- the question in a -- a question of congressional testimony is what is the goal, right? And that's not a media opportunity, right? Or at least it's not supposed to be.
The goal there, I think, is to get Congress all of the information that they need to do their extremely important job, and we just want to make sure that we send whoever is best informed at doing that.
I agree separately that there's an element of accountability where I should be out there doing more interviews. And, you know, as uncomfortable as it is for me to do, you know, a TV interview, it’s -- I feel that this is an important thing that as a discipline for what we're doing, I should be out there and being asked hard questions by journalists.
SEGALL: Knowing what you know now, do you believe Facebook impacted the results of the 2016 election?
ZUCKERBERG: Oh, that's -- that is hard. You know, I think that it is -- it's really hard for me to have a full assessment of that. You know, it’s -- the reality is -- well, there were so many different forces at play, right? The organic posting that people did, the get out the vote campaigns that we ran, the pages that both candidates ran, the advertising that they did, I’m sure that all of that activity had some impact. It's hard for me to assess how much that stacked up compared to all the campaign events and advertising that was done off of Facebook and all the other efforts.
And I think it's also a -- hard to fully assess the impact of that organic activity, which we're actually quite proud of, right?
SEGALL: Also the bad actors, too.
ZUCKERBERG: And the bad stuff. That's what I’m saying, yes. So I think it's hard to fully assess.
SEGALL: Given the stakes here, why shouldn't Facebook be regulated?
ZUCKERBERG: I actually am not sure we shouldn't be regulated. You know, I think in general, technology is an increasingly important trend in the world, and I actually think the question is more what is the right regulation rather than yes or no, should it be regulated?
SEGALL: What’s the right regulation?
ZUCKERBERG: Well, there are some basic things and I think that there are some intellectual debates. On the basic side, you know, there are things like ad transparency regulation that I would love to see, right? If you look at how much regulation there is around advertising on TV, in print, you know, it's just not clear why there should be less on the Internet, right? You should have the same level of transparency required.
And, you know, I don't know if a bill is going to pass. I know a couple of senators are working really hard on this. But we're committed and we've actually already started rolling out ad transparency tools that accomplish most of the things that are in the bills that people are talking about today, because we just think that this is an important thing. People should know who is buying the ads that they see on Facebook, and you should be able to go to any page and see all the ads that people are running to different audiences.
SEGALL: How has being a father changed -- changed your commitment to users, changed your commitment to their future and what a kinder Facebook looks like?
ZUCKERBERG: Well, I think having kids changes a lot. I used to think that the most important thing to me by far was, you know, my having the greatest positive impact across the world that I can. And now, you know, I really just care about building something that my girls are going to grow up and be proud of me for. And I mean, that's what is kind of my guiding philosophy at this point is, you know, when I come and work on a lot of hard things during the day and I go home and just ask, will my girls be proud of what I did today.