© 2024 Kansas City Public Radio
NPR in Kansas City

Should Facebook Users Trust CEO Mark Zuckerberg's Apologetic Tone?


GREENE: Mark Zuckerberg has taken an apologetic tone, saying he didn't do enough to shield user data from political strategy group Cambridge Analytica. Here he is on CNN last week.


MARK ZUCKERBERG: We have a basic responsibility to protect people's data. And if we can't do that then we don't deserve to have the opportunity to serve people.

GREENE: We're going to talk now with someone Zuckerberg has known from the beginning. His Harvard roommate Chris Hughes, who helped found Facebook, is on the line. Chris Hughes, thanks for being here.

CHRIS HUGHES: Thanks for having me.

GREENE: So you've known Mark Zuckerberg since you two were in college together, and you know his ambitions. What do you make of the apologetic tone? Should Facebook users trust him?

HUGHES: Well, I think it's time for the apology, and in many ways, it's overdue. I think this is a real turning point for Facebook and for Mark and for the entire leadership team. I mean, the Cambridge Analytica scandal, in some sense, is really just the tip of the iceberg. I mean, when it comes to foreign powers hacking the election, the way the news feed has been configured to reward the most outrageous of voices, Facebook has failed on multiple counts over the past few years. And it's time to be honest about that and also be a little bit more transparent about what's happened so that it can be corrected in the future. So I think it's well overdue.

GREENE: I know you haven't been working at the company for many years now, but as one of its founders there from the beginning when it was designed, do you take some personal responsibility for the failings?

HUGHES: Well, I do. You know, like you said, I haven't been at the company in over a decade. But, you know, I was part of the founding team in 2004. And I'm proud of a lot of the early work that made Facebook what it is. I do also think that, you know, Facebook has a responsibility to its users to protect their data and not just to protect it but make sure that people understand what data they're producing and whether they own it, who has access to it and when. And Facebook has failed them, you know, across the board. And the question now is not just what - you know, what can be done to ensure the security of that data. It's, how can we use this moment to ensure that we're having a broader cultural conversation about the data that we're all creating on Facebook, Google, Amazon, through our phones, et cetera and make sure that the companies are held accountable for it? So it's not just a kind of voluntary regulation but potentially something that government takes a look at, as well.

GREENE: Well, you said Facebook has responsibilities to serve its users but also its shareholders, right? I mean, that requires sharing user data so companies can target their advertising and Facebook can make money. I just want to play a little bit of tape here from Tim Wu. We spoke to him several days ago. He's a former adviser to the Federal Trade Commission.

TIM WU: I think there's a sort of intrinsic problem in having for-profit entities with this business model in this position of so much public trust because they're always at the edge because their profitability depends on it.

GREENE: Is there any way for Facebook to be profitable and to grow without being on the edge? Because user information has to be shared.

HUGHES: Well, I think Tim's exactly right. It's in Facebook's interests to try to capture as much attention as possible from its user base. The more times that we pick up our phone each day, the more that we check Facebook, the more engaged we are there. And that attention is in turn what is sold to advertisers. Now, Facebook will say, well, that's what makes the platform free for users to use. And while that's true, there's nothing written in stone about that. I don't know if a lot of Facebook users truly understand that their attention is being boxed up and sold to advertisers. And I do think there's a big question about whether or not users should have the right to say no to that, or the right to say, actually, I don't prefer that business model. I'd rather pay a few dollars a month or take a different approach. I don't think there's anything written in stone about Facebook's current business model. And what's more, I think we're entering a moment of reckoning where these kinds of big, fundamental questions are the ones they're going to need to answer.

GREENE: And a lot of it has to do with politics. I want to ask you - I mean, you went from Facebook to President Obama's campaign as a digital strategist, and some go as far as crediting you with inventing how to use social media in politics.

HUGHES: (Laughter) I don't know about that.


GREENE: Well, but in 2012 - I mean, the campaign collected data on Facebook from unwitting friends of people who had downloaded a campaign app. I mean, doesn't that show us that Facebook users are always sort of on the edge and close to maybe having their data taken in an improper way?

HUGHES: Well, I think it's important not to make a false equivalency between what happened with the Trump campaign in 2016 and what happened in 2012. I mean, in 2016, Facebook users were misled by an app developer into sharing their own data and the data of their friends, which Cambridge Analytica then - it seems potentially illegally - used in conjunction with the Trump campaign. In 2012...

GREENE: But in 2012, wasn't there an app that - I mean, people who were friends of people who had the app were having some of their data taken by the Obama campaign.

HUGHES: Users did volunteer in 2012 to talk about their support of Obama and to recruit their friends. However, the point that you're making, I think, is right - that in both cases, users didn't always understand exactly what they were sharing and with whom. And that is a problem across all of these companies: on average, we check our phones over 100 times a day, and we're sharing all kinds of information. And yet a lot of the time, we don't know exactly who's seeing it, whether that's in politics or in business or elsewhere. So I think the underlying point that you're making is true, and I think that's going to have to change.

GREENE: Chris Hughes helped found Facebook and was Mark Zuckerberg's college roommate at Harvard. Thanks so much, Chris. We appreciate it.

HUGHES: Thanks for having me.

Transcript provided by NPR, Copyright NPR.
