TERRY GROSS, HOST:
This is FRESH AIR. I'm Terry Gross. You'd be hard-pressed to name an American company that's more distrusted and yet more influential today than Facebook. The social media site has been rocked by scandals involving the misuse of its users' personal information and harsh criticism of its role in the 2016 election. And yet it remains huge, with nearly 3 billion users, and profitable, with annual earnings in the billions of dollars. Our guest, Steven Levy, is a veteran technology journalist who's been reporting on Facebook for years and has written a new in-depth history of the company. Facebook's founder and CEO Mark Zuckerberg gave Levy nine interviews and permission to talk to many other present and former employees of the company. Levy writes that virtually every problem Facebook has confronted since 2016 is a consequence of its unprecedented mission to connect the world and its reckless haste to do so.
Steven Levy has written for Rolling Stone, Harper's Magazine, The New York Times and The New Yorker. He's now editor at large for Wired and the author of seven previous books. He spoke with FRESH AIR's Dave Davies about his new book, "Facebook: The Inside Story."
DAVE DAVIES, BYLINE: Well, Steven Levy, welcome to FRESH AIR. You have covered technology for a long time. When did you decide you had to make this a book-length exploration of Facebook?
STEVEN LEVY: I could pinpoint that pretty precisely. I think it was the end of August of 2015. Mark Zuckerberg posted on his Facebook feed that a billion people had signed on to Facebook that day, and I realized that had never happened before. I'd covered Facebook for a long time before that, but I realized that this is something utterly new, enabled by technology and whatever this company did. And I had to write about it and explain how that happened and what it meant and how they dealt with it.
DAVIES: Right. So what kind of commitment did Zuckerberg and others at Facebook make to cooperating with you?
LEVY: Well, it took me a few months to get them to this point, but eventually, they agreed to give me access to their employees, including Mark and Sheryl Sandberg, his chief operating officer. And this is also very important. They would give a go-ahead to former employees who wouldn't talk to me unless Facebook said it was OK. There were no strings attached. They didn't get to read the manuscript. The only thing that I said was the interviews I did for the book would be embargoed for the book. So I wouldn't be able to take something controversial that someone said and just write it in an article for Wired the next day.
DAVIES: All right. Everything on the record?
LEVY: Well, they were allowed to go on background or off the record if they wanted to go there. And some of the interviews I did, particularly with some people who may not be at Facebook anymore, were either on background or not for attribution. So some people were a little worried about direct quotes against Facebook being in the book.
DAVIES: Right. That makes sense. But Zuckerberg himself - was all that on the record?
LEVY: I don't think Mark once said, this is off the record. I think pretty much everything he says is on the record - the PR people can't say, Mark, don't say that. At least they don't. And he calls his own shots. Some other people would say, oh, let me say this off the record. And I would try to dissuade them, but a couple of people did that. And with one in particular, in the course of our interviews, I kept saying, hey; this has got to be on the record. I really ought to use this.
DAVIES: Right. What's he like in person? What was the relationship like?
LEVY: We got to a pretty good place where I think he was probably as candid with me as he's been with any journalist, although I don't think he ever forgot that he was talking to a journalist. He has grown over the years as an interviewee. When I first met him and started asking him questions - this was 2006 - he would just stare at me blankly. And it took a while before I could get any answer out of him. And by the time I started this book, he had obviously become more comfortable with this. He'd been a CEO for many years and a CEO of a public company, so he understood how to give answers to journalists. And he's a very curious person. People describe him as a learning machine. So sometimes he'll ask questions of me, and I think he did this with other journalists, too.
DAVIES: Yeah. Can you think of an example of him asking you a question?
LEVY: Well, I think one of the most interesting examples - and this sort of casts a light on how Facebook is dealing with its current issues - it was 2018, and it was the run-up to their big developer conference they called F8, which has the resonance of fate. And he was explaining to me how he was going to spend half of his keynote speech not apologizing exactly but saying that Facebook knew it made mistakes and we're going to win back your trust. And the other half was going to be about how - well, we have to introduce new things, too, because we can't stand still. So on one hand, it was saying, hey; don't worry; we're fine, and on the other hand saying, here are the new, disturbing things we're doing.
And he said, well, we're going to introduce a product called Facebook Dating. And I said, really? Don't you think that's a little off-tone considering people are so concerned about the data they give you? This is just basically a few weeks after Cambridge Analytica, which was the biggest scandal in the company's history, involving the release of data. And he said, well, Facebook has always been sort of a secret dating site of some kind, and we don't think it's that big a deal. And then he went on. And then a few minutes later, he just stopped and he said, hey; do you really think that's going to be a big problem for us?
DAVIES: What'd you say?
LEVY: I said, yeah.
DAVIES: (Laughter) And it didn't happen or did happen? It's in the book, but I can't remember.
LEVY: Well, as one would expect, people said, what's going on here? At this moment, they're introducing a dating site? They introduced it in some smaller countries first. And then a year later, they rolled it out to the United States, and you can now date on Facebook.
DAVIES: One of the things you describe is that after Facebook got going for a few years, there was a decision to focus on growth - more users, more engagement among the users they had. What motivated this? I mean...
LEVY: Well...
DAVIES: It's sort of natural for every business, I suppose, but...
LEVY: Sure.
DAVIES: This was sort of a different business.
LEVY: Right, right. Well, every business wants to grow. But what set Facebook apart was its relentless focus on growth. Mark always believed in it, but in 2008, an executive named Chamath Palihapitiya said, I want to take this a step further. I want to start an internal group called the Growth Circle, and we would just do anything we can to make Facebook grow faster and faster and faster and get it to a billion people really, really quickly. And this was unthinkable at the time, but Chamath was very smart, and he gathered a team of the best people at Facebook. He got a few people from outside. And they were almost a wrecking crew within Facebook. They sat in a different place than everyone else, and they did things that really pushed the boundaries of what was acceptable in Silicon Valley practice, pushing growth even further than it had gone before.
DAVIES: What were some of the things that they did?
LEVY: Well, one thing was - and this was kind of standard; they just did a really great job of it - to make Facebook rank higher and higher when you searched for it in Google. And, you know, Facebook profiles were available on Google, so when people searched for themselves, a Facebook profile would come up with your name on it, even though you weren't on Facebook. And they would encourage you to take a look and sign up. And another thing was sometimes they would - if you signed up...
DAVIES: Wait one second. You said that it would show a profile of you even if you weren't a user, weren't registered with Facebook?
LEVY: That's right. This is a controversial area of Facebook. Mark, you know, in 2006, kept this notebook, and he would speculate in it about something called dark profiles. And Facebook didn't quite implement it the way he outlined it, which was almost like a Wikipedia page that your friends would start about you even if you weren't on Facebook. That didn't happen.
But there was apparently something - I talked to a couple of people from early Facebook who said that if someone tagged you in a photo, Facebook would keep a stub that you would then bring to life if you actually came and signed up. And Chamath told me that, yeah, one of the things they would do was have this - he referred to it to me as a dark profile - that would show up. And if you signed up, then they would immediately try to populate your newsfeed so you would have a reason to stay on Facebook.
DAVIES: Right. And so if I weren't even registered with Facebook but Facebook collected information about me - because there's information on the Internet - someone does a Google search and finds my name, then suddenly there's me connected with Facebook. I see that and I think - man, I should sign up? (Laughter).
LEVY: That's what Chamath told me - oh, yeah. And, you know, so...
DAVIES: Wow.
LEVY: ...Facebook has since said that they never use shadow profiles or dark profiles for reasons like that. And I'm not quite sure what they do now, you know, but they do keep tabs on people for security reasons and some other things. But they say they don't serve advertising based on anything they keep. So it's sort of a gray area where I found some contradictions when I was doing the book.
DAVIES: They also translated the site into foreign languages so that it would grow internationally, too. Right?
LEVY: That's right. And you know, they were very aggressive about translations. They didn't hire translators. They let people - users in that country - translate it themselves. And they would sometimes refine it, especially in the big countries. But in obscure dialects of certain languages, people would just, you know, translate for themselves. And in some smaller countries, Facebook would be there in the native language when no one at Facebook spoke the language. There was no way for Facebook to police what went on in that country.
DAVIES: Right. And in some cases, it became a major medium of information and was abused - right? - by people spreading rumors and hate speech.
LEVY: That's right. Yeah, the prime example is Myanmar, where Facebook went in and became very popular. And then it became even more popular when Facebook wound up giving it away - if you had a mobile plan, you wouldn't pay for the Facebook pages you used. So it became almost, like, synonymous with the Internet in Myanmar and some other places. Meanwhile, Facebook had at first none and then very few people who could read what was going on there, and it couldn't police it. And it was abused. People would put up false content that spurred people to violence, literally. And it wasn't until 2015 that Facebook even translated its book of rules into Burmese.
DAVIES: Steven Levy is editor at large for Wired. His new book is "Facebook: The Inside Story." We'll talk more after a break. This is FRESH AIR.
(SOUNDBITE OF ALEXANDRE DESPLAT'S "SPY MEETING")
DAVIES: This is FRESH AIR, and we're speaking with Steven Levy. He is editor at large for Wired. He is a veteran technology writer, and he has written a deep dive into Facebook. His new book is called "Facebook: The Inside Story."
All right. Let's talk about the 2016 presidential campaign. How much did Facebook know, as that campaign progressed, about Russian efforts to plant fake or provocative stories in the newsfeed and also to put bogus information in the ads they were buying?
LEVY: Well, the Russian misinformation, which we could talk about, is part of a number of things that happened in the 2016 election that Facebook was involved in. The first thing was, you know, the proliferation of fake stories. These so-called fake news stories weren't necessarily cooked up by the Russians. Only a small percentage of those turned out to be a part of this big Russian disinformation campaign that took place on Facebook. A lot of it was done for financial gain - a lot of it from this small town in Macedonia, as it turned out, where people would make up fake stories or take a fake story that some obscure blogger posted and circulate it on Facebook.
And because of decisions made earlier in Facebook's history - around 2008 and 2009, Mark had a Twitter obsession, and he made Facebook friendlier to things that went viral on the system - these fake news stories, things like, you know, the pope endorsed Donald Trump or Hillary Clinton was involved in a sex ring in a pizzeria, would go viral on Facebook and be more popular than legitimate news stories.
And Facebook had been warned about this sort of phenomenon. And particularly in the later stage of the presidential campaign, people were saying - hey, this has got to stop. But the company made a decision not to stop it because they felt it was part of people's expression to be able to post what they wanted. And Facebook didn't want to be in the position of a referee. Now, that decision, in part, was made by the head of Facebook's Washington office, who was a lifelong Republican and who saw his job, according to people who worked with him, as protecting the Republicans.
So that went on till the end of the election, and fake news proliferated on Facebook.
DAVIES: And I think it's worth making the point that one of the reasons people in Macedonia found they could make money by doing this was that, as part of its drive for growth, Facebook had tweaked its algorithm so that the most addictive material would come up at the top of your newsfeed. So there was an incentive for viral stuff, whether it was true or not.
LEVY: Right. Engagement was part of the formula for what would appear on your newsfeed. So if a lot of people shared a post, it would be ranked higher and it would appear in more people's newsfeeds.
DAVIES: Right. So what about the ads that were coming through?
LEVY: So - the fact is that Facebook offered help to political candidates who wanted to use Facebook better. And one side did a much better job of it than the other, and that was Donald Trump's team. And they took advantage of everything you could on Facebook. They spent much, much more on Facebook than the Clinton campaign did. They accepted the help that Facebook offered both sides, and they just built the heck out of their campaign around Facebook. They would do things like create many, many variations of an ad directed to different kinds of people just to see what clicked, and the Clinton campaign did nothing like that.
So the people at Facebook watched this with kind of awe. They were saying - wow, these people on the Trump campaign are really doing a great job with our platform. They didn't do anything about it because - you know, they wanted to be fair to both sides. And they weren't too bothered by it because they thought - well, Clinton's going to win anyway, so this is just an interesting phenomenon about how someone uses our platform.
DAVIES: And the company was making money from every ad, right?
LEVY: Oh, sure. Yeah, it was making money. It wasn't a giant part of their revenues, but political ads in general were a significant source of revenue.
DAVIES: Right. And of course, it's worth mentioning that that microtargeting of ads based on Facebook's information about its users was its chief selling point, not just for political ads but for everything. I mean, you could really target an ad at a specific kind of customer.
LEVY: Right. And it sort of changed what the feel-good pitch of Facebook advertising was. You know, what they would always say when you complained about microtargeting was - hey, this helps us serve more relevant ads to people. If you're interested in a certain musical star, we're going to let you know when that person has an album out or is appearing in your area because the advertisers will know where to find you.
This is something a little different because Facebook knows so much about you that an advertiser could target someone who might be vulnerable to a certain kind of pitch - one that might make you change your mind about something or even deter you from voting. So this isn't part of the feel-good vibe of Facebook advertising that it presents to the world.
DAVIES: So Trump wins the election, which was not well-received by a lot of the staff at Facebook.
LEVY: No, people were in tears when it happened, and they had a big meeting the day after the election.
DAVIES: Tell us about the meeting.
LEVY: Well, people wanted to know for the first time - gee, did we have a hand in this? Were we, in part, responsible for this electoral outcome that we, meaning a lot of the people at Facebook, didn't want to see? And there was some soul-searching among them. And a couple days afterwards, though, Mark Zuckerberg, speaking at a conference in Half Moon Bay, said he thought it was a crazy idea to think that Facebook influenced the election - a statement he later had to walk back.
DAVIES: Right. And eventually, real information came out. How did Zuckerberg's attitude change then?
LEVY: Well, I think he realized that, one, it sounded blithe. I actually was in the room when he said that, and it really didn't sound that blithe - it was part of a longer, more thoughtful answer, and it didn't really bring the room to a standstill. But we learned more in the weeks following about the abuse of Facebook, including the Russian disinformation campaign that took place through ads and other posts. And once we learned those things, the idea that Facebook influenced the election didn't seem like such a reach.
DAVIES: Yeah. How big were the numbers? How many people were - how many users got this misinformation?
LEVY: Well, the Russian disinformation numbered in the hundreds of thousands - but again, not a big percentage at all, a very tiny percentage of the stuff that people saw on Facebook. Yet it had some really disturbing aspects to it.
The Russians would do things - like, they would be able to find people who were against immigration and tell them - hey, there's this rally happening somewhere in Texas, you know, that's pro-immigration. You ought to go there and protest against it. And then they would inform people who were on the other side about the same rally, which was a nonexistent rally that the Russians were trying to create to bring two sides together that might fight each other. Basically, they were just trying to sow dissent and bad feelings to make people think that the system was really screwed up and, in some cases, deter them from voting.
GROSS: We're listening to the interview FRESH AIR's Dave Davies recorded with Steven Levy, editor at large for Wired and author of the new book "Facebook: The Inside Story."
After a break, they'll talk about the damage to Facebook from the Cambridge Analytica scandal and about Mark Zuckerberg's plans for the future of the company. Later, Ken Tucker will review a newly released Bryan Ferry concert recorded live at the Royal Albert Hall in 1974. I'm Terry Gross, and this is FRESH AIR.
(SOUNDBITE OF WAYNE HORVITZ AND THE ROYAL ROOM COLLECTIVE ENSEMBLE'S "A WALK IN THE RAIN")
GROSS: This is FRESH AIR. I'm Terry Gross. Let's get back to the interview Dave Davies recorded with Wired editor at large Steven Levy about the social media giant Facebook. Facebook CEO Mark Zuckerberg gave Levy nine interviews and permission to talk to many other present and former employees of the company for Levy's new book, "Facebook: The Inside Story." When we left off, they were talking about the damage to Facebook's reputation from its role in the 2016 presidential election.
DAVIES: The criticisms about the fake news and the ad campaigns, some of them from foreign sources, were almost overshadowed by the Cambridge Analytica scandal. This really grew out of a decision Facebook had made to make its product not just a social networking site but a platform so that outside software developers could use Facebook data to create apps, software products on Facebook. This was kind of a fateful decision, wasn't it?
LEVY: That's right, and this took place really early in Facebook's history, in 2007. It was a big leap that Facebook all of a sudden said, we want to be kind of the next operating system in the world, and this will be a social operating system. The Internet should be built around people, and we're the place that's going to host that. And you can't say the platform was an unmitigated failure because it really lifted Facebook to the top rung of technology companies. All of a sudden, everyone was talking about Facebook in a way they weren't before. And a couple of months afterwards, I wrote a cover story about Facebook for Newsweek, where I was working then.
But the problem with it was, first, in order to do this, you had to hand over personal data to outsiders - to the software developers who were writing applications for this Facebook platform. And the platform itself had its dreams dashed when mobile phones became popular and people used those as their operating systems. So your iPhone apps or your Android apps would be the platform that you used, and the developers turned their attention to writing applications for phones. The Facebook platform still persisted, but it became more of an exchange of information between the developers and Facebook, with Facebook giving a lot of the information to the developers.
In 2010 - and this is where I'd say Cambridge Analytica really started - Facebook said, we'll give more information than ever to the developers. So when you signed up as a user for one of these apps, the third party would get not only the information that you posted on Facebook but the information that your friends posted on Facebook. You would be giving away your friends' likes, which are very revealing, and their relationship status, sometimes their political views - all flowing out of the hands of Facebook into the hands of these developers, who could use the data for their apps and were told, you can't sell this or give it to anyone else. But Facebook didn't really have strong enforcement to make sure that happened.
DAVIES: And so Cambridge Analytica ends up getting the data of tens of millions of Facebook users.
LEVY: That's right. There was an academic researcher who followed Facebook's rules in getting the information but then broke Facebook's rules in licensing it to this company called Cambridge Analytica, which was run by a British military consultancy that had made a partnership with a big funder of the far right in the United States.
DAVIES: Right, the Mercers.
LEVY: Right.
DAVIES: Right. You say that, to this day, it isn't clear whether the company's election efforts used Facebook profiles. Is that right? I thought that was accepted fact.
LEVY: Well, they had the data handed over, but the Trump campaign, which worked with Cambridge Analytica, said that they really didn't use it much as a data source - they helped with television ads, and they liked some of the people who worked there. It is fuzzy because, on the other hand, the head of Cambridge Analytica was boasting about the information they used. So I think it is a big mystery the degree to which that data was used in the election, but we do know the Trump campaign did use a lot of data and merged it with other databases.
I think the shock to people was that this information got handed over to this company that no one had ever heard of that worked for Trump. And that became Facebook's biggest scandal, ironically, even though some of this information had been published in 2015 by The Guardian, the same place that returned with the scoop in 2018. But that was pre-Trump, and it wasn't a big deal then.
DAVIES: So this information about the 2016 campaign and then a subsequent hack of, like, 50 million user accounts sent the company's public image into a real nosedive. And Mark Zuckerberg went before Congress, and the company promised serious steps to change things. What are they doing?
LEVY: Well, they've done a number of things. For one thing, they give you more information. When you see something that looks like it's fake on Facebook, they have fact-checkers go over some of these claims. And when it turns out they're not factual, they don't take them off, but they might downrank them in the newsfeed, so fewer people might see them. They give you a chance to mouse over and see a little more information about the publication where this was printed. Maybe it's, like, a phony publication that doesn't exist outside of Facebook.
DAVIES: Now, those aren't ads. Those are items posted to the newsfeed, right? So...
LEVY: Right. Yeah. And also, during elections, they monitor things. They look for the signals of disinformation campaigns and try to shut them down. So they're doing certain things that would have stopped some of the tricks in the 2016 election. And in the couple of elections since - the midterm in 2018, elections in France and other places - they've done a better job. But it's an open question how well they're going to do in 2020, when the people trying to abuse Facebook again are going to come up with a new set of tricks.
DAVIES: Steven Levy's book is "Facebook: The Inside Story." We'll continue our conversation in just a moment. This is FRESH AIR.
(SOUNDBITE OF COOTIE WILLIAMS' "RINKY DINK")
DAVIES: This is FRESH AIR, and we're speaking with Steven Levy. He is a veteran technology writer and the editor at large for Wired. He has a new book about the history of Facebook. It's called "Facebook: The Inside Story."
So Facebook has this challenge where people can post things on a newsfeed that may be misleading or false, right? And then...
LEVY: Correct.
DAVIES: People can run paid ads. What's the company's policy in dealing with potentially inaccurate or misleading information in the organic posts that people put on the site and the ads that people buy?
LEVY: Well, Facebook will allow people to post misleading content, but if it starts circulating a lot and people complain about it, it will have fact-checkers verify whether it's true or not. And if it's not true, they may downrank it - show it to fewer people. Then they might provide extra information, saying, you know, this is a publication that you may not want to trust, or, here are some other articles about the same thing that are more factual. Ads are a different thing. Mark has gone out on a limb and said, we are not going to fact-check political ads. So if someone makes a false claim, even consciously, about an opponent in a political ad, Facebook is going to be hands-off.
DAVIES: Right, and that's been controversial.
LEVY: It's really controversial, but Mark has stuck to his guns on this. He went before Congress this past October, where people just pummeled him about this. But he feels that he doesn't want Facebook to be in the position of saying, you know, well, this politician's ad is a lie, and this one isn't.
DAVIES: Does the company have any disclosure requirements for funding sources of the ads?
LEVY: They're the same funding disclosures that people have for ads in print or on TV. You have to say, this is my ad, and I approve this content.
DAVIES: And for Facebook to be looking at not the ads but the posts by users, which might be factually suspect - I mean, given, you know, the scale of the posts every day, that's a huge challenge. How many people are doing this?
LEVY: Well, it is a huge challenge. The problem they had in 2016 was that these Russians were posting this stuff on fake accounts, and that's not allowed. So if the account is inauthentic, as Facebook calls it, it can be taken down. But if an authentic person somewhere posts something that's fake, Facebook will not take it down. And we've heard that one thing the outside actors - Russians - might do is get people in the United States to post the fake content that they create. And Facebook then, by its own rules, would not be taking it down.
DAVIES: Right. So a fake account can be taken down, right?
LEVY: Yeah. Yeah. So this - and this became clear early in the 2016 election, when there was a page called DCLeaks, which was meant to spread the emails that were hacked from the Democratic National Committee. And at first, Facebook said, we're going to leave this up because it looks like a legitimate account. And then they figured out that it was not an authentic account, and they took it down because of that.
DAVIES: Right, right.
LEVY: Not because it was circulating this stuff about the Democrats that was hacked.
DAVIES: You had one astonishing number in here. If I remember this accurately, in one three-month period, Facebook stopped 2 billion phony accounts from being registered.
LEVY: Yeah. There's an unbelievable number of fake accounts that Facebook tries to stop. Now, in a way, the situation is worse than that number suggests, because when people try to create these fake accounts, they just bombard the website with one after another, and the vast majority are easily identified by Facebook's algorithms. But 5% of the accounts on Facebook are fake, and that's hundreds of millions - right? - because you have almost 3 billion people now on Facebook. So even though 5% doesn't sound like a lot, that's an unbelievably big number.
DAVIES: You know, in the middle of all this, Mark Zuckerberg talked about a new vision for Facebook. What is it?
LEVY: Yeah. Recently, Mark said that his vision for Facebook and the thing he's going to move the focus to is private messaging. So it's going to take advantage of WhatsApp and Instagram and Facebook Messenger - the messaging system that Facebook built from scratch - and use that as sort of the center of gravity for a lot of the activities that people do on Facebook. And it'll happen with end-to-end encryption, meaning that people's messages will be private. And he's not going to make the newsfeed go away, but maybe it won't be quite as important to the future of Facebook as it is now, as people spend more and more time using these other things in the Facebook family that came from acquisitions Facebook made a few years ago.
DAVIES: Wow. So in other words, rather than this huge community with open registration, where you can post something that can be viewed by literally a billion people, the focus is on communities - people that you know talking to each other, right?
LEVY: That's right.
DAVIES: It's a different kind of vision, right?
LEVY: It is a different vision, and Mark has talked a lot more about community in the last couple of years. And in a way, this is almost a revival of some of the initial impulses of Facebook before it became this giant broadcast system - which happened when Mark became obsessed with Twitter. He's also looking ahead. For one thing, the newsfeed and Facebook aren't as popular among young people, and these other applications that Facebook owns are. I spoke to a high school class about a year ago and asked them if anyone used Facebook, and I don't think any hands went up. But I asked if they used Instagram, and everyone raised their hand. They all use Instagram and Snapchat.
DAVIES: So what do you think the future is of this company?
LEVY: Well, it's not going away. And one thing that happens is that in the midst of all the problems Facebook has, it's making billions of dollars. Every year, it makes more money. The revenues get higher and higher. Their operating expenses are a little higher - they're doing more in research, and they spend more on moderating the content, policing the content and security. So it's almost like a split-screen scenario. I described one day in Facebook's history when, because of a privacy-violating app that it used on the Apple system, Apple pulled the plug on it. And because this app went away, people at Facebook weren't able to develop their upcoming apps, and even the shuttles on the Facebook campus weren't running. So there was chaos in the workforce, but at literally that same moment, Mark and Sheryl and Facebook's chief financial officer were announcing record earnings.
DAVIES: So despite the fact it's one of the least trusted companies in America, I guess at least by some measures, it's one of the most profitable.
LEVY: That's right. So Facebook isn't going away, and we still use it. That's the other thing - despite the problems with its reputation, Facebook still has the big numbers, and it's still growing. Now, its growth in North America is kind of flat, but around the world, it's still getting bigger. Pretty soon, I think they're going to say, we crossed the 3 billion user mark. And if you consider how many people are on planet Earth, that's an astounding percentage of all of humanity. And really, what drew me to do this in the first place, before all these problems, is that this has never happened before - so many people getting on one network, which makes Mark Zuckerberg such a powerful figure in the world.
DAVIES: You know, you had enormous access to Mark Zuckerberg and others at Facebook to do this book. And I've always thought that the smartest people in media management were the ones who brought reporters in the door and talked to them at length - I mean, assuming they had a, you know, decent or at least plausible story to tell. And I'm just wondering how you deal with the inevitable criticism that all this access might have made you more willing to accept their explanations and credit their good faith - or the possibility that it had that effect on you. It can be seductive, I think.
LEVY: Well, you know, you have to be aware of that. I think that one thing that was an implicit promise, one that I kept, is that their point of view would be represented in there. And I think I would be criminally negligent if I didn't give you the flavor of the interactions I had with Mark, with Sheryl, with other important executives at Facebook - it's a pretty colorful group of people. Sometimes, they would leave the company in the middle of my work, and then I'd talk to them again, and they'd be even more candid. But I think that, ultimately, the facts drive the narrative. I think this is a book which is fair to Facebook. But yeah, I think your listeners can tell from what we're talking about that it's critical, because the facts show that they haven't always behaved in the best possible way.
DAVIES: You know, the book is full of information and insight, and I've got to give you credit for that. So let me ask you for a couple of opinions - and you can take a pass if you want to. I mean, this is clearly a whole different kind of communication than humanity has ever seen. On the whole, is it a good thing?
LEVY: Well, that's a really tough one. I feel there are a lot of positive aspects that come of it. I think that you could argue that Facebook could have done things in a different way - maybe if it wasn't so hungry for growth, it wouldn't have made it so easy for bad content to bubble up and be so prominent. What I was able to do in the book is sort of identify those points where, gee, it took a turn in this direction that maybe it didn't have to take, one that made it a little worse than it would be otherwise while pursuing what might be a laudable goal. On the other hand, the basic premise - that connecting the world is a good thing - is an open question. You know, everyone knowing what everyone else thinks - is that good? And the jury's out on that.
DAVIES: Well, Steven Levy, thank you so much for speaking with us.
LEVY: It's my pleasure. Thank you, Dave.
GROSS: Steven Levy is editor-at-large for Wired. He spoke with Dave Davies about his new book, "Facebook: The Inside Story." Coming up, rock critic Ken Tucker reviews the new album "Bryan Ferry: Live at the Royal Albert Hall, 1974." This is FRESH AIR.
(SOUNDBITE OF THE ADAM PRICE GROUP'S "STORYVILLE")
Transcript provided by NPR, Copyright NPR.