Facebook on Thursday announced sweeping changes to the way it plans to manage the newsfeed, the front door to the service for its 2 billion monthly users. Under the new regime, Facebook says users will see more content from friends and family, and less from brands and publishers. The new algorithm also will favor content that draws a lot of comments over posts that are popular, but don’t elicit comments.
Fred Vogelstein sat down with Adam Mosseri, Facebook’s vice-president in charge of newsfeed, to discuss the changes and why Facebook thinks they are necessary. Edited excerpts follow:
FRED VOGELSTEIN: Tell me about the announcement.
ADAM MOSSERI: So what we're talking about is a ranking change where we're trying to look at how we might use ranking to bring people closer together, to connect people more. Newsfeed was founded, and Facebook was founded in a lot of ways, to connect people. So we want to see if we can do that better.
So what we're going to try and do is better identify and value meaningful social interactions between people. We want newsfeed to be a place where people have conversations, where they connect with people. So we're going to focus more on that, and less on how much time people spend on Facebook and on newsfeed, and less even on how much they share directly.
And so this will mean a number of different things will happen, but broadly: content that facilitates or inspires more meaningful conversation or meaningful interactions between people will get more distribution, and content that does so less will get less distribution.
There will be more friend content and family content. There will also be more group content. Group content tends to inspire a lot of conversation. Communities on Facebook are becoming increasingly active and vibrant.
There will be less video. Video is an important part of the ecosystem. It's been consistently growing. But it's more passive in nature. There's less conversation on videos, particularly public videos.
There will be less content directly from (professional) Pages. Page content will still be an important part of the ecosystem, but it will shift a little bit. Content that is shared and talked about between friends will grow, and content that's consumed directly from Pages will shrink slightly.
FV: You guys tweak newsfeed all the time. This sounds like more than a tweak.
AM: So what we want to make sure is that anytime we make any major ranking change, we explain it proactively. But we make lots and lots of changes. Most of them are very small in nature with small effects: maybe fixing a bug here, maybe getting a little bit better at predicting shares there. Those we don't talk about because they don't have a material effect. They add up over time, but we don't want to inundate everybody with every small thing that we do.
And so this one is bigger than the average tweak. It's not a tweak.
FV: The general perception of newsfeed is that popularity and buzz are very important, and that there are pros and cons associated with that. One of the cons, which obviously people have been talking about all year, is that people try to game the system, which tends to promote more extreme kinds of conversations. Is this a way of addressing that?
AM: This is primarily trying to help newsfeed deliver on its core promise of bringing people together, about connecting people with stories from their friends and family that matter to them. But also content that's not from friends, right? You might have a really engaging conversation with someone who shares interests in a group, for instance.
But connecting people with each other is the value proposition on which our company was built in a lot of ways. So I do think that it's consistent with what our values have been for a long time. But it's really about creating more good—helping newsfeed become a place where there's a vibrant, healthy amount of interaction and discussion. It's less about reducing any sort of problematic content types, which is another area of work that we focus on intently.
FV: What are the specific things that you're going to do to make all this happen?
AM: So one of the key things is understanding what types of interactions people find meaningful, what inspires them to interact more or share more in the future. One specific thing is that we're going to be weighing long comments more than short comments, because we find regularly that taking the time to actually write a more thoughtful perspective on something correlates positively with a comment that someone would actually respond to or Like. It also correlates negatively with problematic content types like spam or uncivil content, et cetera.
AM: Comments in general, and this was true before (the change) but is more true after, are more valuable than Likes. If you bother to actually take the time to respond to something that I posted, maybe a picture of my two kids, that counts for something. It's a pain actually to type on a mobile phone. Liking is pretty easy; that's the whole point of Liking.
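Mosseri doesn't disclose the actual scoring formula, but the ordering he describes (long comments outweigh short comments, comments outweigh Likes, and passive views count least) can be sketched with a toy scorer. All of the weights, thresholds, and field names below are hypothetical, chosen only to illustrate that ordering:

```python
# Toy illustration of interaction-weighted ranking. The weights and
# fields are hypothetical -- Facebook has not published its model;
# this only mirrors the ordering described in the interview:
# long comments > short comments > Likes > passive views.

LONG_COMMENT_THRESHOLD = 80  # characters; an arbitrary cutoff for this sketch

def interaction_score(post):
    """Score a post by the 'meaningfulness' of its interactions."""
    score = 0.0
    for comment in post.get("comments", []):
        # Longer, more thoughtful comments count more than short ones.
        score += 10.0 if len(comment) >= LONG_COMMENT_THRESHOLD else 3.0
    # Likes are easy to give, so they count far less than any comment.
    score += 0.1 * post.get("likes", 0)
    # Passive views contribute almost nothing.
    score += 0.001 * post.get("views", 0)
    return score

posts = [
    {"id": "viral_video", "likes": 100, "views": 2000, "comments": []},
    {"id": "family_photo", "likes": 40, "views": 200,
     "comments": ["Congrats!",
                  "This brings back so many memories of our trip together, "
                  "thanks for sharing all of these wonderful photos!"]},
]

# Rank posts by descending interaction score.
ranked = sorted(posts, key=interaction_score, reverse=True)
```

Under these made-up weights, the photo with a thoughtful comment thread outranks the heavily viewed but uncommented video, which is the shift in distribution the change is meant to produce.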
FV: Where does news fit into all of that?
AM: So news content, some news content that is shared and talked about a lot will receive some sort of tailwind from this. And news content that is more directly consumed by users—that they don't actually talk about or share—will actually receive less distribution as a result.
But overall the way the ranking change works is it doesn't take a look at news or even at video and say, we want to value that less—or friend content and say we want to value that more. It takes a look at what stories actually inspire meaningful interactions between people, and values those more.
So if we think a specific piece of news or even a video will inspire more conversation or more interaction, it will actually do better post-launch of this change. But on average video content tends to facilitate fewer interactions because it's passive in nature.
There are two reasons why we're excited about this. One is that we hear consistently that people want to interact with friends and family in newsfeed, and we always want to do everything we can to respond to the asks and the interests of our community.
But the other is that a lot of the research we've done, and the research out there in the field and in academia that we've read, suggests that interacting with people online is positively correlated with a lot of measures of well-being, whereas passively consuming media content online is less so.
FV: It sounds like there's kind of a fine line that you’re trying to walk there. I know, for example, you have been spending time thinking about informed-ness as it pertains to news. How does that fit into what we're talking about here?
AM: So focusing on social interactions I think is going to be an important thing that we do for the foreseeable future, but it is not meant to encapsulate all of the different things that we value, right?
Our second newsfeed value—our newsfeed values are public—is to help inform people about the world around them. So we try to measure that in a variety of ways. The predominant one right now is that we actually ask people through a lot of surveys every day—tens of thousands—how informed they find specific stories. And then we actually even try to predict that.
FV: Others have sort of talked about the possibility of actually creating white lists of the most trustworthy publications whose content gets special treatment. Is that rolled up into what we're talking about here?
AM: I think it's separate. I mean, in general we have an immense amount of responsibility, and part of that responsibility is to do everything we can to maintain the integrity of the information that flows through our system. But also, given our scale, we need to be very thoughtful and careful about where we act and where it would be inappropriate to act.
So for instance, we don't want there to be false news on our platform. But we also don't think we can responsibly be in a place where we're deciding what is false and what is not.
AM: So therein lies an obvious tension. This change doesn't affect those efforts. It's not bad for those efforts. It's not good for those efforts. It's just more about nurturing and creating more good. It's really about trying to make sure that the time people spend on our platform is time well spent. It's not about addressing false news or other forms of problematic content, though that is a continued area of focus and investment for us.
FV: Talk to me about like the evolution of this. What's changed over the course of the past 18 months to make you feel like this is something worth doing?
AM: The biggest thing has been just the explosion of video. Video is a paradigm shift in a lot of different ways. We've done a lot to try and nurture it. We think video is going to continue to be a more and more important part about how people communicate with each other, and how publishers communicate with people.
But as video has grown on Facebook, it has changed the nature of how people interact with the platform in a lot of different ways. Video is, primarily, a passive experience. You tend to just sit back and watch it. And while you're watching it, you're not usually liking, commenting, or speaking with friends. So this change is, in part, a reaction to how the ecosystem has shifted around us.
FV: When you talk about video, does that apply to advertisers as well?
AM: Ads are a separate system. So this ranking change doesn't apply to them.
FV: What do you do if I want to write a long comment that’s just mean?
AM: That can also happen. Nothing we try and optimize for is going to be perfect. So we try and pick the measure of value that has the least issues that we can find.
For instance, clicks are valuable. If you click on something you're more interested in than something that you didn't click. But, clearly, clickbait gets people to click on things that they don't actually want to see. People don't like clickbait. Every time we ask them in surveys where we just show them two headlines—one which is clickbait, one which is not—they are very specific about the fact that they don't like clickbait. They also very consistently click on clickbait.
So the way we address that is we actually define clickbait, we label tens of thousands of examples in I think probably over a dozen languages. We try to identify it using classifiers. And when we do identify it we value it less in the rankings.
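The pipeline Mosseri describes, defining clickbait, labeling examples, training classifiers, and then down-ranking (not removing) what the classifier flags, can be sketched in miniature. The "classifier" below is a deliberately trivial bag-of-words score, not Facebook's actual model, and all the headlines and weights are invented for illustration:

```python
# Toy sketch of a label -> train -> classify -> demote pipeline.
# The real system uses tens of thousands of labels in over a dozen
# languages and far richer models; this only shows the shape.
from collections import Counter

def tokenize(headline):
    return headline.lower().split()

def train(labeled_headlines):
    """Count word frequencies separately for clickbait and normal headlines."""
    clickbait_words, normal_words = Counter(), Counter()
    for headline, is_bait in labeled_headlines:
        (clickbait_words if is_bait else normal_words).update(tokenize(headline))
    return clickbait_words, normal_words

def is_clickbait(headline, model):
    """Flag a headline whose words lean toward the clickbait counts."""
    clickbait_words, normal_words = model
    score = sum(clickbait_words[w] - normal_words[w] for w in tokenize(headline))
    return score > 0

def adjusted_rank(base_score, headline, model, penalty=0.5):
    """Down-rank, rather than remove, content flagged as clickbait."""
    return base_score * penalty if is_clickbait(headline, model) else base_score

# Hypothetical labeled training data (True = clickbait).
labels = [
    ("you won't believe what happened next", True),
    ("this one weird trick doctors hate", True),
    ("senate passes budget bill", False),
    ("quarterly earnings report released", False),
]
model = train(labels)
```

The key design point from the interview survives even in this sketch: flagged content is valued less in ranking rather than deleted, since the judgment is probabilistic.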
FV: How has the past year played into what's happening now?
AM: I think one of the biggest things to be clear about is we were invested in a lot of these integrity efforts pre-2016 – and I think invested heavily in some of them that are particularly important that get talked about less, things like spam and violence and hate speech, et cetera.
I think we were surprised by a bunch of things. False news caught us off guard. We had worked on it, we'd certainly even proactively announced some work to reduce the prevalence of hoaxes. But I actually think false news caught most of the world off guard.
So I think we've done a reasonable job over the last year investing more in some of these problems that we hadn't invested as much in before. But I also think we have a long way to go. A lot of the problems that we're trying to tackle are complicated, and will take a long time. And that's not a way of trying to absolve ourselves of any responsibility; it's more that we're trying to communicate that we are really committed to getting this right.
FV: Have you briefed publishers that all of a sudden their stuff is going to get down-ranked?
AM: We are talking to a number of different publishers.
FV: One of the things that I've always wondered is why newsfeed is so flat. The posts about my dog and your kids look the same as posts from The New York Times, TMZ and anything else. I know the original thinking was to not disadvantage unbranded friends and family content. But I can also see how having some visual signals in newsfeed might actually help people better sort out what's what.
AM: Yeah. So, in general, we're not opposed to variation in aesthetics or in visual design language in news feed. The tradeoff is always the tension between that and making the feed more complicated. The other thing that's sometimes a challenge is deciding what to differentiate. Differentiating friend content from public content is one thing. But differentiating some subset of public content from another subset of public content is a more complicated thing.
So one idea that we hear about a lot is, “Can you just differentiate real news publishers from non-serious news publishers?” Which would then put us in a place where we would have to decide who's a legitimate news publisher and who's not. Even defining what is news and what is not news is a blurry line, and not something I'm confident we would be able to do well.
That said, we're always exploring these types of ideas. So one thing that we think is valuable, and are actively trying to pursue, is how we can help publishers in general, news or otherwise, better communicate their brand. We think that's good for the publisher and good for us.
Ultimately, if a publisher posts something that is valuable, that credit should accrue to the publisher and having a more prominent brand would help that happen. And if a publisher shared something on the platform that is upsetting or problematic in some way, they should also be accountable. So we think more effectively helping publishers communicate their brand in news feed is a good thing.
FV: I could imagine you not wanting to decide what was news. On the other hand it wouldn't be hard to decide what's opinion and what's not. Newspapers do that all the time.
AM: We just deal with a very different nature of a problem. So take the question of what goes in which section; forget about op-ed versus not. It's a decision that gets made by a handful of people, probably at the equivalent of a page-one meeting at 9:30 in the morning at the average publication.
And that's possible because a limited amount of information actually gets published by a publisher on any given day, so that room can actually have a sense of every single one of those pieces.
We deal with over a billion things posted on the platform a day. So the way we do that isn't to have a few people sit around and talk about the specifics and the nuances; we have to build scaled systems. It doesn't mean it's impossible. It's just a very different nature of a problem. We have to build classifiers and guidelines and labeling systems and pipelines, and the rest of it.