In February 2018, Facebook made sweeping changes to the foundation of its News Feed algorithm, upending countless advertising and content distribution strategies. Unlike the previous year's algorithm, the Facebook News Feed now prioritizes video content by its number of comments and replies from friends, and by the level of engagement with posts shared by friends. Mark Zuckerberg described the shift as favoring "meaningful social interactions" over "relevant content": users will now be more likely to see posts surfaced by their friends than by brands or publishers.
Meredith Broussard, a professor at NYU's Arthur L. Carter Journalism Institute, believes that this algorithmic shift will create an entirely new problem, one she calls a "filter bubble," borrowing the title of Eli Pariser's book on the topic. Broussard, along with many other media theorists, fears the consequences of users consuming only content favored by friends rather than content pushed by established publications. "In a classroom," she explains, "you would probably have multiple voices from across the political spectrum, but in the algorithmic world you only have a narrow range of perspectives given to you because that's how the algorithms are designed."
This sort of bias, she explains, is uniquely digital. While she concedes that before the digitization of media there were publications that were proudly politically aligned and others that tried their best to stay neutral, Broussard argues that print newspapers more clearly delineated the difference between opinion sections and objective reporting. Today, she says, the developers of content prioritization algorithms miss that distinction: "Software developers still don't understand that different sections of the newspapers function differently...when you treat all content as exactly the same, that's one of the reasons that we ended up with the fake news crisis. To the computer, the fake news looks exactly the same as real news. And the computer is not human, it doesn't have the ability to differentiate."
Broussard encourages individual users to learn how to visually identify fake news articles, for instance by checking for recognizable URLs and publications, but she ultimately points to digital media innovation as the solution, such as reworking Facebook's new friend-centric algorithm.
Justin Hendrix, a journalist for The Economist and the Executive Director of the NYC Media Lab, agrees that part of the solution is "training people to be better consumers of information. We need the population to be smarter."
Unlike Broussard, however, Hendrix is more pessimistic. He believes that Facebook users' demand for content inadvertently favors the spread of disinformation, or "the disintegration of the process of truth seeking." The consequence, he believes, is "a vacuum where people are trying to search for a universal truth but everyone's bubble creates their own unique perspective of truth."
We find ourselves facing the challenge of correcting a culture of opinionated 'news' that traffics in inaccuracies at best and lies at worst. Despite the recent shifts, even Zuckerberg himself is not yet sure how to mediate algorithms in a way that is socially productive, one that prioritizes reason and non-partisanship and separates truthful content from fake.
Also involved are media entrepreneurs, a group of motivated technologists who understand that the first step to correcting such problems is to identify the specific places where media and its distribution tactics went wrong, and to make those failures more visible. This would help consumers more clearly and easily distinguish real journalism, published to professional standards, from opinion and commentary, and both from stories that are simply made up.
Matt Hartman, a partner at betaworks ventures, is interested in investing in startups that produce content offering lasting value to their users, unlike the "sugar rush" content, as he refers to it, that many entertainment companies release for quick satisfaction. For example, betaworks invested in Shine, an app that sends millennials daily messages promoting well-being in both their personal lives and their careers. This sort of platform, Hartman argues, provides users with information that enriches them. He anticipates new business models supporting this type of content, models that reward providers for delivering long-term value to the user rather than content that feels like a sugar rush and leaves readers regretting the time wasted.
Now that users are beginning to understand how this works, the "sugar rush" mechanics are becoming less effective. Rather than leveraging this kind of psychology to keep users hooked for a few months, Hartman argues, new companies should provide longer-term value to their users. Clickbait headlines exemplify the poor, trendy media he cautions against. Higher-signal media, even when snackable, gives users tangible value, such as mental exercise or new information. "Younger people are much more savvy about how services are trying to rehook them," Hartman says, "And that's an instantiation of this idea that 'I want you to respect my time.'"