NEW YORK — Apple and Facebook have figured out how to keep us glued to their devices and platforms. But they haven’t figured out how to curb the misinformation that plagued them during the 2016 election and have struggled to regain public trust. And now, as the midterm elections approach, they certainly don’t agree on a solution.
Last week, Apple launched a human-curated political news section to help readers steer clear of falsehoods surrounding the midterms. The company’s announcement reignited a fiery debate with Facebook about whether tech giants should hire people to curate news or rely on algorithms instead.
Apple has used human editors to curate news content in “Top News” and other specialized sections since the application first launched in 2015, and said it will continue to do so for the midterm election news section. The company uses a combination of human editors and machine learning to manage more tailored content in users’ personalized feeds.
“News was kind of going a little crazy,” said Apple Inc. Chief Executive Officer Tim Cook at the recent Fortune CEO Initiative, subtly referring to Facebook’s struggle with foreign actors such as Russia, as well as profiteers and bots, that took advantage of its News Feed algorithms for financial and political gain during the presidential election. “We felt the top stories should be selected by humans,” Cook said.
Apple’s midterm election section will feature material from The Washington Post, Politico and Axios, in addition to coverage from other sources the company describes as “trustworthy.” But critics contend this coverage is limited, given that Apple’s curators will promote articles from only a few legacy outlets. The inclusion of the Trump administration-friendly Fox News also struck a nerve.
Apple co-founder and former CEO Steve Jobs once called Fox News a “destructive force in our society,” according to Walter Isaacson’s biography of the tech mogul. The company’s current leadership contends that all content featured in the midterm election news section, including articles from Fox News, will be vetted for high-quality reporting and sourcing.
“This election season, our editors will highlight the most important, rigorously reported news to help you understand key races and your fellow voters,” wrote Apple News Editor-in-Chief Lauren Kern in a note on the news app. “We won’t shy away from controversial topics, but our goal is to illuminate, not enrage.”
But human editors and algorithms can display repetitive patterns of behavior when curating the news, warned Pete Brown, the author of a June study published by the Tow Center for Digital Journalism that examined Apple News’ editorial decisions on Twitter and in newsletters.
“Humans, like algorithms, are prone to habit,” Brown wrote. “Apple News may have fallen into a pattern that Facebook and others have been trying to avoid: editorial bias.”
The study, which analyzed almost 7,000 news recommendations made by Apple News, found that editors had a strong tendency to favor a select group of legacy media outlets. For example, editors in the U.S. showed a preference for The New York Times over smaller, regionally oriented outlets.
The study didn’t analyze any news recommendations featured on the app itself — which is the primary means by which Apple delivers news to users.
Apple declined to comment on its plans for the midterms.
However, while human curators may display patterned behavior in selecting certain news articles, they can explain their decision processes. Computers can’t.
“There is always going to be a greater degree of transparency with human editors than with algorithms,” Brown said. “We can ask, ‘Why are you choosing these publications over another one? What are the criteria in which you’re making these decisions?’”
During the last presidential campaign, Facebook relied on human editors to curate the popular news topics listed in its “Trending” section. But after conservatives criticized the company for prioritizing liberal-leaning content, it replaced those editors with algorithms.
“Making these changes to the product allows our team to make fewer individual decisions about topics,” Facebook said in a statement after it dismantled the team of editors in August 2016.
But those algorithms have also brought Facebook a whirlwind of congressional hearings and headaches. In recent months, the company abandoned its automated “Trending” feature and restructured the News Feed algorithm to rank news sources on a trustworthiness scale determined by users.
Fox News has benefited from Facebook’s algorithmic changes to highlight “trustworthy” sources. This past April, the outlet generated the most engagement on Facebook, outpacing sources such as CNN, NBC and The New York Times, according to social media analytics company NewsWhip.
Among other attempts to curb misinformation, Facebook recently announced the expansion of its fact-checking operation and the creation of new automated “Breaking News” labels for quality content.
Media critics and journalists take issue with what they deem Facebook’s lack of transparency.
“Algorithmic transparency is basically nonexistent,” Brown said. “We know very little about how these black boxes curate news.”
Facebook couldn’t immediately be reached for comment.
With the midterms just months away, the tech giants are wary of backlash from both ends of the political spectrum, no matter what decision they make. “Whatever happens, there’s going to be criticism from both political sides,” Brown said. “That’s all the more reason to be transparent about that curation process.”
——
©2018 Bloomberg News
Visit Bloomberg News at www.bloomberg.com
Distributed by Tribune Content Agency, LLC.