Wednesday, March 10, 2021

Studies highlight social-media platforms' misinformation problem, reminding us of the key role of the news media

Recent studies show misinformation is rife on popular social-media platforms, underscoring the importance of the traditional news media as a source of reliable information and the challenge journalists face as Americans seek confirmation of their beliefs more than information that may run contrary to them.

Instagram recommended posts containing misinformation about the pandemic last fall, according to a study by the Center for Countering Digital Hate, based in England. The group created 15 profiles and found that the app recommended 104 posts with false claims about the coronavirus and the 2020 election. "The study is the latest effort to document how social media platforms' recommendation systems contribute to the spread of misinformation, which researchers say has accelerated over the past year, fueled by the pandemic and the fractious U.S. presidential election," Shannon Bond reports for NPR.

"Facebook, which owns Instagram, has cracked down more aggressively in recent months,:" Bond reports. "It has widened its ban on falsehoods about Covid-19 vaccines on its namesake platform and on Instagram in February. But critics say the company has not grappled sufficiently with how its automated recommendations systems expose people to misinformation. They contend that the social networks' algorithms can send those who are curious about dubious claims down a rabbit hole of more extreme content."

A new study by the New York University-based group Cybersecurity for Democracy "found that far-right accounts known for spreading misinformation are not only thriving on Facebook, they're actually more successful than other kinds of accounts at getting likes, shares and other forms of user engagement," Michel Martin reports for NPR. After studying more than 8 million Facebook posts from nearly 3,000 news and information sources over a five-month period, the researchers confirmed "what some Facebook critics — and at least one anonymous executive — have been saying for some time: that far-right content is just more engaging. In fact, the study found that among far-right sources, those known for spreading misinformation significantly outperformed non-misinformation sources."

Laura Edelson, who helped lead the study, told NPR that misinformation peddlers in other partisan categories don't gain as much traction. "There could be a variety of reasons for that, but certainly the simplest explanation would be that users don't find them as credible and don't want to engage with them," she said. The researchers called this the "misinformation penalty."

A Facebook spokesperson told NPR that extreme partisan content isn't as pervasive as studies suggest, and "engagement" isn't the same as how many people actually see a post. Edelson said Facebook should back up that assertion by being transparent about how it tracks impressions and promotes content. "I think what's very clear is that Facebook has a misinformation problem," Edelson said. "I think any system that attempts to promote the most engaging content, from what we can tell, will wind up promoting misinformation."

Facebook cracked down on misinformation after the election, demoting posts and known misinformers. The move served as proof to many that Facebook could do more to halt the spread of misinformation but generally chose not to in the name of getting more traffic, The Washington Post reports.

"All of these changes may, in fact, make Facebook safer. But they also involve dialing back the very features that have powered the platform’s growth for years. It’s a telling act of self-awareness, as if Ferrari had realized that it could only stop its cars from crashing by replacing the engines with go-kart motors," Kevin Roose writes for The New York Times.

And Facebook's efforts arguably didn't help much, the Post reports: Facebook users who wanted to read that type of content responded not by consuming less of it but by decamping to Parler and other social-media apps popular with conservatives.

Journalists have warned readers for years about the growing threat of QAnon, the Proud Boys, and other extremist groups that routinely organize on social media, but such warnings don't do much to sway Americans who distrust the news media, Rob Tornoe reports for Editor & Publisher.

"Most of these people aren’t just going to suddenly start reading real news. It’s not going to happen," Ben Collins, who covers disinformation, extremism and the internet for NBC News, told Tornoe.
