Humans spread fake news faster than bots

A new study found that true information took, on average, six times longer to reach 1,500 people, compared with false stories. File Photo by LoboStudioHamburg/Pixabay

SATURDAY, March 10, 2018 (HealthDay News) — Amid growing concerns about the impact of “fake news,” a new study finds that false stories take off much faster than truth on Twitter.

The study, of news and rumors shared by 3 million Twitter users, found that false information spreads faster and farther than accurate information.

Falsehoods were about 70 percent more likely to be retweeted than the truth, said the researchers, who were led by Sinan Aral of the Massachusetts Institute of Technology, in Cambridge, Mass.

False stories often came from “bots” — automated accounts that impersonate real users. But it seemed that humans were the main reason that fiction spread faster than fact.

Reports of false information related to the 2016 U.S. presidential election put the spotlight on the power of false news to influence public opinion.

With midterm elections approaching, false news remains a concern.

In this study, “novelty” seemed to be key, Aral’s team said.

False stories typically contained something new or surprising — whereas true stories could get repetitive.

“People are more likely to spread novel information, which favors the spread of falsity over the truth,” Aral said in a statement.

And what is the impact of all this fast-moving false information? No one knows yet, said Filippo Menczer, a professor of informatics and computer science at Indiana University Bloomington.

“It’s very challenging to study how this actually affects people,” Menczer said.

Propaganda and manipulation have existed for a long time, he noted. But the rapid, widespread dissemination of false information via social media is new.

And it’s a concern, Menczer said.

He is the co-author of a perspective piece published with the study in Friday’s issue of Science.

For the study, Aral’s team analyzed about 126,000 stories tweeted by roughly 3 million Twitter accounts between 2006 and 2017. They included traditional news media stories and also tweets that were spreading rumors or claims.

The researchers verified the accuracy of the stories by consulting fact-checking websites that investigate media information and widely circulating rumors — like snopes.com and factcheck.org.

The investigators found that, overall, false stories were retweeted much more often than true stories. For example, truth “rarely” diffused to more than 1,000 people, whereas the top 1 percent of false-news tweets routinely reached anywhere from 1,000 to 100,000 people, according to the report.

The truth was also slow-moving, the study found. True information took, on average, six times longer to reach 1,500 people, compared with false stories.
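The "six times longer" figure comes from comparing how long each cascade of shares takes to accumulate 1,500 unique users. A minimal sketch of that measurement, using entirely made-up timestamps (the study's actual data and code are not public here, so the cascades below are hypothetical), might look like:

```python
from datetime import datetime, timedelta

def time_to_reach(timestamps, n):
    """Minutes from the first share until a cascade reaches n shares.
    Returns None if the cascade never gets that large."""
    ts = sorted(timestamps)
    if len(ts) < n:
        return None
    return (ts[n - 1] - ts[0]).total_seconds() / 60.0

# Hypothetical cascades: a "true" story shared once per minute,
# a "false" story shared once every 10 seconds.
start = datetime(2017, 1, 1)
true_story = [start + timedelta(minutes=i) for i in range(2000)]
false_story = [start + timedelta(seconds=10 * i) for i in range(2000)]

slow = time_to_reach(true_story, 1500)   # 1499.0 minutes
fast = time_to_reach(false_story, 1500)  # ~249.8 minutes
print(slow / fast)                        # ratio of about 6
```

With these invented rates, the true story takes about six times as long to reach 1,500 people, mirroring the ratio the researchers reported.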

And while bots were often spreading falsehoods, they disseminated true stories at the same rate, the researchers said. Instead, humans seemed to be the driving force behind false stories’ popularity.

Menczer agreed that “novelty” helps explain why people retweet false information.

But there are other factors, he said.

If a fishy-sounding tweet happens to align with what you already believe, you will be less likely to question it, Menczer noted.

For example, he said, if you "already dislike Donald Trump," and see a tweet about some outrageous thing he did, you may well have an emotional reaction and hit that "retweet" button without much thought.

“If I’m just reacting on an emotional level and clicking ‘retweet,’ then I can become part of the problem,” Menczer said.

Twitter provided the data for the study and funded the work. But the researchers said they cannot comment on how the company might use the findings.

Menczer said that Twitter, Facebook and other social media sites have a responsibility to address “fake news,” and have taken some steps. Twitter, for example, announced that it blocked some accounts linked to Russian misinformation and alerted users exposed to the accounts that they might have been “duped.”

But Menczer said that to truly address the problem, he believes the sites should work with academic researchers, and not just use their own in-house researchers.

For now, he suggested, people may want to be more cautious about clicking those easy “like” and “share” buttons.

“Remember that any of us can be manipulated,” Menczer said. “It’s not just ‘other people.'”

More information

The Center for Digital Ethics and Policy has more on ethical online behavior.
