A study from Indiana University finds that automated accounts on Twitter influenced opinions and political talking points in the 2016 election. And researchers say the findings could apply to other social networking sites as well.
The study examined 14 million messages shared between May 2016 and March 2017. It focused on bot accounts – accounts controlled by software rather than by human users. It found that the roughly 6 percent of accounts identified as likely bots were responsible for spreading 31 percent of what was considered “low-credibility” information around the election.
The bots also shared articles from low-credibility sources – outlets known for producing false or misleading information.
Indiana University Professor Filippo Menczer led the study. He says the researchers would first find a claim or an article, then trace its “network of diffusion” – who was retweeting and sharing the claim.
Then, they would look for bots.
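That pipeline can be pictured as a small graph problem. The sketch below is a hypothetical illustration, not the study’s actual code: it uses networkx to build a retweet diffusion network for one claim and flags the most active spreaders whose bot score passes a threshold. The `bot_score` function and the toy data are stand-ins; in the real study a trained bot-detection classifier from Menczer’s lab played that role.

```python
import networkx as nx

def build_diffusion_network(retweets):
    """Build a directed 'network of diffusion' for one claim:
    an edge A -> B means account B retweeted account A."""
    g = nx.DiGraph()
    for source, retweeter in retweets:
        g.add_edge(source, retweeter)
    return g

def likely_bots(graph, bot_score, threshold=0.7):
    """Rank accounts by how much they spread the claim (out-degree),
    then keep those whose bot score clears the threshold.
    `bot_score` is a placeholder for a real bot-detection classifier."""
    spreaders = sorted(graph.nodes, key=graph.out_degree, reverse=True)
    return [a for a in spreaders if bot_score(a) >= threshold]

# Toy data: (original poster, retweeter) pairs for one low-credibility claim.
retweets = [("acct_A", "acct_B"), ("acct_A", "acct_C"), ("acct_B", "acct_D")]
network = build_diffusion_network(retweets)

# Stand-in scores; a real pipeline would query a trained model instead.
fake_scores = {"acct_A": 0.9, "acct_B": 0.2, "acct_C": 0.1, "acct_D": 0.8}
print(likely_bots(network, fake_scores.get))  # -> ['acct_A', 'acct_D']
```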
“And we did find, yes, that bots do play an important role. In general, the accounts that are most active in spreading fake news are much more likely to be bots,” Menczer says.
Menczer says the way the bots engaged with a tweet would get it more views, helping to spread the misinformation more broadly. He says that engagement tended to happen in the ten-second window before a story or tweet went viral.
“Real users tend to be influenced by this, and we have found that they do retweet messages from these bots, which include links to unsubstantiated claims,” Menczer says.
The study recommends requiring CAPTCHAs or similar security checks before an account can post messages, to ensure a human is behind them.
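A minimal sketch of what such a gate could look like on the posting side, assuming a hypothetical `verify_captcha` call to some CAPTCHA provider; this is an illustration of the idea, not a real Twitter or provider API.

```python
def post_message(user, text, captcha_token, verify_captcha):
    """Only publish a message once a CAPTCHA challenge has been solved,
    so an automated account cannot post unattended.
    `verify_captcha` stands in for a real provider's verification call."""
    if not verify_captcha(captcha_token):
        raise PermissionError("CAPTCHA failed: message not posted")
    # ...store or publish the message here...
    return f"{user}: {text}"

# Example with a stand-in verifier that accepts one known-good token.
print(post_message("alice", "hello", "token-123", lambda t: t == "token-123"))
```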
But Menczer says that as humans get better at detecting and removing bots from websites, the people creating automated accounts will find ways to get around those tests. He says social media users should be fact-checking and thinking critically before sharing claims.
The Twitter bots also used other tactics, such as tweeting at high-profile accounts. That included the account of then-presidential candidate Donald Trump.