Fake thumbs can silence real voices in an online commentary
Are online discussions affected by bot-assisted, computer-based propaganda operations? Sang-Pil Han, associate professor of information systems, examined how opinion manipulation affects something vital to democracy: people’s willingness to speak their minds.
This past September, Meta booted some 1,600 fake accounts off Facebook to disable bad actors in Russia who were trying to influence public opinion about the war in Ukraine. That’s recent news, but fake news and opinion manipulation online have been making headlines in a big way since 2016, when Russian interference in the presidential election triggered alerts from U.S. intelligence agencies, an FBI probe, and a special counsel investigation that brought criminal charges against 34 people.
Despite the abundant coverage and research related to fake content creation and posting, little work has examined how bot-assisted, computer-based propaganda operations affect online discussion itself. That’s why Han studied the question, and his research shows that manipulation of public commentary can, in fact, dampen other people’s willingness to comment. “That undermines the discourse that is necessary to a healthy democracy,” Han says.
The data Han and his research team used in their study date to the “Druking” scandal, an infamous South Korean disinformation operation that ran in 2017 and 2018. Druking was the screen name of a popular blogger who also ran a company that conducted illegitimate political campaigns online. The scandal involved manipulating users’ perceptions of other people’s viewpoints: an automation tool cast up-votes on comments in line with Druking’s anti-government views and down-votes on comments supportive of the government, which opposed his views.
“He never posted any comments,” Han explains. “He took advantage of the authentic original comments posted by the general audience. When comments were congruent with his political stance, he ramped up their place in the ranking with upvotes. When comments were against his political interests, he decreased their visibility by bombarding them with downvotes. Druking didn’t post fake news. He simply manipulated the visibility of existing opinions.”
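The mechanism Han describes can be sketched in a few lines of code. This is purely illustrative, not the portal’s actual ranking algorithm: it assumes a simple net-score ordering, which is enough to show how coordinated bulk voting reshuffles which comments readers see first.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Assumed ranking signal: net votes (upvotes minus downvotes).
        return self.upvotes - self.downvotes

def rank(comments):
    # Comments with higher net scores float to the top of the page.
    return sorted(comments, key=lambda c: c.score, reverse=True)

# Hypothetical comment section before manipulation.
comments = [
    Comment("pro-government view", upvotes=120, downvotes=10),
    Comment("anti-government view", upvotes=40, downvotes=5),
]
assert rank(comments)[0].text == "pro-government view"

# A bot network casts bulk votes in line with its operator's stance:
# boost the congruent comment, bury the incongruent one.
comments[1].upvotes += 500
comments[0].downvotes += 500
assert rank(comments)[0].text == "anti-government view"
```

No fake content is ever posted; only the ordering of authentic comments changes, which is what made the operation hard for readers to detect.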
Manipulating a comment’s thumbs-up and thumbs-down counts may not seem like a very powerful disinformation approach, but it was in South Korea. News portal sites similar to Yahoo are very popular there, and attention to the reader comments posted on them is an integral part of South Korean news habits, the researchers noted in a paper.
“A nationwide survey reported that 70% of respondents read other users’ news comments in portal sites during the past week, 21% posted comments themselves, and 31% enacted click speech (e.g., liking) in response to others’ comments,” the scholars wrote. That survey was conducted in 2018, the year Druking was arrested for opinion rigging.
To see whether Druking’s efforts affected other site users, the research team focused on one site and one story that garnered 39,827 comments within 26 hours. Within those hours, 1,389 bot accounts controlled by Druking cast more than 56,000 up- and downvotes to reshuffle the order in which the comments appeared to site readers.
The researchers based their hypothesis — that people would avoid commenting if they perceived that their views differed from most others’ views — on the “spiral of silence” theory. First proposed by political scientist Elisabeth Noelle-Neumann, this theory holds that people who believe they hold minority viewpoints are less likely to express them for fear of being ostracized.
Keeping your voice down
Like many misinformation campaigns, Druking’s activity gave people the impression that the manipulated view was the popular view, resulting in false amplification of that standpoint. “It amplifies the like-minded people’s opinions because people feel more licensed to speak out,” Han explains.
False amplification also makes alternative views look unpopular, even when they’re not. “Those who have opposing views will shut themselves up. They’re not going to express their thoughts,” he continues. Consequently, false amplification also creates a false diminution of viewpoints that seemingly go against the crowd.
To see where the crowd actually stood, the researchers used other data, including age, gender, commenting propensity, and whether the user was pro-government or anti-government, like Druking. They found that 97% of users were neutral in their political news consumption, meaning they didn’t gravitate toward news sites with an obvious partisan affinity.
Still, even though other news sites might have exposed readers to different commentary on the focal news story, the researchers found that users whose views were incongruent with Druking’s manipulated comments were indeed less likely to comment.
Han considers this silencing effect ominous. Originally from South Korea, he is very familiar with the dictatorship in North Korea. “There is no media, there is no freedom of speech,” he says. “Everyone basically remains silent. They cannot express their opinions.”
He adds: “This silencing effect is an important wake-up call. If we suppress freedom of speech and only allow the like-minded to speak out, it will be similar to what is happening in North Korea. We want society to be as healthy as possible, but only allowing the like-minded to speak out undermines democracy.”
The website under study in Han’s research has since limited the number of up- and downvotes people can cast to three per day. Still, he says the lack of publicity around fake engagement is troubling.
“People know what fake news is, and they may be able to tell it apart from real news. But when they see particular opinions trending up or down, very few suspect that legitimacy. It’s important to educate the public about fake engagement, too.”
That’s especially true for younger people, he adds. “Younger generations were born with the internet, social media, and mobile phones. They’re social media natives,” he explains. “In that sense, they’re more susceptible to what other people think about a topic. The harm of an operation like Druking’s could be disproportionately more severe for younger people.”