It’s becoming more and more difficult to tell whether online content is real or fabricated, especially content shared on social media. Part of the problem is that disinformation campaigns, in which people deliberately spread misinformation with the intent to deceive others, are on the rise. Technology amplifies the problem by making it easier and faster than ever to spread these falsehoods.
So what can we do? One of the first steps we can take to combat this rise in misinformation is to learn how bad actors use disinformation strategies to persuade others. By knowing what to look for, we can be better prepared to spot misinformation when we see it in action.
General Strategies
The following are seven general disinformation strategies that are commonly used to deceive and persuade.
Discrediting Opponents
According to an article by Spencer Feingold for the World Economic Forum, disinformation campaigns often attempt to discredit experts or authorities who support an opposing viewpoint. They might do this by spreading false accusations, by connecting these authority figures to conspiracy theories, or by painting them as corrupt. If they can taint the opposition’s credibility, they can make their own false claims seem more believable.
Impersonation and Deepfakes
This strategy involves taking on the appearance or voice of a real person and using that identity to spread false information. It might mean making up a quote and falsely attributing it to someone, or it could mean using technology to create a deepfake: a video, picture, or audio recording of someone that has been manipulated to mislead an audience. These videos or voice recordings can make it appear as if someone said something that they never actually said.
Fake Personas
This is different from a deepfake. With a deepfake, someone takes a real person’s likeness and manipulates it; with a fake persona, a new person is simply invented out of thin air. These fabricated personas might then be labeled as experts or tagged with false credentials. They might even be inserted into staged photographs that appear to place them in official or important locations. Fake personas are used to convey a sense of authority and to convince an audience that their misleading claims or points of view have merit. They’re not real, but they seem convincing and authoritative.
Emotional Manipulation
The American Psychological Association (APA) points out that people are much more likely to react to and share content that elicits a strong emotional response. Content that makes someone angry or fearful is especially effective.
Division and Polarization
Disinformation campaigns often try to divide people into conflicting and polarized groups. These divided identities might be defined by criteria such as political party, class, or race. In a commentary published by the Brookings Institution, Mathias Osmundsen, Michael Bang Petersen, and Alexander Bor describe this approach, calling the spread of fake news “a symptom of our polarized societies,” where people actively seek out information that affirms their point of view, regardless of its authenticity or accuracy.
Conspiracy Theories
A conspiracy theory is a belief that a secret plot is being carried out by powerful people to accomplish some sinister goal. Dr. Karen Douglas, a professor of social psychology at the University of Kent, says in a podcast episode with the APA, “People are drawn to conspiracy theories . . . in an attempt to satisfy three important psychological motives.” These motives are a need for knowledge and truth, a need to feel safe and secure, and a desire to “feel good about themselves as individuals and also feel good about themselves in terms of the groups that they belong to.” People engaging in conspiracy theories often feel like they are finding a collective truth that satisfies these needs. Because conspiracy theories can be so powerful, disinformation agents may devise and amplify a conspiracy theory in order to advance their agenda.
Misleading Headlines
This is the practice of putting misleading headlines on a story or article. The headline, visual, or caption does not support or align with the content of the article. For the many information consumers who never read beyond the headline, the headline itself becomes the misinformation. Those who do read the article may be subconsciously influenced by the headline, which, in turn, may shape how they interpret the content.
Social Media Strategies
In addition to the seven general persuasion strategies described above, there are other approaches used more specifically on social media platforms. These techniques often build on the seven general strategies and activate them in more targeted ways.
Targeting
The World Economic Forum describes this approach as the process of analyzing social media accounts in order to specifically target, or direct posts toward, people who are most likely to believe the content and also amplify it through resharing. People might be targeted because of their affiliation with a political party or an economic demographic. Advertisers do this with products, and disinformation agents do it with ideas and false messaging.
Astroturfing
Astroturfing involves posting overwhelming amounts of content from fake accounts, making it artificially appear that a specific point of view has more grassroots support than it actually does. Appearance can pass for reality, and if a lot of people appear to support a specific point of view, others may be persuaded that the seemingly popular perspective and content have merit.
Flooding
This strategy involves flooding social media platforms with posts and comments to drown out other perspectives through sheer quantity. While similar to astroturfing, this approach is less concerned with the perceived grassroots source and more focused on the overwhelming volume of posts.
Leveraging Influencers
The goal of this approach is to get a prominent or influential person to share misinformation. When a well-known and respected person posts content, that post not only amplifies the message to their many followers but also lends the content credibility by association. This strategy is supported by the “power law” of social media, which holds that “messages replicate most rapidly if they are targeted at relatively small numbers of influential people with large followings.” In other words, when a well-known person shares misinformation, it can spread very quickly.
Bots and Trolls
Both of these approaches involve insincere and misleading posts on social media accounts that are intended to persuade or influence perspectives. Bots are automated computer programs that typically target like-minded audiences who are already inclined to believe their talking points, which makes these recipients easier to convince. However, bots are also easier for social media monitoring algorithms to detect and, in turn, more likely to be removed from the platform.
Trolls, on the other hand, are real humans who join online conversations. Like bots, they are there to promote specific viewpoints or to spread misinformation. Because they are human, however, trolls are harder for algorithms to detect, and they may also be able to convince more skeptical social media users by interacting with them personally and responding directly to their posts or questions.
Being aware of these persuasion strategies can be empowering for adults and students alike. Awareness can help us all identify disinformation when we see it in action and can potentially prevent us from being misled into believing information that is untrue.
AVID Connections
This resource connects with the following components of the AVID College and Career Readiness Framework:
- Instruction
- Rigorous Academic Preparedness
- Student Agency
Extend Your Learning
- Disarming Disinformation: The 5 Tactics That Threaten Internet Spaces (Young African Leaders Initiative)
- One Strategy Democracies Should Use to Counter Disinformation (Carnegie Endowment for International Peace)
- Tactics of Disinformation (Cybersecurity & Infrastructure Security Agency)