Misinformation, spin, deceit – they’ve all been around since humankind stopped grunting and started speaking. But today the stew of readily available social media platforms, global reach, advanced algorithms, advertising channels, and unscrupulous fortune hunters has produced a unique news environment. Add in make-or-break elections and you get an information sphere like none other in history.

Election systems and processes are especially vulnerable to the spread of disinformation. This type of false content, including videos, infographics, memes, data and articles, can significantly diminish the effectiveness and credibility of election management bodies around the world.

To help election officials mitigate misinformation, Smartmatic has published the second edition of its handbook, “Protecting Elections in the Age of Fake News.” This document offers practical advice for preparing an electoral organization before a disinformation campaign triggers a communication crisis, as well as steps for managing one once it occurs. It also includes a glossary of terms that are useful when dealing with misinformation.

Below are five terms from that glossary that you should know when preparing a plan to manage and mitigate a communication crisis:

Bots (robots) are social media accounts operated entirely by computer programs and are designed to generate posts and/or engage with content on a particular platform. In disinformation campaigns, bots can be used to draw attention to misleading narratives, to hijack platforms’ trending lists and to create the illusion of public discussion and support.

Researchers and technologists take different approaches to identifying bots, using algorithms or simpler rules based on the number of posts per day.
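The “simpler rules” approach mentioned above can be sketched in a few lines of code. The sketch below is purely illustrative: the `Account` type, the thresholds, and the follow-ratio rule are all made-up examples, not the method of any real bot-detection system, which would rely on far richer behavioral and network features.

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_per_day: float
    followers: int
    following: int

# Hypothetical thresholds, chosen only for illustration.
POSTS_PER_DAY_LIMIT = 50    # sustained very high posting volume
FOLLOW_RATIO_LIMIT = 20.0   # following vastly more accounts than follow back

def looks_automated(account: Account) -> bool:
    """Flag accounts whose activity pattern resembles automation."""
    if account.posts_per_day > POSTS_PER_DAY_LIMIT:
        return True
    # Guard against division by zero for accounts with no followers.
    if account.followers and account.following / account.followers > FOLLOW_RATIO_LIMIT:
        return True
    return False

# A human-paced account is not flagged; a high-volume one is.
print(looks_automated(Account("casual_user", 4, 150, 180)))   # False
print(looks_automated(Account("spam_blast", 300, 10, 900)))   # True
```

Rules like these are cheap to run at scale but easy to evade, which is why researchers typically layer machine-learning classifiers on top of them.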

Deepfakes are fabricated media produced using artificial intelligence. By synthesizing different elements of existing video or audio files, AI enables relatively easy methods for creating ‘new’ content, in which individuals appear to speak words and perform actions which are not based on reality. It is likely we will see examples of this type of synthetic media used more frequently in disinformation campaigns, as these techniques become more sophisticated.

Fake followers are anonymous or imposter social media accounts created to give a false impression of another account’s popularity. Social media users can pay for fake followers, as well as fake likes, views, and shares, to give the appearance of a larger audience.

A Potemkin village is a sham organization – a company, research institute, or think tank – created to lend credibility to disinformation.

Trolling is the act of deliberately posting offensive or inflammatory content to an online community with the intent of provoking readers or disrupting conversation. Today, the term “troll” is most often used to refer to any person harassing or insulting others online.

The second edition of “Protecting Elections in the Age of Fake News” is available free of charge for electoral bodies to download and delve deeper into these and other terms and data related to disinformation. Even if election officials already have strategies to combat fake news, they have nothing to lose and everything to gain by downloading the handbook.
