Countering disinformation
AI-powered OSINT analysis at scale
The world is divided. Societies are divided. At the core of the divide is the inequality experienced by those who feel left behind, and the lack of empathy shown by many on the more fortunate side. These postures have given way to a polarizing “us vs. them” mentality, and to what seems to be the end of productive dialogue.
The victims’ sense of abandonment is confirmed by the disproportionate share of the burden, whether real or perceived, that they bear. We saw this in images of trains packed with workers commuting to their jobs at the peak of the pandemic. For those with comfortable jobs who could work remotely, the pandemic imposed mere inconveniences; for some, it even brought the welcome comfort of working from home. For those who make a living in close contact with others, the situation has been extremely stressful, to say the least. The experience of being hit harder by the same events than other sectors of society has eroded trust: trust in the very institutions that could help improve conditions, and trust that people had built with each other.
Bringing about the type of change needed to fix injustice is a lengthy process. Policies that are not carefully considered will miss nuance and may lead to negative outcomes. Actions that introduce abrupt change may produce more, not less, discontent. Systemic change is a long-term process that requires patience and support from society, and this type of protracted transition is susceptible to occasional setbacks. The global pandemic provides another example: when vulnerable sectors voicing legitimate concerns had their messages conflated with the violent rhetoric of fringe groups, they were met with indifference and repression. Not only were their demands unmet; their sense of alienation was reinforced.
Disinformation goes viral
Low production and dissemination costs make disinformation ubiquitous
The role of disinformation
This chaos is fueled by disinformation campaigns, architected by actors who have little interest in the needs of the vulnerable. Quite the opposite: manipulators amplify the victims’ discontent by feeding it with disinformation that confirms their sense of victimhood. Disinformation merchants target the vulnerable and push their own agenda through them. The biggest tragedy is perhaps that those who need change the most are exploited to advance agendas that will likely worsen their situation.
Although disinformation has existed throughout human history, two emerging factors now contribute to its ubiquity and damaging power: the ease with which forgeries can be produced, and the low effort required to disseminate them.
Technology provides the ability to create convincing deceptive content at scale. Using artificial intelligence, perpetrators can fabricate realistic videos showing authority figures delivering divisive messages. Language-generation models enable malicious actors to write pieces that imitate the style of falsely attributed authors. In many cases, imitation may not even be necessary: putting out unfounded claims may suffice if they confirm an existing belief.
The second factor is how modern digital infrastructure enables disinformation to spread. Social networks are designed to promote controversial content to those likely to agree with it, and they make it very easy for anyone, person or bot account, to push such content to friends and followers. At the same time, information echo chambers create a feedback loop that blocks contrasting, and potentially enlightening, views from getting in.
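Coordinated amplification often leaves a simple trace: many accounts pushing near-identical text within a short window. As a minimal sketch (assuming posts are available as plain account/text pairs; all names and thresholds here are hypothetical), near-duplicate messages can be flagged with word-shingle fingerprints and Jaccard similarity:

```python
from itertools import combinations


def shingles(text, n=3):
    """Word n-grams used as a crude fingerprint of a message."""
    words = text.lower().split()
    return {" ".join(words[i : i + n]) for i in range(len(words) - n + 1)}


def jaccard(a, b):
    """Overlap between two shingle sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0


def flag_coordinated(posts, threshold=0.6):
    """Flag pairs of accounts posting near-identical text, a common
    amplification signature. `posts` is a list of (account, text) pairs."""
    flagged = []
    for (acct1, text1), (acct2, text2) in combinations(posts, 2):
        if acct1 != acct2 and jaccard(shingles(text1), shingles(text2)) >= threshold:
            flagged.append((acct1, acct2))
    return flagged
```

This pairwise comparison is quadratic in the number of posts; at real social-media volumes, production systems typically approximate it with MinHash and locality-sensitive hashing instead.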
The ease with which disinformation is created and disseminated acts as a catalyst for a deeper fracturing of society. Trust must be restored if societies are to tackle their biggest challenges: solving inequality itself, and dealing effectively with the events that fall hardest on the most vulnerable. Restoring trust requires both timely action and long-term resiliency. Mitigating disinformation attacks requires institutions to quickly spot and shut down malicious campaigns; building long-term resiliency requires investment in systems designed to promote the truth.
Disinformation erodes trust
Without trust, societies divide, and resolve weakens
How to fight back
To counter disinformation, organizations need tools that let them sift through large amounts of data to identify and stop attacks quickly and effectively. These tools need to surface deceptive discourse and detect orchestrated efforts to promote it. They must provide mechanisms to organize information in ways that let analysts intuitively understand what is being peddled, and they must help analysts gauge the impact of campaigns, as measured by the campaigns’ reach and the effect they have on their targets.
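To make “impact” concrete, two coarse signals an analyst might compute are temporal burstiness (orchestrated pushes tend to spike within a narrow window) and potential reach (an upper bound on audience size from follower counts). A minimal sketch, assuming each post carries an account name, a follower count, and a timestamp; the `Post` shape is hypothetical, not any particular platform’s API:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    account: str        # handle of the posting account
    followers: int      # follower count of that account
    posted_at: datetime


def burst_score(posts):
    """Peak hourly volume divided by mean hourly volume. Values well
    above 1 hint at a coordinated push rather than organic spread."""
    hours = Counter(
        p.posted_at.replace(minute=0, second=0, microsecond=0) for p in posts
    )
    return max(hours.values()) / (len(posts) / len(hours))


def potential_reach(posts):
    """Upper-bound audience: each unique account's followers counted once."""
    return sum({p.account: p.followers for p in posts}.values())
```

Neither number is conclusive on its own, but together they let an analyst rank candidate campaigns for closer inspection.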
To build long term resiliency, institutions must lift the veil on disinformation campaigns.
Organizations must raise awareness among the public of how disinformation is creeping into the social fabric and perverting legitimate dissent. An example of resiliency-building is the work by discursus, a project that uses open-source intelligence to monitor protest movements and the health of the debate during those events. Systems like these can help institutions and everyday citizens understand how much of what happens on the streets is fueled by legitimate demands, and how much by disinformation.
Novacene develops technology to help analysts surface insights hidden in growing volumes of unstructured data. To discuss collaboration opportunities, please contact us.