“Falsehood flies, and truth comes limping after it.”
—Jonathan Swift
This 18th-century quote holds true to this day.
Ever since the invention and development of communication tools, humans have raced to disseminate information, news, and, more alarmingly, ideologies.
In the 21st century, information technology has grown at an unprecedented pace, thanks to the internet boom of the 1990s. Information has rapidly become accessible to the global population, transforming into a widely available weapon that enables quicker learning and a greater capacity for innovation.
The speed and ease with which information travels through communication technologies present a significant challenge to enterprises in the digital era. Digitization also has a darker side, one that manifests and thrives in inaccurate information, decontextualization, a lack of rigor, and the deliberate or unintentional spread of misconceptions.
While misconceptions are widespread and can occur in any discipline, they have become increasingly rampant with the advent of the internet. The fault, however, lies not with the technology itself but with the people who use it to create and spread disinformation.
How do we perceive information?
The human experience serves as the foundation for our theories about the world and everything we see in it. Our understanding of why something operates the way it does is confined to the narrow boundaries of the information readily available to us.
Naturally, we tend to overemphasize information that supports our beliefs and discredit information that throws our belief system out of equilibrium. Moreover, we do not always actively look for the facts when we are presented with new sets of information. Instead, our brains frequently take shortcuts; we seek patterns, familiarity, and comfort in the choices we make, even when faced with contradictory evidence.
Although easy and efficient, this comes at the expense of factual accuracy. When new knowledge does not readily integrate with preexisting mental schemas, cognitive dissonance arises, making it easier for misinformation and disinformation to spread practically unvetted.
The charm of narrative simplicity
Information can be viewed as a social phenomenon that is inextricably tied to human behavior and nature. The compelling urge to create knowledge and pass it on to others is an intrinsic human trait.
Now armed with access to digital infrastructure and unmatched reach, people can craft persuasive messages that resonate with preexisting opinions. Personal views are constantly repeated, reverberated, and reinforced, resulting in an echo chamber where people seek and find only information that reaffirms what they already believe. Smarter algorithms may even limit access to content that contradicts these views, reinforcing confirmation bias.
The dangers here are plentiful: echo chambers exacerbate extremism, cultural and social divisions, and polarized opinions. For example, fear of unconventional weapons, such as chemical, biological, and nuclear weapons, grew in the United States after the 9/11 attacks, and many attribute the US invasion of Iraq in 2003 to the culmination of this fear. The invasion was premised on the belief that Iraq possessed weapons of mass destruction (WMD).
However, WMD were never found, and the intelligence supporting their existence was later thoroughly debunked. Even so, public opinion polls from June 2003 revealed that about 23% of Americans continued to believe that Iraq had held a sizable stockpile of biological and chemical weapons, and although the CIA never discovered such weapons, a significant number of respondents believed they had been found and used in the war.
This is called the continued influence effect, which causes us to believe something long after it has been disproved. The spread of erroneous information and divisive rhetoric is only worsened when ideas compete for our finite attention spans.
This is a sobering illustration of how incorrect information can have devastating effects. For organizations, disinformation can sway consumers' purchasing decisions, creeping up in the form of hidden costs. Unsurprisingly, about 85% of consumers say they would not purchase from a business associated with misinformation.
How do we address this?
In an era where the digital paradigm is ubiquitous, disinformation and misinformation are among the most serious risks jeopardizing the success of organizations. However, organizations that actively engage with their clients and the public are more likely to be seen as credible when they attempt to debunk disinformation.
With the right management of information, organizations are more likely to create goods and services of higher value for customers, a crucial component of corporate competitiveness. The first step is to build trustworthiness: organizations must be seen as credible authorities in their domains. If a brand is considered transparent and trustworthy in handling confidential information, that perception acts as a natural barrier against disinformation.
Businesses should consider developing internal protocols or teaming up with governments to refine disinformation policies. Some organizations may find it advantageous to have resources completely focused on curbing the spread of disinformation, such as Facebook’s Information Warfare team or AP News’ Not Real News section.
An organization can also delegate information management by including it in the duties of the marketing, PR, communications, and compliance departments. With proper protocols in place, these teams can stay on the lookout for disinformation about the company.
Organizations are often popular targets for disinformation campaigns. By having clear protocols and response plans, being transparent and data-driven, and putting genuine effort into building trust, businesses can save their bottom lines and themselves from the devastating effects of disinformation.