Published on February 26, 2022

This article series is here to help you hack productivity by optimizing the human factors that influence production and performance. The objective of this series, however, is to show that there’s no shortcut to an optimized worker. There are always systemic, cultural, and personal factors at play, influencing the quality and quantity of work accomplished. Although we don’t have a complete understanding of the conditions and constraints that influence every human’s mind, there are many lessons for leaders to integrate into their understanding of their workplace and those they lead.

When it comes to decision-making, it can be all too easy to go with the first answer that comes to mind or to make the choice because it feels right. With the amount of information that we have to consider each day, why wouldn’t we make decisions as fast and easily as possible? The reason that we shouldn’t rely on fast and easy choices is that our minds are prone to error and susceptible to cognitive illusions.

When it comes to time-sensitive and mission-critical judgements in the workplace, you’re likely already tempering your decisions against foolish mistakes, duplication of effort, or avoidable catastrophes. But decision-making can be complicated. Reaching a conclusive answer often depends on countless factors that are outside of our conscious awareness, which can obscure optimal or even satisfactory choices.

While Part 3 provided considerations for the systemic constraints your organization places on worker productivity, this entry explores some constraints of the human mind that can affect productivity. The lesson we’re focusing on in this entry is heuristics, the shortcuts our minds often take to make a judgement or solve a problem. Understanding the causes, common attributes, and applicability of heuristics will help you and those you lead make better decisions without being misdirected toward common cognitive errors.

What are heuristics?

Heuristics are useful “rules of thumb” that guide thoughts and behavior down efficient pathways and free up cognitive resources for more complex planning and problem-solving endeavors. This is especially useful in circumstances where some information might be missing or there is too much information to process in a reasonable amount of time.

Heuristics often arise as highly context-dependent strategies, emerging as good-enough solutions within certain circumstances, domains, or cultural expectations. Even when we have the necessary information to make a decision, heuristics can help us make satisfactory decisions under time constraints. Over time, some heuristics can develop into intuition or internalized knowledge through familiarity, experience, and mastery. A heuristic mindset can also be tapped into without a specific goal in mind to reveal creative solutions to conventional limitations.

All that being said, many heuristics have been shaped by evolutionary pressures over millennia, meaning they’re practically hard-coded into the brain as fundamental strategies for navigating everyday consciousness. Our understanding of others, ourselves, and the world is filtered through heuristics, meaning our ability to make “real-world” predictions is constantly being constrained by our mind’s tendency toward efficient solutions rather than optimal or well-reasoned ones. A glaring problem is that this preference for efficiency over optimization means that the accuracy of our predictions decreases as the number of data points, facts, or pieces of knowledge that must be considered goes up.

Understanding the constraints to cognition

Section preface: This section acts as background on some cognitive science concepts to inform the discussion about heuristics. If you have familiarity with psychology, neuroscience, or cognitive science, you can likely skip ahead to the next section on cognitive biases.

Humans have an impressive ability to make predictions, solve problems, and apply reason to a wide variety of circumstances. The human mind, however, is prone to logical flaws and experiential illusions. Thinking about how you think (metacognition) is a necessary process in identifying gaps and errors in your intellectual and experiential repertoire, but this requires significant and deliberate intent. Consider the following mistakes people often make when thinking about cognition and how the mind operates.

Mistake 1: Thinking that the mind works like a computer 

Talking about consciousness can quickly broach into metaphysics and philosophy. The elusive and hard-to-describe nature of consciousness can easily lead one astray from truly comprehending the pervasive conditions that affect their perception, awareness, and sense of self. Somewhat ironically, one popular but problematic way to ground the discussion of minds and consciousness is to use the “mind as computer” metaphor.

Despite similar language being colloquially used to describe computers and the mind, organic consciousness is not reducible to definitive physical processes or pure logical calculations. Not yet, at least. There are too many variables to derive a definitive understanding of consciousness.

While at first blush it seems like we can compare the mind and brain to software and hardware, doing so uncritically can mislead one’s understanding of themselves and others. One illustrative example of why this metaphor isn’t great is to make comparisons about memory. Humans have a few kinds of memory: 1) sensory memory, 2) short-term memory, 3) working memory, and 4) long-term memory. On the computer side, you can correspond these with 1) raw content from peripherals like cameras and microphones, 2) RAM/ROM, 3) CPU, and 4) HDD/SSD.

  1. Sensory memory includes the automatic and very brief register of raw sensory data. Unlike peripherals, our senses like hearing and vision do not provide “perfect” captures of information in the same way a microphone or camera would. For example, when we place particular expectations on our senses, we prime our minds to detect that particular aspect of sensory data more readily. Similar to recording without a memory card, any unattended sensory data is promptly forgotten.
  2. Short-term memory is where information is temporarily held at the ready before being forgotten or stored in long-term memory through rehearsal or integration. Short-term computer memory, RAM, has similar properties to short-term human memory in that it is temporary and volatile, but unlike RAM, mental operations cannot be performed on information held in short-term memory (in that sense, short-term memory is like ROM).
  3. Working memory includes a limited capacity to briefly hold information while performing mental operations on it. Working memory can draw from short-term memory and long-term memory, but has limited capacity and requires focus to maintain mental operations. Somewhat like a CPU, working memory is related to the executive control of other functions in the mind (such as the inner eye, inner ear, and inner voice) as well as the differentiation and integration of data. Unlike a CPU, however, working memory has difficulty with multitasking.
  4. Long-term memory includes the storage of persistent information and memories, making them potentially accessible for an entire lifetime. However, both the formation and recall of long-term memories is aided and inhibited by factors such as sensory data, emotional disposition, environmental conditions, distractions, and contextual clues. Furthermore, when a person retrieves a memory from long-term storage, they are not pulling a perfect copy. They’re using contextual details from that experience to recreate an interpretation of that experience in the present moment, which can lead to the degradation of a memory’s accuracy over time. Long-term computer memory, on the other hand, is mostly lossless; encoding, storing, and retrieving data tends to result in precisely the same information preserved throughout.

Mistake 2: Believing that we are consistently rational

The mind is constantly constructing internal models of concrete and abstract objects, and we attend to these mental models and manipulate them to make more plausible predictions and construct a better understanding of ourselves and the world around us. However, as shown in the discussion about memory, there’s a computational limit to how much we can consider at once.

A common decision-making strategy we use to navigate our computational and attentional limits is “satisficing,” a term coined by Herbert Simon, the American scientist and Nobel laureate. Satisficing is an aspect of Simon’s concept of “bounded rationality,” where humans tend to reach for a satisfactory or adequate result rather than the optimal solution. There are two main ways of satisficing: 1) avoid expending time, energy, and resources on finding the perfect decision and instead accept the first “good-enough” choice or outcome, and 2) make it easier to find a solution to a problem by narrowing the scope of complexity or broadness that needs to be considered for that problem.
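
The contrast between satisficing and exhaustive optimization can be sketched in code. This is a minimal, illustrative sketch: the options, their quality scores, and the 0.7 aspiration threshold are hypothetical values chosen for the example, not drawn from Simon’s work.

```python
# Satisficing vs. optimizing over a set of hypothetical options,
# each scored on an illustrative 0-1 quality scale.

def optimize(options, score):
    """Exhaustive search: evaluate every option, return the best one."""
    return max(options, key=score)

def satisfice(options, score, aspiration=0.7):
    """Bounded search: accept the first option meeting an aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option  # stop searching; "good enough" wins
    return options[-1]  # nothing met the bar; settle for the last considered

options = ["A", "B", "C", "D"]
quality = {"A": 0.4, "B": 0.75, "C": 0.9, "D": 0.6}
score = quality.get

print(optimize(options, score))   # evaluates all four options -> "C"
print(satisfice(options, score))  # stops at the first acceptable one -> "B"
```

Note that the satisficer returns “B” after examining only two options, while the optimizer must score all four to find “C”. The gap between the two answers is the price paid for the saved effort, which is exactly the trade-off bounded rationality describes.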

Simon later went on to theorize about “attention economics” and explained: “In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.”

An emphasis on the attention economy has grown exponentially over the decades, and now we are all-too-familiar with the constant vying for attention that we’re subjected to online and offline. This poverty of attention is increasingly an issue for workplaces, as the slow, careful consideration needed for many tasks and roles is being eroded by the attention economy. What’s more, the effort needed to deliberately ignore distractions makes it increasingly likely that we make weaker and weaker satisficing decisions.

Mistake 3: Over-relying on answers that come quickly

Another popular way to think about cognition and decision-making is to look at System 1 and System 2, as described in the book “Thinking, Fast and Slow” by Daniel Kahneman.

This dual-system paradigm isn’t perfect, but it illustrates a useful categorical distinction between conscious and unconscious experiences. For many people, the default in most situations is System 1, which is described as fast, automatic, frequent, emotional, stereotypic, and unconscious. Kahneman warns against trusting the fast, intuitive answers that System 1 can generate. System 1 can be trained and tempered by experience, but it is often afflicted by cognitive biases (more on biases in the next section).

On the other hand, System 2 is described as slow, effortful, infrequent, logical, calculating, and conscious, but that doesn’t mean it’s always correct. In fact, System 2 can often be employed to provide seemingly sufficient rationalizations for faulty System 1 conclusions. Even if you’re aware of the common mistakes that System 1 makes, the conscious awareness employed by System 2 is subject to your familiarity, experience, and sense of self.

 

Identifying and defending against cognitive bias

Problem: When looking back to rationalize our actions, we often try to find reasons to justify our experiences. But when we slow down to really think things through, we might realize that we’ve made leaps or faults in logic, snap judgements, or hasty rationalizations. It can feel like we’re constantly dealing with the unwanted consequences of unintended choices. What’s more, even if we’re made aware of our mistakes when we use slow, deliberate thinking, there isn’t a guarantee that our moment-to-moment experiences will support clear realizations. This raises the question: why is it so hard to think clearly in the moment?

The human brain consumes, on average, about 20% of the body’s total energy expenditure. That is a massive energy cost for its size, so the brain’s evolutionary pressures have kept its energy overhead down by preferring shortcuts when making judgements and decisions. These mental shortcuts, which we established earlier as heuristics, help us to arrive at solutions to problems quickly, but can easily lead to inaccuracies.

What happens when heuristics break down or are misguided? The result is varying forms of cognitive bias. There’s a long list of cognitive biases to consider, with different biases affecting memory, beliefs, social interactions, moral judgements, and more. It’s best for leaders to be aware of common behavioral and information-processing shortcuts that can result in perceptual and logical errors. Here are a few cognitive biases you should be defending against:

  • Status quo bias: Preferring that things remain the way they are, and perceiving any changes to the baseline as a loss. This bias is informed by a few other psychological effects, including:
    • Loss aversion bias (“losses loom larger than gains”) and psychological inertia (the tendency to avoid intervention due to a lack of incentive)
    • The mere exposure effect (a greater preference for things that seem familiar over things that are uncertain or hard to interpret, even if a person was only briefly exposed to that thing in the past and can’t consciously recall having been exposed to it)
    • The endowment effect (the value a person places on something is dependent on whether they’re already in possession of that thing: the most someone is willing to pay to acquire something is usually lower than the lowest amount someone will accept to part with it)
  • Authority bias: We more easily trust and are swayed by the opinions of authority figures, even if that authority is presumed.
  • False consensus: The tendency to assume that most other people agree with you, therefore you think your beliefs represent the majority consensus.
  • Belief bias: The tendency to judge the strength of an argument by how plausible we believe its conclusion is rather than how well it supports that conclusion.
  • Confirmation bias: The tendency to remember and seek out information that confirms our perceptions and beliefs, rejecting evidence to the contrary.
  • Sunk cost fallacy (aka escalation of commitment): Sunk costs are any costs that can’t be recovered or refunded. The sunk cost fallacy is the irrational justification (or escalation) of future investments or commitments in an attempt to recover sunk costs.
  • Availability heuristic: Relying on how easy it is to come up with an example or recall a memory (i.e. how available it is) to estimate the likelihood of something being in the realm of possibilities. The fault in relying on examples from memory is that it’s easier to recall unusual, emotionally charged, or vivid memories. Common, unexciting, and mundane experiences are much more likely to happen than our most memorable examples would suggest.

Heuristics can help people quickly and creatively reach answers, but they come at the cost of clear logic and robust reasoning. The more critical and complex a decision is, the less you should unquestioningly rely on the first answer that comes to mind. What’s more, even being aware of cognitive biases isn’t a sufficient defense against them. They’re often systemic and pervasive—and having them on your radar without sufficient checks against them or a way to keep your mission-critical decision-making grounded could even lead to a false sense of security and an over-confidence in one’s rational abilities.

Takeaway 1: Using algorithmic thinking, such as step-by-step processes and checklists, can defend against the well-intended but incorrect predictions made by you and those you lead.

Takeaway 2: Making lasting, positive changes to an organization is an uphill battle due to status quo bias, false consensus, confirmation bias, sunk cost fallacy, and other cognitive biases.

Takeaway 3: Give workers the space they need to propose alternative and dissenting views, then—as much as possible—actually consider the proposals without preconceived notions.
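
Takeaway 1’s checklist idea can be sketched as a simple decision gate: a decision proceeds only when every explicit check passes. This is a hypothetical illustration; the specific checks and the `ready_to_decide` helper are assumptions for the example, not a prescribed process.

```python
# A checklist as algorithmic thinking: each entry pairs a question with an
# explicit, mechanical test. The checks below are illustrative examples.

CHECKLIST = [
    ("Considered at least two alternatives?",
     lambda d: len(d["alternatives"]) >= 2),
    ("Sought disconfirming evidence?",
     lambda d: d["counterevidence_reviewed"]),
    ("Reviewed by someone not invested in the outcome?",
     lambda d: d["independent_review"]),
]

def ready_to_decide(decision):
    """Return the questions that failed; an empty list means proceed."""
    return [question for question, check in CHECKLIST if not check(decision)]

decision = {
    "alternatives": ["plan A", "plan B"],
    "counterevidence_reviewed": True,
    "independent_review": False,
}
print(ready_to_decide(decision))
# -> ['Reviewed by someone not invested in the outcome?']
```

The value of the checklist isn’t the code itself but the discipline it encodes: each check is explicit and repeatable, so a hasty System 1 judgement can’t quietly skip a step.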

Promote intrinsic motivators to improve heuristic tasks 

Problem: 21st century work is increasingly reliant on knowledge workers, and knowledge workers depend on heuristics to get their work done. While algorithmic thinking—which can be slow and cumbersome but less error-prone than heuristics—is great for repeatable tasks, tasks that demand intuition and creativity require time and space to iterate through possible solutions. However, the incentivization structures found in many workplaces do not support heuristic thinking, which can cause a fundamental breakdown in the productivity of creative and knowledgeable individuals.

We went in-depth about this topic in a previous piece, “Motivation, management, and mastery.” The key takeaways for motivating heuristic work are as follows:

  • Algorithmic tasks—which have a simple set of repeatable rules and clear, well-defined outcomes—largely defined 20th century work.
  • As work that defined the 20th century has become more reliably outsourced, automated, or solved by computer algorithms, 21st century work is increasingly filled with heuristic tasks, which are complex and cognitively demanding.
  • Extrinsic motivators (e.g. “carrots and sticks”) narrow focus and improve concentration, which increases output and productivity on algorithmic tasks.
  • When it comes to heuristic work, however, extrinsic motivators backfire because the narrow, concentrated focus that they foster is exactly the opposite of the expansive, diffuse mindset needed to perform novel, creative work.
  • Therefore, managers need to foster the factors that increase intrinsic motivation: challenge, curiosity, control, cooperation & competition, and recognition.

Takeaway: Promote productivity by identifying the incentive structures within your organization and checking them against the kinds of work your employees are tasked with. Extrinsic motivators work well for algorithmic tasks but impair performance on heuristic tasks, while intrinsic motivators do the reverse. Understand those differences and employ motivators that suit the task at hand.

 

Jumping to the conclusion

There’s no simple solution for improving the efficacy of heuristics. Many heuristics are context-dependent and affected by familiarity, making it hard to recognize and set straight the ones that are off-base. But you can also set yourself and others up to benefit from heuristics by ensuring that the mechanisms of competency, collaboration, and culture in your organization provide grounded and realistic baselines for decision-making, predictions, and interpersonal relations. Documented rules, procedures, and guidelines can and should be used to inform complex decisions.

As a leader, you also need to be cognizant of the limits of attention and awareness. Nowadays, maintaining an engaged workforce means that you’re not only trying to navigate the preferences of your workers, but you’re helping them contend with constant distractions. Corporations and media are vying for their attention, which means you shouldn’t be contributing to this problem from within the organization. As much as possible, avoid loading the workday with meetings, messages, and mandatory busywork. Many jobs require some downtime to promote deep thinking and to provide space to double-check decisions made through heuristics.

In Part 5 of this series, we’ll be taking our awareness and turning it inward towards feelings, emotions, and personality. We’ll also look at how leaders can employ empathy to build trust and maintain healthy working relationships with the people that they lead. If you still believe that “emotionality” is unprofessional or doesn’t belong in the workplace, then we have some valuable tips that might change your mind.

 

This is Part 4 of a series. Part 1 is on personal productivity and self-improvement. Part 2 is on competency, collaboration, and culture. Part 3 is on organizational wisdom. Part 4 is on heuristic cognition. Part 5 is on empathy and emotional intelligence. Part 6 is on physical and social needs.
 

 

Nick Glavor

Enterprise Analyst, ManageEngine

Nick Glavor is an enterprise analyst at ManageEngine. He approaches writing and thinking about technology from an interdisciplinary perspective, with an eye for psychology, cognitive science, game theory, transhumanism, and cybersecurity.

He has experience in copy editing, field marketing, and brand evangelism. His experience at ManageEngine has led Nick to believe that companies can be profitable and future-oriented, while maintaining the humanity of their workforce.

Nick holds a B.S. in Cognitive Science from University of California, Santa Cruz.
