The world we live in today is built on mass data collection and processing. At some point, companies discovered that with a large enough dataset and the right handling, they could uncover incredibly valuable insights about their customers and capitalize on them. For those of us who depend on modern technology in our daily lives, it feels like nearly everything we do is tracked and measured in some way. We now have an overwhelming supply of data coming in from every direction, and ultimately all of that information must pass through a single processing point: our brains. The world of information is growing rapidly, and the human brain is doing everything it can to adapt and grow along with it, but it can only do so much.
What are cognitive biases?
Enter cognitive biases. Ruhl of SimplyPsychology (2021) describes cognitive biases as “unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.” She explains that these errors “result from our brain’s efforts to simplify the incredibly complex world in which we live.” In other words, as we are continually presented with new information, our brains try to process it efficiently by making assumptions that help it all make sense, some of which turn out to be wrong.
How do cognitive biases show up in software engineering?
In the context of software engineering, where technology advances at an exponential rate, engineers are required to learn quickly so they can maintain and update the systems they build upon. To sustain a productive pace, making assumptions becomes inevitable, because knowing anything with certainty takes time, and time is limited. It is therefore part of an engineer’s job to identify and carefully consider the assumptions being made. This typically happens during the solution design process and is accompanied by an analysis of the potential cost of a failed outcome. The difference between assumptions made in this manner and those resulting from cognitive biases, however, is that the former are conscious while the latter are unconscious. To prevent the latter, we first need to be aware that they exist.
For an engineer, being able to identify one’s own cognitive biases is a powerful skill, as it helps prevent costly mistakes. However, building this skill may not come naturally, since it is grounded more in psychology than in engineering. Thankfully, a considerable amount of research has been done in this area and is publicly available. Organizations can use it to design and implement development practices that help engineers avoid these mistakes.
Chattopadhyay et al. (2020) from Oregon State University conducted a study, A Tale from the Trenches: Cognitive Biases and Software Development, with developers from a startup whose work experience ranged from 1 to 23 years. The researchers observed each developer performing their daily work tasks while thinking aloud; essentially a pair programming exercise, except with a silent researcher instead of a fellow developer. After observing 10 developers, they compiled a list of cognitive bias categories and example mistakes that resulted from them, shown in the table below.
| Bias Category | Bias(es) | Example |
|---|---|---|
| Preconceptions | Confirmation, Selective perception | P1 continually added hashmaps when other data structures were better suited for data query APIs (see the sketch after this table). |
| Ownership | IKEA effect, Endowment effect | P8 decided to reuse her old CSS file instead of the pre-made CSS files from the Bootstrap project. |
| Fixation | Anchoring and adjustment, Belief preservation, Semmelweis reflex, Fixation | P9 fixated on changing the function definitions when the environment just needed to be reloaded. |
| Resort to default | Default, Status-quo, Sunk cost | P2 opened a new code file and kept unused template code at the top of the file. |
| Optimism | Valence effect, Invincibility, Wishful thinking, Overoptimism, Overconfidence | P4 was proud of his new aggregated map code, but it produced an error after it was printed. |
| Convenience | Hyperbolic discounting, Time-based bias, Miserly information processes, Representativeness | P2 wrote overly verbose code that addressed his current needs but became spaghetti code that slowed future progress. |
| Subconscious action | Misleading information, Validity effect | P6 focused on fixing the files listed in error messages instead of the core dependency file causing errors throughout the system. |
| Blissful ignorance | Normalcy effect | P10 disregarded all compiler warnings out of habit and failed to notice a new exception detailing the cause of his build failure. |
| Superficial selection | Contrast effect, Framing effect, Halo effect | P4 copied and pasted a function from his documentation directly into his code without examining it first. |
| Memory bias | Primacy and recency, Availability | P1 reused a design pattern that worked well on recent tasks, since he could easily recall the structure of the code. |
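To make the Preconceptions row a bit more concrete, here is a minimal Python sketch of that kind of mistake. The data, function names, and query are invented for illustration and are not taken from the study; the point is simply that a hashmap answers exact-key lookups well, while a range-style query is better served by a sorted structure.

```python
import bisect
from datetime import datetime, timedelta

# Hypothetical event log: (timestamp, message) pairs. Invented data for
# illustration only; not taken from the study described above.
events = [(datetime(2021, 1, 1) + timedelta(minutes=i), f"event {i}")
          for i in range(1_000)]

# Hashmap by default: a dict keyed by timestamp only answers exact-key
# lookups, so a range query still has to scan every entry.
events_by_time = {ts: msg for ts, msg in events}

def range_query_dict(start, end):
    return [msg for ts, msg in events_by_time.items() if start <= ts <= end]

# A structure suited to the query: keep the timestamps sorted and
# binary-search the boundaries, so only the matching slice is touched.
timestamps = [ts for ts, _ in events]   # sorted by construction
messages = [msg for _, msg in events]

def range_query_sorted(start, end):
    lo = bisect.bisect_left(timestamps, start)
    hi = bisect.bisect_right(timestamps, end)
    return messages[lo:hi]

if __name__ == "__main__":
    start = datetime(2021, 1, 1, 1, 0)
    end = datetime(2021, 1, 1, 2, 0)
    assert range_query_dict(start, end) == range_query_sorted(start, end)
    print(f"{len(range_query_sorted(start, end))} events in range")
```

The dict version isn’t wrong, but it quietly bakes a full scan into every query. Noticing that the access pattern is a range rather than an exact key is exactly the kind of assumption worth making consciously rather than by default.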
Understanding bias to disrupt a vicious cycle
I definitely recognized some of these mistakes from my own experiences as a developer. If someone had asked me why I made them, before I knew what cognitive biases were, I would probably have said, “because I thought this was the right way to do it.” When it comes to unconscious behavior, it’s difficult to understand where it comes from, or even that it exists. It seems innocent and harmless in the moment, but over time these mistakes can lead to serious negative impacts as bad habits form and become difficult to break. Left unchecked, an increasing number of bugs are introduced into the system, and resources are drained into fixing them, sometimes incorrectly as well: a vicious cycle.
Having an outside perspective, like a silent researcher, makes it a lot easier to spot these behaviors. It’s even more helpful if the observer knows these behaviors exist and has permission to point them out when they appear. Since these mistakes ultimately affect the business, it is up to the organization to understand the costs that come with producing at a fast pace. That understanding can justify investing in proper procedures and training. Although I’ve only focused on how cognitive biases appear in software development, one can imagine how they arise in other departments as well. Essentially, anyone who wants to be productive in their work can benefit from understanding cognitive biases.
Interested in furthering your career within team cultures like this? Check out our open positions below to be the first to know about new opportunities.