In part one of this article, we defined cognitive biases as shortcuts our unconscious minds take to make decisions on our behalf, often to our benefit but sometimes not. We analyzed how some cognitive biases could be used to make our designs more effective for our users. In part two of the article, we will discuss how cognitive biases affect us designers directly and how we can keep some of these biases from impairing our work.
“Everyone knows that…”
One of the first biases we must account for is Anchoring: the tendency to rely too heavily on the first piece of information we receive when making later judgments. Once we learn something to be true, we anchor on the belief that it will never change and will always be true. This bias is particularly difficult to overcome because, at first glance, it seems we aren’t making any assumptions at all; we are basing our decision on established information. The assumption, however, lies in our unproven belief that the information is still applicable and true.
One example of this is the use of carousels in designs.
For a long time, it was believed that carousels shouldn’t be used because testing showed that users didn’t know how to interact with them and simply didn’t find them useful. That was true for years, and carousels were avoided. However, as people started interacting with different types of displays, such as touch screens, navigating by swiping sideways became ubiquitous. This made carousels more useful in modern designs, but designers still anchored on the old finding will have difficulty moving past their initial assumptions.
How do we counteract this bias?
Testing often with users gives us regular opportunities to check our assumptions against current realities and the in-context use of the product or service.
“If it ain’t broke, don’t fix it”
Another bias that affects us is System Justification: the tendency to defend and support the status quo. Existing social, economic, and political arrangements tend to be preferred and alternatives disparaged, sometimes at the expense of individual and collective self-interest. In other words, to keep things as they are, we make excuses for the faults of the current system and readily point out flaws in any alternative. This is a powerful bias because it taps into a universal fear of change, a fear of the unknown, and an aversion to questioning one’s core beliefs.
Applied directly to UX design, this bias can be summed up with the phrase, “if it ain’t broke, don’t fix it.” At first glance it seems like reasonable advice, but in design it means that the only thing that can trigger change is failure. The best we could hope for in our designs would be “good enough”; a better design, let alone the best design, would be unattainable.
An example of this is the difference between constantly repairing leaks on a boat out in the middle of the ocean versus designing a more efficient, waterproof boat. Both address the problem of the boat sinking, but in completely different ways and to vastly different ends. “If it ain’t broke, don’t fix it” keeps you reactive; if you want to innovate and rise above merely good enough, you must be proactive.
You do this by challenging the status quo and asking the hard question: even though something has historically been done a certain way and achieved some level of success, can it be done better? Contextual inquiry is a great tool for uncovering the workarounds users rely on to complete a task or use a system or process. These workarounds often hide the “broken” parts, and what stays hidden obscures the opportunity to innovate.
“Don’t fall in love with your designs”
Finally, we come to one of the toughest biases to break: Confirmation Bias. This is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values. It goes hand in hand with the Semmelweis Reflex, the tendency to reject new evidence or knowledge because it contradicts established norms, beliefs, or paradigms. In simpler terms, we only accept things that tell us we are right and reject things that tell us we are wrong.
For UX designers, these biases affect our work in many ways. One of those ways can be summed up by some advice new designers often get: “Don’t fall in love with your designs.” Although this sounds counterintuitive at first, this advice reminds designers that they should not love their designs so much that they’d be unwilling to change or improve them.
Overcoming these biases requires something as simple as it is difficult: putting one’s ego aside and accepting that we are not right all the time. If we can clear this hurdle, there are practical things we can do to make sure our designs are the best they can be!
How to Overcome Cognitive Biases
The military conducts an exercise called Red Team, Blue Team. The premise is that one team is tasked with devising a solution to a problem while the other team tries to find flaws in that solution, so it can be refined into something more elegant.
In practice this can take several forms, from design crits to various kinds of user research. If you take a Red Team, Blue Team approach, a key criterion is having a diverse group of members on each team, especially the team tasked with finding flaws in the solution.

In our scenario, the Blue Team, a multidisciplinary group, comes up with the initial design, conducts all the necessary research, presents a concept, and creates wireframes. Now we need a Red Team, and there are things we can do to get the most benefit out of the exercise. First, the Red Team should be independent of the Blue Team so it can review the designs objectively. The other major factor is choosing who is on the Red Team. It might seem logical to staff it entirely with UX designers, and if that is the only option available, you can still derive some benefit from their review. To truly get the most out of the exercise, however, you need a wide range of perspectives: not only designers but also developers, product managers, strategists, and other disciplines. Each views the designs through a different lens and can offer critiques that a review with only other designers, or even just your project team, may miss.

The importance of perspective also extends to age, gender, ethnicity, and other socioeconomic differences that give participants unique and valuable viewpoints. Making sure the team is inclusive goes a long way toward achieving the best results!
As we have seen, the fact that UX designers are aware of cognitive biases doesn’t mean they are immune to them. Overcoming these biases requires actively establishing safeguards, such as constantly testing our previous assumptions and setting up independent reviews of our work. To do this, designers must view getting something seemingly “wrong” as a normal part of the process, a chance to learn and improve, not a failure. Likewise, teams working with designers shouldn’t view results as a binary pass or fail; rather, they should objectively review the work to figure out what is working and what isn’t. The result of all this effort is better designs and better products!
Interested in more tips on how to overcome cognitive biases to create more effective designs? Let’s connect!