There are some fascinating discussions on this important topic (thanks to Mitchell Winkler for initiating it). As we understand it, cognitive bias is a person's subjective reality, resulting from his or her own pattern of thinking, that may not be in sync with the characteristics of the object or problem at hand. It is a systematic pattern of deviation from rational judgment. In the context of social relationships, one can say that cognitive bias represents the observer's state of mind – his or her mindset or conviction – which may have nothing to do with the observed.
Let us attempt to see the issue of cognitive bias from a different perspective – in terms of solving a problem, engineering or otherwise. In such cases, an individual invariably starts by searching for tools in his or her toolbox, which may contain at least three:
- Past experiences that lead to the development of one's intuition – often termed rules of thumb, heuristics, common sense, or simply gut feeling. These can be very powerful, especially in the preliminary phases of a project, and when the tools of experience fit the requirements. Many routine and well-established engineering projects fall into this category.
- Standards, guidelines and design codes, developed by experienced professionals using experience, principles and theories. They are intended to provide general guidance, and they come with a disclaimer because they are not based on the site-specific conditions of a particular project. But if certain codes are adopted within a jurisdiction, an engineer is legally bound to check against them as a minimum requirement.
- The third is very important for large and complicated projects that require more than routine procedures. Here, all-out efforts combining research, advanced computational routines and experiments are needed, because depending on rules of thumb or design codes alone may simply prove inadequate and too risky.
Whatever tools an engineer chooses, his or her common-sense judgment is always required – steeped not in cognitive bias, but in seeing and comprehending the problem or project as it is.
Often, we become so involved in something that we forget to see a problem in simple terms. I remember one of the giants in coastal engineering, R. G. Dean (1931–2015), asking a simple question of a presenting author during a conference session. The author's presentation was highly elaborate in formulae and mathematics – and he was so consumed by them that a simple physics-based question proved very difficult for him to answer.
Dilip
------------------------------
Dr. Dilip Barua, Ph.D, P.Eng, M. ASCE
Vancouver, BC, Canada
https://widecanvas.weebly.com
------------------------------
Original Message:
Sent: 02-22-2021 11:41 AM
From: Samuel Ng
Subject: Cognitive Biases
Mitchell's phrase, "... many individuals who were always quick to knock the ideas of others," is a good summary of human behavior regardless of profession. I would describe that behavior as "They don't know what they don't know!" To be fair, though, it takes more than a lifetime of learning to "know what you don't know."
------------------------------
Samuel Ng Ph.D., P.E., F.ASCE
RETIRED
Plymouth MN
Original Message:
Sent: 02-08-2021 02:03 PM
From: Mitchell Winkler
Subject: Cognitive Biases
This is a fascinating and, I think, fun topic, and I am hoping others will weigh in with their own experience, particularly as seen or experienced in engineering practice. I first became aware of this topic in the early 1990s, when I was first exposed to the concept of decision quality (a future topic) and, in particular, the disabling role of anchoring.
As brief background, the notion of cognitive biases was first identified by Amos Tversky and Daniel Kahneman in work published in the early 1970s. The role cognitive biases play in everyday life has become far-ranging, from behavioral economics to baseball's sabermetrics. They are also making their way into engineering, e.g., February's free paper: Value of Information on Resilience Decision-Making in Repeated Disaster Environments.
Examples of cognitive biases - also referred to as heuristics - include:
- Anchoring - Why do we tend to rely heavily on the first piece of information we receive?
- Availability - Why do we tend to think that things that happened recently are more likely to happen again?
- IKEA effect - Why do we place disproportionately high value on things we helped to create?
- Representativeness - Why do we use similarity to gauge statistical probability?
Finally, if you read and liked Michael Lewis's book Moneyball, I highly recommend his follow-up book, The Undoing Project, which provides the why behind the former. There's also a nice article from The New Yorker, "The Two Friends Who Changed How We Think About How We Think," that serves as a great introduction to the overall subject of cognitive biases.
------------------------------
Mitch Winkler P.E., M.ASCE
Houston, TX
------------------------------