J**E
When is an intuition a simplifying heuristic, and when is it an expert solution? That is the cue of recognition, nigh on a formula, Grasshopper!
Stemming from the author's Nobel Prize-winning research on the simplifying shortcuts of intuitive thinking (systematic errors) and on decision-making under uncertainty, both published in Science, the book is a series of thought experiments that sets out to counter the prevailing rational-agent model of the world (Bernoulli's utility errors): the assumption that humans have consistent preferences and know how to maximise them.

Instead, Prospect Theory shows that important choices are prone to the relativity of shifting reference points (context) and to formulations of inconsequential features within a situation, such that human preferences struggle to become reality-bound. In particular, our decisions are susceptible to heuristic (short-cutting) or cognitive-illusory biases, an inconsistency built into the design of our minds; for example, the 'duration neglect' of time (the less-is-more effect) when a story is recounted by the Remembering Self, as opposed to sequentially by the Experiencing Self. Prospect Theory is based on the well-known dominance of threat/escape (negativity) over opportunity/approach (positivity), a natural, hard-wired tendency towards risk aversion that Kahneman's grandmother would have acknowledged. Today this bias is explored by behavioural economics (psychophysics) and the science of neuroeconomics, which tries to understand what a person's brain does while they make safe or risky decisions.

It would appear that there are two species of Homo sapiens: "Econs", who can compare broad principles and processes across subjects, like spread betters (broad framing) in trades of exchange; and "Humans", who are swayed optimistically or pessimistically in matters of conviction and fairness by attachments to material usage (narrow framing) and a whole host of cognitive illusions, to name but a very few: the endowment effect, the sunk-cost fallacy and entitlement.
Kahneman argues that these two different ways of relating to the world are heavily predicated on a fundamental split in the brain's wet-ware architecture, delineated by two complementary but opposing perspectives.

System 1 is described as the Inside View: "fast", HARE-like intuitive thought processes that jump to best-case scenarios and plausible conclusions based on recent events and current context (priming), using automatic perceptual memory reactions or simple heuristic intuitions and substitutions. These are usually affect-based associations or prototypical intensity matches (comparisons across different categories, e.g. apples or steak?). System 1 is susceptible to emotional framing: it prefers the sure choice over the gamble (risk averse) when the outcomes are good, but tends to accept the gamble (risk seeking) when all outcomes are negative. System 1 is 'frame-bound' to descriptions of reality rather than reality itself, and can reverse preferences based on how information is presented, i.e. it is open to persuasion. Therefore, instead of truly expert intuitions, System 1 thrives on correlations of coherence (elegance), certainty (conviction) and causality (fact) rather than evidential truth. System 1 has a tendency to believe, to confirm (the well-known bias), and to infer or induce the general from the particular (the causal stereotype). System 1 does not compute base rates of probability, the influence of random luck, mere statistics as correlation (decorrelation error) or regression to the mean (causality error). System 1's weakness is the brain's propensity to succumb to overconfidence and hindsight given the resemblance, coherence and plausibility of the flimsy evidence of the moment, acronymically termed WYSIATI (What You See Is All There Is), at the expense of System 2 probability. To succumb is human, as is humbly shown throughout the book, which spares no profession, institution, MENSA level or social standing.
Maybe Gurdjieff was right when he noticed that the majority of humans are sheep-like.

System 2, on the other hand, is the Outside View, which attempts to factor in Rumsfeld's "unknown unknowns" by using realistic baselines of reference classes. It makes choices that are 'reality-bound' regardless of the presentation of facts or emotional framing, and can be regarded as "slow", RAT-like controlled focus and energy-sapping intention: the kind used in effortful integral, statistical and complex reasoning with distributional information based on probability, uncertainty and doubt.

However, System 2 is also prone to error, especially in the service of System 1. Even though it has the capability, with application, not to confuse mere correlation with causation, and to deduce the particular from the general, it can be blocked when otherwise engaged, indolent or full of pride! As Kahneman puts it, "...the ease with which we stop thinking is rather troubling", and what may appear compelling is not always right, especially when the ego, the executive regulator of will power and concentration, is depleted of energy, or conversely when it is in a good mood of cognitive ease (not stress) deriving from situations of 'mere exposure' (repetition and familiarity).
Experiments have repeatedly shown that cognitive aptitude and self-control are directly correlated, and that the biases of intuition are in constant need of regulation, which can be hard work: for example, uncovering one's outcome bias (part hindsight bias, part halo effect), based on the cognitive ease with which one lays claim to causal 'narrative fallacies' (Taleb) rather than "adjusting" to statistical random events born of luck!

So... do not expect a fun and "simples" read if you want clarity on how impulses become voluntary actions, and how impressions, feelings and inclinations so readily become beliefs, attitudes and intentions (when endorsed by System 2).

The solution? Kahneman makes the special plea that our higher-minded intuitive statistician, System 2, take over the art of decision-making and wise judgement in "accurate choice diagnosis", to minimise the "errors in the design of the machinery of cognition." We should learn to recognise situations in which significant mistakes are likely, making the time and putting in the analytical effort to avoid them, especially when the stakes are high: usually when a situation is unfamiliar and there is no time to collect more information. 'Thinking Fast and Slow' practically equips the reader with sufficient understanding to approach reasoning situations with a certain amount of logic, in order to balance and counter our intuitive illusions. For example, recognising the Texas sharpshooter fallacy (decorrelation error) or deconstructing a representative heuristic (stereotype) in one's day-to-day affairs should be regarded as a reasonable approach to life by any yardstick, scientific or not.
In another example, the System 2 objectivity of a risk policy is one remedy against the System 1 biases inherent in the illusions of optimists who think they are prudent, and of pessimists who become overly cautious and miss out on positive opportunities, however marginal a proposition may appear at first.

One chapter, "Taming Intuitive Predictions", is particularly inspiring when it comes to correcting faulty thinking. It explores a reasonable procedure for countering systematic bias in significant decision-making situations where there is only modest validity (the validity illusion), especially between subjects. For example, when one has to decide between two candidates, be they job interviewees or start-up companies, as so often happens the evidence is weak but the emotional impression left by System 1 is strong. Kahneman recommends that when WYSIATI applies, we should be very wary of System 1's neglect of base rates and insensitivity to the quality of information. The law of small numbers states that an extreme outcome is more likely in a small sample: the candidate who performs well at first, on the least evidence, has a tendency not to keep this up over the longer term (once employed), due to the vagaries of talent and luck, i.e. there is a regression towards the mean. The candidate with the greater referential proof but less persuasive power on the day is the surer bet in the long term. However, how often in life does such a scenario present itself, with the short-term effect chosen over the long-term bet? Possibly a cheeky, pertinent example here is the choice of Moyes over Mourinho as the recently installed Man Utd manager! A good choice or a bad choice?

Many examples are shown of statistical algorithms (the Meehl pattern) in low-validity environments showing up failed real-world assumptions, revealing in the process the illusion of skill and hunches in making long-term predictions.
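The regression-towards-the-mean point can be checked with a minimal simulation sketch (the setup is illustrative, not from the book: ten candidates, with interview scores modelled as true skill plus equal-variance luck):

```python
import random
import statistics

random.seed(42)

def simulate(n_candidates=10, noise=1.0, trials=20000):
    """Hire the top interview scorer, then compare the interview score
    that got them hired with their later, re-measured performance."""
    gaps = []
    for _ in range(trials):
        skills = [random.gauss(0, 1) for _ in range(n_candidates)]
        # Interview score = true skill + luck on the day.
        interview = [s + random.gauss(0, noise) for s in skills]
        best = max(range(n_candidates), key=lambda i: interview[i])
        # Later performance: same skill, fresh luck.
        later = skills[best] + random.gauss(0, noise)
        gaps.append(interview[best] - later)
    return statistics.mean(gaps)

# The average gap is positive: the winner's later performance falls
# short of the extreme score that won them the job.
print(simulate())
```

Selecting on an extreme observation selects partly on luck, and the luck does not repeat; that is all "regression towards the mean" requires, no causal story needed.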
Many of these are based on the clinical predictions of trained professionals, some of which serve important selection criteria in interviewing practices of great significance. Flawed stories from the past that shape our views of the present world and our expectations of the future are very seductive, especially when combined with the halo effect and global evaluations made in place of specific ratings.

For example, belief in the latest blockbusting management tool adopted by a new CEO has been statistically shown to yield at best a 10% improvement over random guesswork. In another example, a leadership group challenge used to select Israeli army leaders from cadets, in order to reveal their "true natures", produced forecasts that were inaccurate after observing only one hour of their behaviour in an artificial situation; this was put down to the illusion of validity via the representativeness heuristic and non-regressive weak evidence. Slightly more worryingly, the same can be said for the illusory skills of selling and buying stock persistently over time. It has been shown that a narrative plays out within the minds of traders: they think they are making sensible, educated guesses, when the exposed truth is that their success in long-term predictability rests on luck, a fact deeply ingrained in the culture of the industry, with false credit being "taken" in bonuses!! Kahneman pulls no punches about the masters of the universe, and I am inclined to believe in the pedigree of his analysis!!

According to Kahneman, so-called experts (and he is slightly derisive in his use of the term), in trying to justify their ability to assess masses of complexity as a host of mini-skills, can produce unreliable judgements, especially long-term forecasts (e.g. the planning fallacy), owing to the inconsistency of extreme context (low- or zero-validity environments without regular practice): a System 1 type error.
Any final decision should be left to an independent person armed with the assessment of a simple, equally weighted formula, which is shown to be more accurate than an interviewer who also makes the final decision and is susceptible to personal impression and "taste" (see wine vintage predictions). The best an expert can do is anticipate the near future using cues of recognition, and then know the limits of their validity, rather than make random hits based on subjectively compelling intuitions that are false. "Practice makes perfect", as the well-known saying goes, yet the heuristics of judgement (coherence, cognitive ease and overconfidence) are invoked in low-validity environments by those who do not know what they are doing (the illusion of validity).

Looking at other similar books on sale, "You Are Not So Smart" by David McRaney, for example, is a more accessible introduction to the same subject, but it clearly rests on the giant shoulders of Kahneman, who with his erstwhile colleagues would appear to have informed the subject area in every conceivable direction. It is hard to do justice to such a brilliant book without a rather longish review. This is certainly one of the top ten books I have ever read for the benefits of rational perseverance and real-world knowledgeable insight, and it seems to be part of a trend, or rash, of human-friendly Econ (System 2) research emanating from the USA at the moment. For example, recent Nobel Prize-winning economics research (2013) by R. Shiller demonstrates that there are predictable regularities in asset markets over longer time periods, while E. Fama makes the observation that there is no predictability in the short run.

In summary, "following our intuitions is more natural, and 'somehow' more pleasant than acting against them", and we usually end up with the products of our extreme predictions, i.e.
overly optimistic or pessimistic, since they are statistically non-regressive, taking no account of a base rate (probability) or of regression towards the mean (self-correcting fluctuations in scores over time). The slow, steady pace of the TORTOISE might be considered the right pace at which to take our judgements, but we are prone not to give them the necessary time and perspective in a busy and obtuse world. The division of labour, and the conflict, between the two Systems of the mind can lead either to cognitive illusion (i.e. prejudice/bias) or, if we are lucky, to wise judgement in a synthesis of intuition and cognition (called TORTOISE thinking by Dobransky in his book Quitting the Rat Race).

Close your eyes and imagine the future ONLY after a disciplined collection of objective information, unless of course you happen to have expert recognition, which is referred to in Gladwell's book on the subject, Blink; but then your eyes are still open and liable to be deceived. Kahneman's way seems so much wiser, but harder nonetheless. The art and science of decision-making just got so much more interesting in the coming world of artificial intelligence!
O**N
Present Company Included
This is a monster book packed with fascinating insights about how our cognitive systems process and render information. Its starting premise is that we have two discrete "systems" for mental processing. Daniel Kahneman, a cognitive psychologist who transformed himself into a Nobel Prize-winning behavioural economist, gives these the Dr. Seussian labels "System 1" and "System 2".

System 1 is fast. It makes snap judgments on limited information: it manifests itself in the "fight or flight" reflex. System 2 is more deliberative: courtesy of this, one meditates on eternal verities, solves quadratic equations and engages in subtle moral argument. Though this is interesting enough, their interaction is more fascinating still. System 1 is lightweight, efficient and self-initiates without invitation; bringing System 2 to bear on a conundrum requires effort and concentration.

This talk of snap judgments calls to mind Malcolm Gladwell's popular but disappointing "Blink: The Power of Thinking Without Thinking". Kahneman's account, rooted in decades of controlled experiment, is a far more rigorous explanation of what is going on, and is able to explain why some snap judgments are good and others are bad. This conundrum, unanswered in Gladwell's book, is Daniel Kahneman's main focus of enquiry.

It also invokes another popular science classic: Julian Jaynes' idea of the "Bicameral Mind", according to which large aspects of our daily existence that we consider conscious really are not. Driving by rote to the office, playing a musical instrument: these too, I imagine Kahneman would say, are mental processes undertaken by System 1. Jaynes was widely viewed as a bit of an eccentric; Kahneman's work suggests he may have been right on the money.

It gets interesting for Kahneman where the division of labour between the systems isn't clear cut.
System 1 can and does make quick evaluations even where System 2's systematic analysis would provide a better result (these are broadly the "bad" snap judgments of Gladwell's Blink). But System 2 requires dedicated mental resource (in Kahneman's ugly expression, it is "effortful"), and our lazy tendency is to substitute (or, at any rate, stick with) those "cheaper" preliminary judgments where it is not obviously erroneous to do so (and by and large, it won't be, as System 1 will have done its work). Kahneman's shorthand for this effect is WYSIATI: What You See Is All There Is.

Kahneman invites the reader to try plenty of experiments aimed at illustrating the reader's own fecklessness, and these hit their mark: it is distressing to repeatedly discover you have made a howling error of judgment, especially when you knew you were being tested for it. This has massive implications for those who claim group psychology can be predicted on narrow logical grounds. The latter half of Thinking Fast and Slow focusses more on our constitutional inability to adapt rationally to probabilities, and soundly wallops the notion of Homo Economicus, the rational chooser each of us imagines ourselves to be. This is where Kahneman's Nobel Prize-winning Prospect Theory gets the full run of the paddock.

Kahneman draws many lessons (which, by his own theory, will doubtless go unheeded) for scientists, economists, politicians, traders and business managers: "theory-induced blindness"; how we become (irrationally) risk tolerant when all our options are bad and risk averse when all our options are good; and how we systematically underweight high-probability outcomes relative to actual certainty. For those with nerves of steel there's a real arbitrage to be exploited here.

This long book is a rich (if "effortful") store of information and perspective: it is not news that our fellow man tends not to be as rational as we like to think he is, but we are strongly inclined to exclude present company from such judgments.
Kahneman is compelling on why we are foolish to do so: this is a physiological "feature" of our constitution, and the "enlightened" are no more immune. This is a valuable and sobering perspective.

Olly Buxton