How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life
G**D
Not the easiest read but definitely worth it
Helps you consider your own opinion-forming habits and improve them a bit, and DEFINITELY helps explain some of the utter rubbish that people take to be true these days!
C**N
Author falls prey to the same bias he cleverly denounces
Good book. I recommend it. It is interesting to notice that the author himself consistently falls prey to the fallibility he so cleverly brings to our attention. He interprets many things according to his needs. Much could be said about the way he criticizes those who oppose the "germ theory", holistic medicine, etc. He tends to imply that "holistic" is the same as "quackery". Of course, quackery is quackery, no matter what a quack calls his "medicine", be it "holistic", be it "orthodox", be it whatever you choose. At least what I know about "holistic" medicine has nothing to do with quackery and nothing to do with Gilovich's description. The same thing can be said about his condemnation of those who oppose the "germ theory" that has been dominant since Pasteur. As far as I know, real doctors who oppose it do not say germs are not an important factor. What they say is that the germ is only ONE factor, and very often not the most important one. That explains why you can expose hundreds of people to the same germ and most of the time only a few will get sick.

On page 6 he refers to Rhea Sullins. Her father may have killed her, as the author implies. Nevertheless, this is far from certain based on what he says. Would she have been cured by conventional medicine? Did her father use a proper natural medicine? Statistically, is her case important? He mentions a single occurrence of a victim and a perpetrator and expects us to believe this is enough to prove "Natural Hygiene" is bad or that hygienists are all dumb and irresponsible.

When he gets to homeopathy he applies every single technique he denounces in others. He implies homeopathy is quackery and has no scientific soundness. Far from it. Homeopathy does not pretend to understand why it works. Nobody knows. But it follows the same scientific methodology other sciences follow, like the much-beloved "double blind", for instance.

Also, homeopathy practitioners tend to make much better prognoses than orthodox medicine, such as the course of a remission, with very objective measures: changes in body temperature, weight modification, skin alterations, etc. Also, homeopathy is used in animals with great success, and in emergency rooms too. Also -- in Brazil at least -- in order to practice homeopathy a person must first go to a regular medical school for about 6 years. After that, 3 more years are needed to become a homeopathist. It is hard to believe a homeopathist is less educated and trained than a "regular" doctor. The reverse seems to be true.

By the way, I have no affiliation whatsoever with homeopathy. I am among those (also more or less ridiculed by Gilovich) who believe our health should be in our own hands, doctors being only helpers. So I try to understand what we should expect from our mind and body (not two separate entities), from our food, from the doctors, and from the drugs (in this order of precedence).

The final chapters are a little boring. It seems the author wanted to put as many things in as few pages as possible in order to support his views. It is quite miscellaneous and clearly shows the author has an axe to grind.

Again, this is a good book that deserves to be read. The fact that the author is himself a victim of the failures he sees so clearly in our reasoning does not belittle his work.
S**N
A great guide for those who want to know about the inner workings behind what you believe
In How We Know What Isn’t So, author Thomas Gilovich sets out to answer the question of why we fall victim to non-scientific hyperbole and beliefs. Drawing on scientific research in the field of psychology, as well as a plethora of topical examples, he shows readers the potential causes of the errors in their thinking. Gilovich presents logical arguments, backed up by clinical research, about the consequences of clichéd beliefs and how over-generalizing challenging subjects leads not only to logical fallacies but to an incorrect understanding of the general human condition.

Part One of the book looks at the reasoning behind why we are susceptible to ideas and conclusions that are not supported by fact. An example is given about pattern recognition in the realm of sports that deftly demonstrates how the human brain is programmed to seek out patterns, sometimes even where there are none to be found. This phenomenon erroneously convinces the subject and leads one to believe false information. The example revolves around a basketball player’s belief that scoring comes in streaks, and that one can develop a “hot hand”. Research, however, shows that statistically a prior make or miss has no bearing on the success of a future shot attempt. This belief leads players to think that if they have made one or two shots in a row, they will continue to make them at a greater percentage, which changes the way they play, such as not passing to open teammates. The idea is further undermined by the concept of regression, which shows that an extreme performance tends to be followed by one closer to the average.

Part Two of the book delves into the motivations behind our beliefs, such as how social standards, biases, and overstated conversations can convince us of false realities. Again, the author uses several practical examples: one going back to the sports world, describing the biasing effect on referees who unfairly penalize certain jersey colors, as well as the story of Little Albert, a young boy who was subjected to conditioning experiments with animals and sounds.

The final section takes an unexpected turn and goes after a handful of unconventional beliefs such as alternative medicine and extrasensory perception (ESP). Gilovich reveals himself to be quite the skeptic as he skillfully pokes holes in the non-scientific nature of these practices.

Using his extensive background in social and behavioral psychology, Gilovich has created an insightful book that is essentially a "how-to" guide to avoiding irrational thinking. By giving the reader a set of tools to think critically about data, long-held beliefs, and newer fringe philosophies, Gilovich empowers his audience to challenge the status quo by analyzing and evaluating the information that goes into making decisions or choosing what to believe as fact. The biggest criticisms of the book are that some topics are discussed longer than necessary and that several of the references are outdated. That being said, for a book that is 25+ years old, the content is written in a way that keeps the reader engaged and explains the core concepts in a way that the layperson can sufficiently understand.
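The "hot hand" finding this review describes can be illustrated with a quick simulation (an illustrative sketch added here, not code from the book): for a shooter whose makes are independent with a fixed probability, the make rate immediately after a made shot comes out essentially the same as the make rate immediately after a miss.

```python
import random

def conditional_make_rates(p_make=0.5, n_shots=20_000, seed=7):
    """Simulate independent shots and compare the make rate
    immediately after a make vs. immediately after a miss."""
    rng = random.Random(seed)
    shots = [rng.random() < p_make for _ in range(n_shots)]
    # Pair each shot with its successor and split by the outcome of the first.
    after_make = [cur for prev, cur in zip(shots, shots[1:]) if prev]
    after_miss = [cur for prev, cur in zip(shots, shots[1:]) if not prev]
    return sum(after_make) / len(after_make), sum(after_miss) / len(after_miss)

# Both rates land near p_make: a prior make or miss carries no information.
print(conditional_make_rates())
```

Despite the independence, long runs of makes still appear in the simulated data, which is exactly why streaks feel like evidence of a hot hand when they are not.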
J**N
An antidote to superstition and conspiracy theories?
Although it's a bit technical, the basic ideas are really important - the author also has some great YouTube videos outlining these ideas - and I wish they could be taught in schools and higher education, along with other critical thinking skills. Given that we seem to be in a world flooded with information and misinformation, and that most of us are hard pressed to know what to believe, an explanation of how the human brain works, how it interprets or misunderstands information and why, seems very timely. Indeed, vital.
F**D
The flawed tool of reason
How do we end up knowing things that just aren't so?

The brain is hard-wired to detect order in the nature of things. We can learn from experience by accumulating observations, and this has obvious survival advantages in evolutionary terms.

But where do things start to go wrong? First of all, we see ordered patterns of outcomes that are in fact the blind product of chance. Chance produces less alternation than our intuition leads us to expect. If we toss a coin 20 times, we're unlikely to see exactly 10 heads and 10 tails. A series of 20 tosses has a 50-50 chance of producing 4 heads in a row. When we see patterns such as a lucky streak in basketball, we think we are spotting an order that isn't in fact there.

The regression effect also fools us into misattributing a cause to an effect. You perform exceptionally badly, or excel, when taking an exam, much worse or better than your average. Your next result is likely to be better or worse as you move back towards your average. That's the regression effect. But we assume that the exceptional and atypical is representative when the regression effect would tell us otherwise: investors may assume that a company's bumper profits in one year will be repeated in future years when in all likelihood they will actually fall.

We underdetermine beliefs with insufficient data, treating weakly tested hypotheses as facts. We look for confirmatory examples while overlooking or discounting facts that contradict a belief. We fail, in other words, to understand the distinction between necessary and sufficient evidence. We seize on isolated, salient pieces of data that prematurely confirm a hypothesis. Take the homoeopathist's claim that a cancer patient was miraculously cured after taking an alternative remedy. The recovery is treated as conclusive evidence of the remedy's efficacy. But such evidence is in itself insufficient to prove anything - isolated facts do not in themselves provide sufficient confirmation. They are too vulnerable to the discovery of counter-examples that contradict the hypothesis.

We leap to such conclusions because when we test a hypothesis, we fail to define what success or failure is. Too often beliefs are formed with vague definitions of what counts as a successful confirmation. Studies of identical twins separated at birth may well track an identity of life outcomes that points strongly to genetic influences. But there are many outcomes in any given life. Some of these may overlap and give the impression of congruence. So the twins may both choose the same occupation, and this is indeed a striking identity of outcome, but it is only one such outcome, and others may vary. The danger once again is treating two sets of data as similar because of an overlap of outcomes while overlooking the variances. Likewise, many predictions are couched so vaguely as to guarantee against disconfirmation, akin to Woody Allen's spoof Nostradamus character who portentously avers that `two nations will go to war but only one will win'.

Does our social nature compensate for this? Not necessarily. We tend to associate with like-minded people and to fight shy of conflict and controversy. So members of presidential advisory groups keep their own counsel. We keep our mouths shut during a meeting at work. We do not want to be seen to rock the boat. The result is that people believe their beliefs are more broadly shared than they actually are (one reason why the bore and the name-dropper carry on with a self-defeating strategy is precisely the reluctance of others to point it out).

Good heavens, having said all this, how on earth can we tell if our beliefs are well founded? There is no easy way out of these cognitive illusions. But it's not all bad. We have good reasons, for example, to accept the theory of gravity, which has weight (so to speak) and is well attested by centuries of sense and statistical data. So we can rightly disregard claims of levitation on this basis.

We can also tighten up our definitions of what counts as confirmation, as noted earlier. If we were testing whether a training course that claims it can raise sales staff performance really works, then we would define successful confirmation as increased sales figures. The scientific process of peer review also helps: we can make sure that a researcher does not know which members of a trial group are receiving the new drug being tested, so that preconceptions of success or failure do not contaminate the researcher's observations. We can test whether a claim for an extraordinary effect like Extra Sensory Perception can be replicated (it can't).

These are palliatives, however. We can only strive, imperfectly, to recognise when our reasoning faculties are leading us up blind alleys. This book will at least help you be a little more vigilant when it comes to forming conclusions about why you think you are right to believe the way you do.
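The reviewer's coin-toss figure (roughly even odds of 4 heads in a row somewhere in 20 tosses) checks out; a small Monte Carlo sketch (my illustration, not code from the book) estimates the probability:

```python
import random

def has_run(tosses, length=4, value="H"):
    """True if the sequence contains `length` consecutive occurrences of `value`."""
    run = 0
    for t in tosses:
        run = run + 1 if t == value else 0
        if run >= length:
            return True
    return False

def estimate_run_probability(n_tosses=20, run_length=4, trials=100_000, seed=42):
    """Monte Carlo estimate of P(at least one run of `run_length` heads
    in `n_tosses` fair coin tosses)."""
    rng = random.Random(seed)
    hits = sum(
        has_run([rng.choice("HT") for _ in range(n_tosses)], run_length)
        for _ in range(trials)
    )
    return hits / trials

# The estimate lands near 0.48, i.e. close to a 50-50 chance.
print(estimate_run_probability())
```

The exact value, computable from a tetranacci-style recurrence over sequences with no run of four heads, is about 0.478, so "50-50" is a fair summary, and it is far higher than intuition suggests.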
G**R
Rewarding read, but not always easy
A clear presentation of the cognitive biases behind human mis-reasoning. But no doubt many humans have a cognitive bias to reject the concept of cognitive bias, so those most in need of becoming more aware are also the most likely not to take its message on board, despite the evidence and research given in support. Might help, though.
J**S
Should be required reading for all
This should be required reading for all the mystics and soothsayers. I lost my copy of this book from university days and decided to replace it.
O**E
Five Stars
Very good book