August 25, 2006
This is the tendency to believe that you control, or at least partially influence, things that you do not. For example, that by leaning to one side after you have bowled a definite gutter ball you might influence it to move back towards the center. Okay, not really… but I do do that sometimes.
This is the fuzzy land of superstitions that you don’t really believe rationally but which you still engage in emotionally. If you had $100 bet on the outcome of a coin toss, what would you do? Loosen up your shoulders? Hop up and down? Toss the coin up high, or keep it low? People who win a lot of coin tosses in a row might begin to feel like they’re better guessers. They might get angry with you if you try to make them lose their concentration.
One question I have about this bias is whether or not it is a harmful one. What are the consequences of feeling like you’re actually playing the video game even though you haven’t put any coins in it and it’s on demo mode? What side effects are there to screaming at the television, telling the character not to go into the basement where they will surely get killed? Does it give you a false sense of confidence that eventually leads you to make choices that end in failure? It seems like if that were the case, the illusion of control would correct itself over time… leading to a sense of control that was fairly accurate. In other words, if the illusion of control were harmful, it would eventually teach you that your attempts at control were doing harm, and make you try to control things less in order not to harm them. However, if the illusion of control has neutral or positive effects, that would explain why it stuck around… it would reinforce itself.
If you think about it with squinty eyes, you can even see how optimism and the “go get ’em” attitude of many very successful people involves an element of this illusion of control. It’s in that first split second of coming across a chance event or occurrence that might end up in your favor or against you… do you shy away from it, look at it indifferently, or take it on as something you can influence? And which reaction leads to the best outcomes?
Walking through this list of biases has been very interesting for me due to this strange confusion about their role and impact on the practical matters of how our brain works. They exist because they often lead to a net gain somehow… they are shortcuts and assumptions that we can’t help but make if we are to have any hope of processing as much information as we do on a daily basis. And yet at some point every shortcut will reveal flaws and occasionally steer us wrong. At which point should we meet the bias and say, “Thank you for your shortcut, but I will take it from here”? This is the relationship we must become aware of, between the conscious and the subconscious. Between the fast, cheap, and general, and the slow, expensive, and specific.
August 16, 2006
This is the tendency for people to value more immediate payoffs higher than remote payoffs.
Sort of makes sense, right? I’d prefer to get $5 today than $5 tomorrow. But would I rather get $10 today or $11 next week? Or would I rather get $500 today or $1,000 a year from now? In all three cases I’d probably take what I could get now rather than wait for the bigger payoff in the future.
However, the other twist is that this bias diminishes when both payoffs are far from the present. For example, I’d rather take $1,000 5 years from now than $500 4 years from now. As long as I’m waiting 4 years, I might as well wait the 5th for twice as much. But if you compare it to the example in the paragraph above, you’ll see how this logic seems to flip-flop a bit. In both cases waiting a year could gain me $500, but waiting a year right now seems harder than waiting a year 4 years from now. Hence the hyperbolic nature of this bias… it slowly twists over time. It’s irrational because you treat the same problem differently depending on an arbitrary variable (its nearness in time to you).
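The flip-flop above can be sketched with the standard hyperbolic discounting formula, V = A / (1 + kD), where A is the payoff, D is the delay, and k is an “impatience” constant. This is just an illustration, not something from the original post; the function name and the k = 1.5 value are my own arbitrary choices:

```python
def perceived_value(amount, delay_years, k=1.5):
    """Hyperbolically discounted value: V = A / (1 + k * D).

    `k` is a per-person impatience constant; 1.5 is an arbitrary
    illustrative choice, not an empirical figure.
    """
    return amount / (1 + k * delay_years)

# $500 now vs $1,000 in a year: the near payoff feels bigger.
now = perceived_value(500, 0)       # 500.0
next_year = perceived_value(1000, 1)  # 400.0

# Push both options 4 years out and the preference reverses,
# because the hyperbolic curve flattens with distance:
# $500 in 4 years vs $1,000 in 5 years.
in_4 = perceived_value(500, 4)      # ~71.4
in_5 = perceived_value(1000, 5)     # ~117.6
```

With exponential discounting (V = A · d^D), by contrast, the comparison comes out the same no matter how far both payoffs are shifted into the future; the preference reversal is exactly what makes the hyperbolic curve time-inconsistent, i.e. “irrational” in the sense described above.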
Marketers can take advantage of this bias by offering you something small now in exchange for something bigger later. Credit cards, banks, and loan companies seem to thrive on this bias alone.
July 19, 2006
This is the tendency to compare two things based on one dimension rather than taking all dimensions into consideration. For example, if you currently dislike your job because it has a terrible commute and someone offers you a job that’s within walking distance of your house, you may be susceptible to thinking the second job is therefore better.
At first glance, it appears that the thing you hate about your current job doesn’t exist in the new one, and therefore you would be happier there. Of course, there are many other things that could contribute to the second job being better or worse than the first one. We tend to conduct comparisons along one axis and assume that all other things are equal. This is obviously a silly tactic.
If you tend to focus on the negative, the focusing effect can create a downward spiral of negative thinking. By focusing on a negative aspect of your current situation, the many things you don’t have will always look better… hence the saying “the grass is always greener on the other side of the fence.” You begin to see yourself as unluckier or lesser than others.
If you tend to focus on the positive, the focusing effect can create an attitude of complacency and stability. Everything will seem worse (or potentially worse) than your current situation, so you will never change jobs, never move to a new city, never meet new people, simply because you imagine the rest of the world’s experiences to be slightly or drastically worse than your own.
Which of these two sides of the focusing effect bias coin do you land on? I think I’m probably more on the negative side, thinking things could always be better… but at the same time I’m optimistic that I could reach those better things if I simply focused more.
July 13, 2006
This is the tendency for people to value something more as soon as they own it. For example, if you own a certain t-shirt, you will place a higher price on it than on a t-shirt that you don’t own. Most economists believe that you will be willing to sell something for the same price you paid to receive it, but if this bias is real (and there is some debate), then the simple act of owning an object will inflate the price. Perhaps because the cost of choosing is valuable in itself. Perhaps because of the effort involved in finding it in the first place, and having to find it again if you wanted it back. Perhaps because of a sense of scarcity (what if you can’t repurchase the item you just sold for the same price?). Whatever the cause, it seems like a bias that encourages inflation… and therefore rewards buying early and holding on to what you have.
June 21, 2006
The tendency for people to scrutinize evidence that contradicts their previous beliefs and to uncritically accept evidence that supports it. Useful because it helps us catch (aka pay attention to) information that might result in altering our behavior and beliefs. Harmful because existing beliefs continue to attract unscrutinized "evidence" at a much quicker rate than information that contradicts our beliefs. Weak beliefs become stronger over time simply by the fact that they encourage us to validate them more than they encourage us to invalidate them.
As most people are probably noticing, most of these biases are simply shortcuts that we take in order to make quick decisions. They are the various filters we put on incoming information to know when to pay close attention to something and when not to. As such, there's really no easy cure for a cognitive bias. To lower these filters is not only very difficult, but simply not practical… we would revert to the overstimulated confusion of childhood… paying attention to meaningless details while letting important new information slip accidentally by.
What would be useful, however, is if we were able to know at any given point which of our cognitive biases were used in the most recent deluge of information. The best way to do this that I can think of is to simply know the names for all of them… giving something a name makes it easier to spot.
To review the cognitive biases we've covered so far:
June 16, 2006
This is the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object. [Wikipedia]
I think this is one of the more powerful and disturbing cognitive biases that we have to deal with. Our brains are designed to notice change in value rather than value itself, and there are endless ways to trick us into thinking something is good simply because it is not as bad as it used to be, or that something is bad simply because it is not as good as it used to be.
Another more dramatic name for this bias could be Value Blindness. We simply can’t judge a thing’s value in itself without having a context in which the value is determined. Context serves as a bag of comparison possibilities. Think about it for a second. Do you realize that people purposefully limit their scope of understanding about the size and magnificence of the universe in order to keep their own significance in the universe in check? We purposefully underestimate the amount of happiness and intelligence in the world in order to support our own sense of happiness and intelligence. This isn’t done with malicious intent; we simply have trouble believing in our own significance while also understanding our literal insignificance relative to the entire country, world, solar system, galaxy, and universe. In some cases, feeling insignificant, feeling like a tiny speck in a grand universe, feeling like the tiniest almost invisible speck in all of space-time, can lead to depression and maybe even suicide. Words like insignificant, hopeless, impossible, and futile are the result of a contrast effect with too large a perspective of the world or too small a perspective of yourself.
This is dangerous because people with a motivation to make things appear better or worse than they really are have a very simple tool for doing so. People can make money off of you with this bias. The most obvious examples are in marketing (sales and aspirational items) and in gauging your status and wealth by comparing yourself to your neighbors, but as I mentioned already, it can cut very deeply into our own understanding of self and meaning.
What tools are there to help correct for the contrast effect? There is the brute force willpower way of correcting for it by making note of what you're comparing everything with. But is it possible to stop noticing contrast in value? I don't think it is. We're doomed! Haha.
One other stopgap duct tape option is to make sure you regularly calibrate your comparison systems. Experience a wide range of lifestyles, philosophies, and attitudes towards your world. Travel frequently and meet new people and try new things and keep as much variety in your life as possible. This way, you can be sure to have a broad palette of comparisons available, and you can influence your comparisons to go in a direction of your choice.
June 5, 2006
This is our uncanny tendency to search for, or interpret, information that agrees with our preconceptions. For example, if you suspect that a certain person is a certain way (whether it be evil, lazy, or perfect), you will tend to notice and interpret that person's behaviors in a way that supports your belief, and consider evidence to the contrary as the result of errors of your own perception and judgement, or anomalies.
This is why, in the scientific method, it is useful to design an experiment to attempt to disprove your theory rather than to prove it. For a claim to count as scientific, it has to be falsifiable. In other words, there has to be a way to prove it false, and those ways have to be tested, before it can be considered true.
The best way to help correct for this bias is to be open to contradictory evidence and to test our hypotheses by attempting to prove them wrong.
Read more: Confirmation Bias [Skeptics Dictionary]