Natural Rationality | decision-making in the economy of nature

4/28/08

Dan Ariely on Understanding the Logic Behind Illogical Decisions

Found on the American Management Association website: a podcast on behavioral economics


Dan Ariely on Understanding the Logic Behind Illogical Decisions

An MIT professor discovers that people tend to behave irrationally in a predictable fashion.

April 18, 2008 / Podcast # 08-16

Dan Ariely

Irrational behavior is a part of human nature, but as MIT professor Dan Ariely has discovered in 20 years of researching behavioral economics, people tend to behave irrationally in a predictable fashion. Drawing on psychology and economics, behavioral economics can show us why cautious people make poor decisions about sex when aroused, why patients get greater relief from a more expensive drug than from its cheaper counterpart, and why honest people may steal office supplies or communal food, but not money. According to Ariely’s new book Predictably Irrational, our understanding of economics, now based on the assumption of a rational subject, should, in fact, be based on our systematic, unsurprising irrationality. Ariely argues that a greater understanding of previously ignored or misunderstood forces (emotions, relativity and social norms) that influence our economic behavior brings a variety of opportunities for reexamining individual motivation and consumer choice, as well as economic and educational policy.



10/10/07

Fairness and Schizophrenia in the Ultimatum Game

For the first time, a study looks at the behavior of schizophrenic patients in the Ultimatum Game. Previous studies of choice behavior in schizophrenia revealed that patients have difficulty with decisions under ambiguity and uncertainty (Lee et al., 2007), show a slight preference for immediate over long-term rewards (Heerey et al., 2007), exhibit "strategic stiffness" (sticking to a strategy in sequential decision-making without integrating the outcomes of past choices; Kim et al., 2007), and perform worse in the Iowa Gambling Task (Sevy et al., 2007).

A research team from Israel ran an Ultimatum Game experiment with schizophrenic subjects (plus two control groups, one depressed and one non-clinical). Subjects had to split 20 New Israeli Shekels (NIS, about 5 US$). Although the patients’ Responder behavior did not differ from that of the control groups, their Proposer behavior did: they tended to be less strategic.

With respect to offer level, offers fall into three categories: fair (10 NIS), unfair (less than 10 NIS), and hyper-fair (more than 10 NIS). Schizophrenic patients tended to make fewer ‘unfair’ offers and more ‘hyper-fair’ offers. Men were more generous than women.
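To make the three categories concrete, here is a minimal sketch in Python (the function name is mine; the thresholds simply restate the paper's definitions for the 20 NIS stake):

```python
def classify_offer(offer_nis: float, stake_nis: float = 20) -> str:
    """Classify an Ultimatum Game offer: an even split is 'fair',
    less is 'unfair', and more is 'hyper-fair'."""
    half = stake_nis / 2  # 10 NIS for the 20 NIS stake
    if offer_nis == half:
        return "fair"
    return "unfair" if offer_nis < half else "hyper-fair"

print(classify_offer(12))  # -> hyper-fair
```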

According to the authors,

for schizophrenic Proposers, the possibility of dividing the money evenly was as reasonable as for healthy Proposers, whereas the option of being hyper-fair appears to be as reasonable as being unfair, in contrast to the pattern for healthy Proposers.
Agay et al. also studied the distribution of Proposer types according to their pattern of sequential decisions (how their second offer compared to their first). They identified three types:
  1. "‘Strong-strategic’ Proposers are those who adjusted their 2nd offer according to the response to their 1st offer, that is, raised their 2nd offer after their 1st one was rejected, or lowered their 2nd offer after their 1st offer was accepted.
  2. ‘Weak-strategic’ Proposers are those who perseverated, that is, their 2nd offer was the same as their 1st offer.
  3. Finally, ‘non-strategic’ Proposers are those who unreasonably reduced their offer after a rejection, or raised their offer after an acceptance."
20% of the schizophrenic group were non-strategic, while none of the healthy subjects were.
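A minimal sketch of this three-way classification, assuming each Proposer makes two sequential offers (the function and argument names are mine):

```python
def proposer_type(first_offer: int, first_accepted: bool, second_offer: int) -> str:
    """Classify a Proposer by how the 2nd offer reacts to the outcome
    of the 1st, following Agay et al.'s three categories above."""
    if second_offer == first_offer:
        return "weak-strategic"  # perseverated: same offer twice
    raised = second_offer > first_offer
    if first_accepted:
        # after an acceptance, lowering the offer is the strategic move
        return "non-strategic" if raised else "strong-strategic"
    # after a rejection, raising the offer is the strategic move
    return "strong-strategic" if raised else "non-strategic"

print(proposer_type(8, False, 10))  # raised after a rejection -> strong-strategic
```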


[Figure: the highest proportion of non-strategic Proposers is in the schizophrenic group]
The authors do not offer much explanation for these results:

In the present framework, schizophrenic patients seemed to deal with the cognition-emotion conflict described in the fMRI study of Sanfey et al. (2003) [NOTE: the authors of the first neuroeconomics Ultimatum study] in a manner similar to that of healthy controls. However, it is important to note that the low proportion of rejections throughout the whole experiment makes this conclusion questionable.
Another study, however, shows that "siblings of patients with schizophrenia rejected unfair offers more often compared to control participants" (van ’t Wout et al., 2006, chap. 12), thus suggesting that Responder behavior might be, after all, different in people with a genetic liability to schizophrenia. Yet another unresolved issue!


Reference
  • Agay, N., Kron, S., Carmel, Z., Mendlovic, S., & Levkovitz, Y. (in press). Ultimatum bargaining behavior of people affected by schizophrenia. Psychiatry Research.
  • Hamann, J., Cohen, R., Leucht, S., Busch, R., & Kissling, W. (2007). Shared decision making and long-term outcome in schizophrenia treatment. The Journal of clinical psychiatry, 68(7), 992-7.
  • Heerey, E. A., Robinson, B. M., McMahon, R. P., & Gold, J. M. (2007). Delay discounting in schizophrenia. Cognitive neuropsychiatry, 12(3), 213-21.
  • Kim, H., Lee, D., Shin, Y., & Chey, J. (2007). Impaired strategic decision making in schizophrenia. Brain Research.
  • Lee, Y., Kim, Y., Seo, E., Park, O., Jeong, S., Kim, S. H., et al. (2007). Dissociation of emotional decision-making from cognitive decision-making in chronic schizophrenia. Psychiatry research, 152(2-3), 113-20.
  • van ’t Wout, M., Akdeniz, A., Kahn, R. S., & Aleman, A. (2006). Vulnerability for schizophrenia and goal-directed behavior: the Ultimatum Game in relatives of patients with schizophrenia. In The nature of emotional abnormalities in schizophrenia: Evidence from patients and high-risk individuals (Proefschrift, Universiteit Utrecht).
  • McKay, R., Langdon, R., & Coltheart, M. (2007). Jumping to delusions? Paranoia, probabilistic reasoning, and need for closure. Cognitive neuropsychiatry, 12(4), 362-76.
  • Sevy, S., Burdick, K. E., Visweswaraiah, H., Abdelmessih, S., Lukin, M., Yechiam, E., et al. (2007). Iowa Gambling Task in schizophrenia: A review and new data in patients with schizophrenia and co-occurring cannabis use disorders. Schizophrenia Research, 92(1-3), 74-84.



10/1/07

The Rationality of Soccer Goalkeepers




A study in the Journal of Economic Psychology analyzes soccer goalkeepers’ decision-making on penalty kicks: jump left, jump right, or stay in the center. Bar-Eli et al.’s study of

(...) 286 penalty kicks in top leagues and championships worldwide shows that given the probability distribution of kick direction, the optimal strategy for goalkeepers is to stay in the goal's center.
The probabilities of stopping a penalty kick were the following:

[Table: probabilities of stopping a penalty kick, by goalkeeper action and kick direction]

Why do goalkeepers jump left or right when the optimal Nash-equilibrium strategy is to stay in the goal’s center? Because jumping is the norm, and thus

(...) a goal scored yields worse feelings for the goalkeeper following inaction (staying in the center) than following action (jumping), leading to a bias for action.
This study illustrates the tension between internal (subjective) and external (objective) rationality discussed in my last post: statistically speaking, as a rule for winning games, jumping is (externally) suboptimal; but given the social norm and the associated emotional feelings, jumping is (internally) rational. Note also how modeling the game matters for normative issues: two other studies (Palacios-Huerta, 2003; Chiappori et al., 2002) concluded that goalkeepers play a rational strategy, but they assumed that shooter and goalkeeper had only two options, (kick/jump) left or right. Bar-Eli et al. added a third: (kick/stay in) the center.
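Since the paper's probability table is not reproduced above, here is a toy version of the expected-value argument; the kick distribution and stopping probabilities below are invented for illustration, not Bar-Eli et al.'s numbers:

```python
# Invented numbers, for illustration only.
kick_dist = {"left": 0.32, "center": 0.29, "right": 0.39}  # P(kick direction)
stop_prob = {  # P(stop | keeper action, kick direction)
    "left":   {"left": 0.30, "center": 0.00, "right": 0.00},
    "center": {"left": 0.10, "center": 0.60, "right": 0.10},
    "right":  {"left": 0.00, "center": 0.00, "right": 0.25},
}

for action, row in stop_prob.items():
    expected_stop = sum(kick_dist[d] * row[d] for d in kick_dist)
    print(f"{action:>6}: expected stop probability = {expected_stop:.3f}")
# With these assumed numbers, staying in the center maximizes the
# expected probability of stopping the kick -- the paper's conclusion.
```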


Reference
  • Bar-Eli, M., Azar, O. H., Ritov, I., Keidar-Levin, Y., & Schein, G. (2007). Action bias among elite soccer goalkeepers: The case of penalty kicks. Journal of Economic Psychology, 28(5), 606-621.
  • Chiappori, P. A., Levitt, S., & Groseclose, T. (2002). Testing mixed-strategy equilibria when players are heterogeneous: The case of penalty kicks in soccer. American Economic Review, 92(4), 1138-1151.
  • Palacios-Huerta, I. (2003). Professionals play minimax. Review of Economic Studies, 70(2), 395-415.




9/29/07

How is (internal) irrationality possible?

Much unhappiness (...) has less to do with not getting what we want, and more to do with not wanting what we like. (Gilbert & Wilson, 2000)

Yes, we should make choices by multiplying probabilities and utilities, but how can we possibly do this if we can’t estimate those utilities beforehand? (Gilbert, 2006)

When we preview the future and prefeel its consequences, we are soliciting advice from our ancestors. This method is ingenious but imperfect. (Gilbert, et al. 2007)


Although we easily and intuitively assess each other’s behavior, speech, or character as irrational, providing a non-trivial account of irrationality might be tricky (something we philosophers like to deal with!). Let’s distinguish internal and external assessments of rationality: an internal (or subjective) assessment of rationality is an evaluation of the coherence of intentions, actions and plans. An external (or objective) assessment of rationality is an evaluation of the effectiveness of a rule or procedure: it assesses the optimality of the rule for achieving a certain goal. An action can be rational from the first perspective but not from the second one, and vice versa. Hence subjects’ poor performance in probabilistic reasoning can be internally rational without being externally rational: the Gambler’s fallacy is and always will be a fallacy; it is possible, however, that fallacious reasoners follow coherent rules, maximizing an unorthodox utility function. Consequently, it is easy to understand how one can be externally irrational, but less easy to make sense of internal irrationality.
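To see why the Gambler’s fallacy fails the external test, consider a toy simulation (my own, assuming a fair coin and a bettor who always predicts the opposite of the last outcome):

```python
import random

random.seed(1)
flips = [random.choice("HT") for _ in range(100_000)]  # independent fair flips

hits = trials = 0
for prev, nxt in zip(flips, flips[1:]):
    guess = "T" if prev == "H" else "H"  # fallacy: the opposite is "due"
    hits += guess == nxt
    trials += 1

print(f"accuracy of 'due' predictions: {hits / trials:.3f}")  # ~0.500, chance level
```

The rule is internally coherent (the bettor follows it consistently), but it buys no predictive accuracy whatsoever.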

An interesting suggestion comes from hedonic psychology, mostly from Dan Gilbert’s research: irrationality is possible if agents fail to want the things they like. Gilbert’s research focuses on affective forecasting, i.e., forecasting one’s affect (emotional state) in the future (Gilbert, 2006; Wilson & Gilbert, 2003): anticipating the valence, intensity, duration and nature of specific emotions. Just as Tversky and Kahneman studied biases in probabilistic reasoning, Gilbert and his collaborators study biases in hedonic reasoning.

In many cases, for instance, people do not like or dislike an event as much as they thought they would. They want things that do not promote their welfare, and do not want things that would. This is what Gilbert calls “miswanting”. We miswant, explains Gilbert, because of affective forecasting biases.

Take for instance impact biases: subjects overestimate the length (durability bias) or intensity (intensity bias) of future emotional states (Gilbert et al., 1998):

“Research suggests that people routinely overestimate the emotional impact of negative events ranging from professional failures and romantic breakups to electoral losses, sports defeats, and medical setbacks”. (Gilbert et al., 2004).

They also overestimate the emotional impact of positive events such as winning a lottery (Brickman et al., 1978): newly rich lottery winners rated their happiness at this stage of their life at only 4.0 (on a 6-point scale, 0 to 5), which does not differ significantly from the rating of the control subjects. Also surprising to many people is the fact that paraplegics and quadriplegics rated their lives at 3.0, which is above the midpoint of the scale (2.5).

In another study, Boyd et al. (1990) solicited the utility of life with a colostomy from several different groups: patients who had rectal cancer and had been treated by radiation, patients who had rectal cancer and had been treated by a colostomy, physicians with experience treating patients with gastrointestinal malignancies, and two groups of healthy individuals. The patients with a colostomy and the physicians rated life with a colostomy significantly higher than did the other three groups.

Another bias is the empathy gap: humans fail to empathize with their future selves, i.e., to predict correctly what kind of emotional state they will be in. Sometimes, we fail to take into account how much our psychological “immune system” will ameliorate our reactions to negative events; people do not realize how they will rationalize negative outcomes once they occur (immune neglect). People also often mispredict regret (Gilbert et al., 2004b):
the top six biggest regrets in life center on (in descending order) education, career, romance, parenting, the self, and leisure. (…) people's biggest regrets are a reflection of where in life they see their largest opportunities; that is, where they see tangible prospects for change, growth, and renewal. (Roese & Summerville, 2005).
So a perfectly rational agent, at time t, would choose to do X at t+1 given what she expects her future valuation of X to be. As the studies above show, however, we are bad predictors of our own future subjective appreciation: the person we are at t+1 may not totally agree with the person we were at t. So, in one sense, this gives a non-trivial meaning to internal irrationality: since our affective forecasting competence is biased, we may not always choose what we like or like what we choose. Hedonic psychology might thus have identified an incoherence between intentions, actions and plans, an internal failure of our practical rationality.
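A toy sketch of miswanting in expected-utility terms (all names and numbers are invented; the multiplier inflates only the forecast utility of the emotionally salient option, as the impact bias suggests):

```python
options = {
    # option: (probability of the good outcome, experienced utility if it occurs)
    "salient": (0.5, 4.0),  # e.g., a flashy purchase
    "mundane": (0.9, 3.0),  # e.g., a reliable one
}
IMPACT_BIAS = 1.8  # assumed inflation of the salient option's forecast

def forecast_eu(name: str) -> float:
    p, u = options[name]
    return p * u * (IMPACT_BIAS if name == "salient" else 1.0)

def experienced_eu(name: str) -> float:
    p, u = options[name]
    return p * u

print(max(options, key=forecast_eu))     # -> salient: what the agent wants
print(max(options, key=experienced_eu))  # -> mundane: what the agent would like
```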




References

  • Berns, G. S., Chappelow, J., Cekic, M., Zink, C. F., Pagnoni, G., & Martin-Skurski, M. E. (2006). Neurobiological Substrates of Dread. Science, 312(5774), 754-758.
  • Boyd, N. F., Sutherland, H. J., Heasman, K. Z., Tritchler, D. L., & Cummings, B. J. (1990). Whose Utilities for Decision Analysis? Med Decis Making, 10(1), 58-67.
  • Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery Winners and Accident Victims: Is Happiness Relative? J Pers Soc Psychol, 36(8), 917-927.
  • Gilbert, D. T. (2006). Stumbling on Happiness (1st ed.). New York: A.A. Knopf.
  • Gilbert, D. T., & Ebert, J. E. J. (2002). Decisions and Revisions: The Affective Forecasting of Changeable Outcomes. Journal of Personality and Social Psychology, 82(4), 503–514.
  • Gilbert, D. T., Lieberman, M. D., Morewedge, C. K., & Wilson, T. D. (2004a). The Peculiar Longevity of Things Not So Bad. Psychological Science, 15(1), 14-19.
  • Gilbert, D. T., Morewedge, C. K., Risen, J. L., & Wilson, T. D. (2004b). Looking Forward to Looking Backward. The Misprediction of Regret. Psychological Science, 15(5), 346-350.
  • Gilbert, D. T., Pinel, E. C., Wilson, T. D., Blumberg, S. J., & Wheatley, T. P. (1998). Immune Neglect: A Source of Durability Bias in Affective Forecasting. J Pers Soc Psychol, 75(3), 617-638.
  • Gilbert, D. T., & Wilson, T. D. (2000). Miswanting: Some Problems in the Forecasting of Future Affective States. Feeling and thinking: The role of affect in social cognition, 178–197.
  • Kermer, D. A., Driver-Linn, E., Wilson, T. D., & Gilbert, D. T. (2006). Loss Aversion Is an Affective Forecasting Error. Psychological Science, 17(8), 649-653.
  • Loomes, G., & Sugden, R. (1982). Regret Theory: An Alternative Theory of Rational Choice under Uncertainty. The Economic Journal, 92(368), 805-824.
  • Roese, N. J., & Summerville, A. (2005). What We Regret Most... And Why. Personality and Social Psychology Bulletin, 31(9), 1273.
  • Seidl, C. (2002). Preference Reversal. Journal of Economic Surveys, 16(5), 621-655.
  • Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2002). Rational Actors or Rational Fools: Implications of the Affect Heuristic for Behavioral Economics. Journal of Socio-Economics, 31(4), 329-342.
  • Srivastava, A., Locke, E. A., & Bartol, K. M. (2001). Money and Subjective Well-Being: It's Not the Money, It's the Motives. J Pers Soc Psychol, 80(6), 959-971.
  • Tversky, A., & Thaler, R. H. (1990). Anomalies: Preference Reversals. Journal of Economic Perspectives, 4(2), 201-211.
  • Wilson, T. D., & Gilbert, D. T. (2003). Affective Forecasting. Advances in experimental social psychology, 35, 345-411.



9/16/07

Natural Irrationality. How judgement and decision-making can go wrong

Lifehack has an interesting post about "7 Stupid Thinking Errors You Probably Make". Readers of this blog might already be familiar with these (confirmation bias, recency effects, etc.), so here is the full list from Wikipedia:

  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias.
  • Base rate fallacy — the tendency to ignore generic base-rate information (how common an event is in general) in favor of specific information about the case at hand.
  • Bias blind spot — the tendency not to compensate for one's own cognitive biases.
  • Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
  • Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".
  • Extreme aversion — the tendency to avoid extremes, being more likely to choose an option if it is the intermediate choice.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing — using too narrow an approach or description of the situation or issue.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are (see the numeric sketch after this list).
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it" (see also sunk cost effects and Endowment effect).
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Need for closure — the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. The personal context (time or social pressure) might increase this bias.
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Omission bias — The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times.
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance - the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).
  • Unit bias — the tendency to want to finish a given unit of a task or an item, with strong effects on the consumption of food in particular.
  • Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
  • Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
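
Of the items above, hyperbolic discounting is the easiest to see with numbers. Here is a short sketch using the standard one-parameter formula V = A / (1 + kD); the discount rate k and the reward pair are illustrative:

```python
def hyperbolic(amount: float, delay_days: float, k: float = 0.1) -> float:
    """Present value under hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay_days)

# $50 sooner vs. $100 thirty days later, evaluated near and far from today:
for shift in (0, 300):
    v_soon = hyperbolic(50, shift)
    v_late = hyperbolic(100, shift + 30)
    pick = "smaller-sooner" if v_soon > v_late else "larger-later"
    print(f"delays {shift}/{shift + 30} days -> {pick}")
# Near the present, the $50 wins; push both rewards 300 days out and the
# $100 wins, even though the 30-day gap is unchanged -- a preference reversal.
```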