Natural Rationality | decision-making in the economy of nature


The psychopath, the prisoner's dilemma and the invisible hand of morality

In the prisoner’s dilemma, the police hold, in separate cells, two individuals accused of robbing a bank. The suspects (let’s call them Bob and Alice) are unable to communicate with each other. The police offer them the following options: confess or remain silent. If one confesses – implicating his or her partner – and the other remains silent, the former goes free while the latter gets a 10-year sentence. If they both confess, they each serve a 5-year sentence. If they both remain silent, the sentence is reduced to 2 years each. The situation can be represented by the following payoff matrix (years in prison: Bob, Alice):

                   Alice: silent    Alice: confess
  Bob: silent      2, 2             10, 0
  Bob: confess     0, 10            5, 5

Assuming that Bob and Alice have common knowledge – everybody knows that everybody knows that everybody knows, etc., ad infinitum – of each other’s rationality and of the rules of the game, they should confess. Each will expect the other to make the best move, which is confessing: confessing yields either freedom or a 5-year sentence, while remaining silent yields either a 2-year or a 10-year sentence. Whatever the other player does, the best reply is to confess, so we expect both Bob and Alice to confess. Even though they would both be better off remaining silent, that choice is risky: each risks a 10-year sentence if the other confesses. In other words, rational players should not choose the cooperative move.
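The dominance argument above can be checked mechanically. Here is a minimal sketch, using the sentences from the story (the dictionary layout and the helper name `best_reply` are mine, for illustration):

```python
# Payoff matrix: years in prison for (me, partner).
# Moves: "C" = remain silent (cooperate), "D" = confess (defect).
SENTENCES = {
    ("C", "C"): (2, 2),    # both remain silent
    ("C", "D"): (10, 0),   # I stay silent, partner confesses
    ("D", "C"): (0, 10),   # I confess, partner stays silent
    ("D", "D"): (5, 5),    # both confess
}

def best_reply(partner_move):
    """Return the move that minimizes my sentence against a fixed partner move."""
    return min(["C", "D"], key=lambda my_move: SENTENCES[(my_move, partner_move)][0])

# Confessing is the best reply to either move: defection strictly dominates.
print(best_reply("C"))  # -> D (0 years beats 2)
print(best_reply("D"))  # -> D (5 years beats 10)
```

Whatever the partner does, `best_reply` returns "D" – which is exactly why mutual confession is the game’s equilibrium despite being worse for both players than mutual silence.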

Experimental game theory, however, indicates that subjects cooperate massively in the prisoner’s dilemma. More recently, neuroeconomics has shown that players enjoy cooperating – what economists refer to as the “warm glow of giving”. In the prisoner’s dilemma, players who initiate and players who experience mutual cooperation display activation in the nucleus accumbens and other reward-related areas (Rilling et al. 2002).

In a new paper, Rilling and his collaborators (2007) investigate how psychopathy influences cooperation in the prisoner's dilemma. Their subjects were not psychopaths per se: instead, they took normal individuals and, with a questionnaire, rated them on a "psychopathy scale". While in a scanner, the subjects were then asked to play a prisoner's dilemma with non-scanned partners. They were in fact playing against a computer following a "forgiving tit-for-tat" strategy, analogous to tit-for-tat except that it reciprocates a previous defection only 67% of the time.
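A forgiving tit-for-tat of this kind fits in a few lines. The 67% retaliation rate comes from the description above; the function name and interface are my own sketch, not the paper’s actual code:

```python
import random

def forgiving_tit_for_tat(partner_last_move, retaliation_prob=0.67):
    """Cooperate by default; after a partner defection, retaliate only 67% of the time.

    partner_last_move: "C", "D", or None on the first round.
    """
    if partner_last_move == "D" and random.random() < retaliation_prob:
        return "D"  # retaliate against the previous defection
    return "C"      # first round, partner cooperated, or defection forgiven

print(forgiving_tit_for_tat("C"))  # -> C (always cooperates after cooperation)
```

Against a cooperating partner it always cooperates; after a defection it defects with probability 0.67 and forgives with probability 0.33, which keeps occasional defections from locking both players into endless mutual retaliation.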

Behavioral results indicate that psychopathy is correlated with defection, even after mutual cooperation. One explanation could be that psychopaths have an impaired amygdala, and hence are less sensitive to aversive conditioning. This is consistent with fMRI data suggesting that the Cooperate–Defect outcome (I cooperate, you defect) elicits a weaker aversive reaction in individuals who score higher in psychopathy. Moreover, choosing to defect elicited more activity in the ACC and DLPFC (areas classically involved in emotional modulation and cognitive control), suggesting that defecting is effortful. Psychopathy, however, is correlated with less activity in these areas: it thus seems easier for psychopathic personalities to be non-cooperative, while "regular" people need more cognitive effort to override their cooperative biases.

The fMRI data also suggest that low-psychopathy and high-psychopathy subjects differ in how their brains implement cooperative behavior: the former rely on emotional biases (strong activation in the OFC, weak activation in the DLPFC), while the latter rely on cognitive control (weak activation in the OFC, strong activation in the DLPFC). High-psychopathy subjects would be, according to Rilling et al., weakly emotionally biased toward defection: they exhibit stronger OFC activation and weaker DLPFC activation for defection. Thus, it seems that normal subjects experience the immediate gratification of cooperation, independently of the monetary payoff. Psychopaths do not feel the "warm glow" of cooperation, and thus do not cooperate.

Philosophically, there is an interesting lesson here: both low-psychopathy and high-psychopathy subjects follow their own selfish biases – low-psychopathy subjects enjoy cooperating, high-psychopathy subjects prefer defecting. This is consistent with a thesis I will describe more thoroughly another day, the "Invisible Hand of Morality": like markets, morality emerges out of the interaction of selfish agents. Luckily, thanks to evolution, culture, education, norms, etc., normal people's selfishness tends to be geared toward cooperation. Psychopaths are not more selfish than normal people: their selfishness simply does not value cooperation or other social virtues. Thus morality is not (only) "in the head": it is partly distributed in sensorimotor/somatovisceral mechanisms, cultural habits, external cues, institutions, etc. The other lesson is that morality is multiply realizable: it can be realized through emotional biases or through cognitive control.

  • Rilling, J. K., Glenn, A. L., Jairam, M. R., Pagnoni, G., Goldsmith, D. R., Elfenbein, H. A., et al. (2007). Neural correlates of social cooperation and non-cooperation as a function of psychopathy. Biological Psychiatry, 61(11), 1260-1271.
  • Rilling, J., Gutman, D., Zeh, T., Pagnoni, G., Berns, G., & Kilts, C. (2002). A neural basis for social cooperation. Neuron, 35(2), 395-405.