(An overview of different conceptions of decision-making in philosophy, economics and psychology.)
Rational agents display their rationality mainly in making decisions. Some decisions are mundane (turn left or turn right), others are crucial issues (“to be or not to be”). In any case, being an agent entails making choices. Even abstaining is a decision, as thinkers like William James or Jean-Paul Sartre once pointed out. In our ordinary use of the word, our folk psychology inclines us to believe that making a decision implies a deliberation: a weighing of beliefs, desires and intentions (Malle et al., 2001). In the philosophy of mind, the standard conception of decision-making equates deciding with forming an intention before an action (Davidson, 1980, 2004; Hall, 1978; Searle, 2001). According to different analyses, this intention can be equivalent to, inferred from, or accompanied by desires and beliefs. Thus, the decisions rational agents make are motivated by reasons. Rational actions are explained by these reasons, the purported causes of the actions. Beliefs and desires are also constitutive of rationality because they justify rational action: there is a logical coherence between beliefs, desires and actions. Actions are irrational when their causes do not justify them. Beliefs and desires are embedded in our interpretations of rational agents as rational agents: “[a]nyone who superimposes the longitudes of desire and the latitudes of belief is already attributing rationality” (Sorensen, 2004, p. 291). Hence, on this account, X is a rational agent if X can be interpreted as an agent whose actions are justified by the beliefs and desires that caused her to make a particular choice. The attribution of rational agency is then based on the success of applying an interpretation scheme that presupposes the rationality of the agent, such as the Dennettian "intentional stance", the Davidsonian "principle of charity" or the Popperian "principle of rationality" (Davidson, 1980; Dennett, 1987; Popper, 1994).
The abstract structure of this interpretation scheme has been formalized by theoretical economics and rational-choice theory. Economics, according to a standard definition by Lionel Robbins, is the “science which studies human behavior as a relationship between ends and scarce means which have alternative uses” (Robbins, 1932, p. 15). This definition shows the centrality of decision-making in economic science: since means are scarce, behavior should use them efficiently. The two branches of rational-choice theory, decision theory and game theory, specify the formal constraints on optimal decision-making in individual and interactive contexts. An individual agent facing a choice between two actions can make a rational decision if she takes into account two parameters: the probability and the utility of the consequences of each action. By multiplying the subjective probability by the subjective utility of each action’s outcomes, she can select the action that has the highest subjective expected utility (see Baron, 2000, for an introduction). Game theory models agents making decisions in a strategic context, where the preferences of at least one other agent must be taken into account. Decision-making is represented as the selection of a strategy in a game, that is, a set of rules that dictates the range of possible actions and the payoffs of any combination of actions. Thus, economic decision-making is mainly about computing probabilities and utilities (Weirich, 2004). The philosopher’s belief-desire model is hence reflected in the economist’s probability-utility model: probabilities represent beliefs while utilities represent desires.
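The probability-utility calculus described above can be sketched in a few lines of code. The scenario and the numbers below (an agent deciding whether to carry an umbrella) are purely illustrative, not drawn from any of the cited works; only the computation itself, expected utility as the probability-weighted sum of utilities, follows the theory.

```python
# Each action maps to a list of possible outcomes, where an outcome is a pair
# (subjective probability, subjective utility). Probabilities per action sum to 1.
# The scenario and numbers are hypothetical illustrations.
actions = {
    "take umbrella": [(0.3, 5), (0.7, 8)],     # (p(rain), utility), (p(no rain), utility)
    "leave umbrella": [(0.3, -10), (0.7, 10)],
}

def expected_utility(outcomes):
    # Multiply each outcome's probability by its utility, then sum.
    return sum(p * u for p, u in outcomes)

# Rational choice: select the action with the highest subjective expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print({a: expected_utility(o) for a, o in actions.items()})
print("choose:", best)
```

Here taking the umbrella has expected utility 7.1 against 4.0 for leaving it, so the expected-utility maximizer takes it even though the best single outcome (utility 10) lies on the other branch.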
Rational-choice theory can be construed as a normative theory (what agents should do) or as a descriptive one (what agents do). On its descriptive construal, rational-choice theory is a framework for building predictive models of choice behavior: which lottery an agent would select, whether an agent would cooperate or not in a prisoner’s dilemma, etc. Experimental economics, behavioral economics, cognitive science and psychology (I will refer to these empirical approaches to rationality as ‘psychology’) use this model to study how subjects make decisions and which mechanisms they rely on for choosing. These patterns of inference and behavior can then be compared with rational-choice theory. In numerous studies, Amos Tversky and Daniel Kahneman showed that decision-makers’ judgments deviate markedly from normative theories (Kahneman, 2003; Kahneman et al., 1982; Tversky, 1975). Subjects tend to make decisions according to their ‘framing’ of a situation (the way they represent the situation, e.g. as a gain or as a loss), and exhibit loss-, risk- and ambiguity-aversion (Camerer, 2000; Kahneman & Tversky, 1979, 1991, 2000; Thaler, 1980). In most of their experiments, Tversky and Kahneman asked subjects to choose among different options in fictitious situations in order to assess the similarity between natural ways of thinking and normative decision theory. For instance, subjects were presented with the following situation (Tversky & Kahneman, 1981):
Imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:
- If Program A is adopted, 200 people will be saved.
- If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.
Which of the two programs would you favor?
Most of the respondents opted for A, the risk-averse solution. When respondents were offered the following version:
- If Program A is adopted, 400 people will die.
- If Program B is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die.
Although Program A has exactly the same outcome in both versions (200 people saved, 400 die), in the second version Program B is the more popular choice. Thus, not only are subjects risk-averse, but their risk-aversion depends on the framing of the situation. Subjects have a different attitude depending on whether a situation is presented as a gain or as a loss. The study of decision-making is thus the study of the heuristics and biases that impinge upon human judgment. The explanatory target is the discrepancy between rational-choice theory and human psychology. Just as the psychology of perception tries to explain visual illusions (e.g. the Müller-Lyer illusion), the psychology of decision tries to explain cognitive illusions: why agents systematically prefer one kind of prospect when rational-choice theory recommends another. Loss-aversion, for instance, can be explained by the shape of the value function: it is concave for gains, convex for losses, and steeper for losses than for gains. Thus losing $100 hurts more than winning $100 makes one happy.
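The asymmetry between gains and losses can be made concrete with a small sketch of a prospect-theory value function. The functional form and parameters below (curvature 0.88, loss-aversion coefficient 2.25) are the estimates Tversky and Kahneman reported in their 1992 cumulative prospect theory paper, used here only to illustrate the shape; the exact figures are not essential to the argument.

```python
# Sketch of a prospect-theory value function.
# ALPHA < 1 makes the curve concave for gains and convex for losses;
# LAM > 1 makes the loss branch steeper (loss aversion).
# Parameter values are the standard 1992 estimates, used illustratively.
ALPHA, LAM = 0.88, 2.25

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAM * ((-x) ** ALPHA)

print("value of +$100:", value(100))
print("value of -$100:", value(-100))  # larger in magnitude: losses loom larger
```

With these parameters, the displeasure of losing $100 is 2.25 times the pleasure of winning $100, which is one way of cashing out the claim that losing $100 hurts more than winning $100 pleases.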
Proponents of the ecological rationality approach suggested nonetheless that these heuristics and biases might be adaptive in certain contexts and that failures of human rationality can be lessened in proper ecological conditions. For instance, when probabilities are presented as frequencies (6 out of 10) instead of subjective probabilities (60%), results tend to be much better, partly because we encounter more sequences of events than degrees of belief. These heuristics might be ‘fast and frugal’ procedures tailored for certain tasks, thus leading to suboptimal outcomes in other contexts (Gigerenzer, 1991; Gigerenzer et al., 1999). Or they could be vestigial adaptations to the ecological and social environments where our hunter-gatherer ancestors lived. Thus heuristics may not be completely ineffective.
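The frequency point can be illustrated with a Bayesian updating problem computed two ways. The screening numbers below (1% base rate, 80% hit rate, 9.6% false-alarm rate) are the classic textbook figures often used in this literature, chosen here only for illustration; the two computations are mathematically equivalent, which is precisely why the frequency format's psychological advantage is an ecological rather than a logical matter.

```python
# A diagnostic problem solved in probability format and in frequency format.
# Illustrative numbers: 1% base rate, 80% hit rate, 9.6% false-alarm rate.
base, hit, fa = 0.01, 0.8, 0.096

# Probability format: Bayes' rule applied directly.
posterior = (base * hit) / (base * hit + (1 - base) * fa)

# Frequency format: the same computation phrased as counts in 1000 people.
n = 1000
sick_positives = base * n * hit             # 8 true positives
healthy_positives = (1 - base) * n * fa     # about 95 false positives
posterior_freq = sick_positives / (sick_positives + healthy_positives)

print(round(posterior, 3))       # ~0.078
print(round(posterior_freq, 3))  # same result, differently packaged
```

Both routes yield a posterior just under 8%, yet subjects given the counts ("8 of the 103 positives are sick") tend to perform far better than subjects given the single-event probabilities.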
Baron, J. (2000). Thinking and deciding (3rd ed.). Cambridge; New York: Cambridge University Press.
Camerer, C. (2000). Prospect theory in the wild. In D. Kahneman & A. Tversky (Eds.), Choices, values, and frames (pp. 288-300). New York: Cambridge University Press.
Davidson, D. (1980). Essays on actions and events. Oxford: Oxford University Press.
Davidson, D. (2004). Problems of rationality. Oxford: Oxford University Press.
Dennett, D. C. (1987). The intentional stance. Cambridge, Mass.: MIT Press.
Gigerenzer, G. (1991). How to make cognitive illusions disappear: Beyond heuristics and biases. European Review of Social Psychology, 2, 83-115.
Gigerenzer, G., Todd, P. M., & ABC Research Group. (1999). Simple heuristics that make us smart. New York: Oxford University Press.
Hall, J. W. (1978). Deciding as a way of intending. The Journal of Philosophy, 75(10), 553-564.
Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58(9), 697-720.
Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge; New York: Cambridge University Press.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-291.
Kahneman, D., & Tversky, A. (1991). Loss aversion in riskless choice: A reference-dependent model. The Quarterly Journal of Economics, 106(4), 1039-1061.
Kahneman, D., & Tversky, A. (Eds.). (2000). Choices, values, and frames. Cambridge: Cambridge University Press.
Malle, B. F., Moses, L. J., & Baldwin, D. A. (Eds.). (2001). Intentions and intentionality: Foundations of social cognition. Cambridge, Mass.: MIT Press.
Popper, K. R. (1994). Models, instruments, and truth: The status of the rationality principle in the social sciences. In The myth of the framework. In defence of science and rationality (pp. 154-184). London: Routledge.
Robbins, L. (1932). An essay on the nature and significance of economic science. London: Macmillan.
Searle, J. (2001). Rationality in action. Cambridge, Mass.: MIT Press.
Sorensen, R. (2004). Charity implies meta-charity. Philosophy and Phenomenological Research, 26, 290-315.
Thaler, R. H. (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior & Organization, 1(1), 39-60.
Tversky, A. (1975). A critique of expected utility theory: Descriptive and normative considerations. Erkenntnis, 9(2), 163-173.
Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453-458.
Weirich, P. (2004). Economic rationality. In A. R. Mele & P. Rawling (Eds.), The Oxford handbook of rationality (pp. 380-398). Oxford: Oxford University Press.