Yesterday we had the pleasure of welcoming Alan Sanfey from the Decision Neuroscience Lab at the Donders Centre for Cognitive Neuroscience for a GEMH Talk celebrating this month's theme: prosociality and kindness. While Alan is a psychological scientist by training, he has adopted methods from neuroscience (fMRI) and economics (computational modeling) to build a better and more precise understanding of people's behaviours, and in particular their decision-making. In his talk, Alan shed light on the different reasons people have for being prosocial, and on how studies of the brain can disentangle processes that result in the same behaviour but differ in motivation.
The talk started off by remarking that the three fields most prominently represented in the work of a decision neuroscientist - namely psychology, economics, and neuroscience - are very different in nature. Not only do they investigate different concepts, they also have different goals. Whereas psychology and neuroscience are descriptive ('what people are actually doing'), economics is prescriptive, focusing on 'optimal' behaviour ('what people should be doing'). Decision neuroscience combines these three fields in an effort to come up with models of human decision-making that are biologically and psychologically plausible as well as theoretically structured.
Economics tends to focus on gambling decisions revolving solely around money and its value to people, and these types of experiments are the ones most often featured in lab studies. Social decision-making, however, is a field on the rise, and is actively researched in Alan's lab as well. But what constitutes a 'social' decision? Alan proposes that a social decision is one in which we take other people into account, specifically their beliefs and values. We generally care about those, and under most circumstances try to take them into account when making these social decisions. As a result, this social factor influences our decision-making and future interactions.
An especially interesting branch of social decision-making is the one that addresses prosocial decision-making. After all, both everyday life and lab studies find that people tend to be prosocial even when they don't have to be (which might very well be the very definition of 'prosocial'). What motivates people to be prosocial to others? Generally, there are three lines of thought when it comes to this conundrum: the warm glow theory, the guilt aversion theory, and the inequity aversion theory.
The warm glow theory proposes that being kind to others simply makes us feel good about ourselves, giving us a 'warm glow', and that this boost in positive energy is all the reason people need to be nice to each other. The guilt aversion and inequity aversion theories strike a less merry tone: we are not prosocial because it makes us feel nice, but because we feel some sort of obligation to be, either because we would feel guilty for failing to meet another person's expectations, or because we feel there is a certain fairness standard to be upheld (and we'll feel bad about ourselves if we don't uphold it). Alan points out that thanks to neuroscience, there is now a growing consensus that one of the latter theories might have it right: brain imaging studies have shown that when people display prosocial behaviour (in this case, giving money to another person when there was no formal obligation to do so), areas of the brain light up that are commonly associated with negative emotional processing, suggesting that being prosocial may not be such a pleasant experience after all. Strikingly, keeping the money was associated with increased activity in areas commonly related to the positive experience of receiving a reward.
So, the warm glow theory is out the window, it seems. Still, there are two competing theories left, and while they are expressed in the same way (i.e. by being nice to others), they are different in nature. Alan explains that another gift from economics, the so-called Trust Game (see for instance this video for a visual illustration), helps us take a closer look at these motivations and find out which of the two is actually at play. In a Trust Game, one person (the trustor) is given a standard starting amount, say 10 euros, and sends some of it to the player (the trustee). The amount sent is multiplied by a given factor (4, by default): if the trustor sends 4 euros, this is quadrupled, so the player ends up with 16 euros while the trustor is left with 6 euros. Both the player and the trustor know the multiplier, so the trustor knows how much the player will end up with, and the player knows that the trustor is aware of the money he has. Now it is the player's turn: he may return any amount of his money to the trustor who invested in him (hence this game is sometimes also called the Investment Game), but he is by no means obligated to. By looking at how much money the player returns to the trustor (if any), researchers can see whether people display reciprocal kindness, or not.
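The payoff arithmetic of the game can be sketched in a few lines of code. This is only an illustration of the numbers described above; the function name and parameter defaults are my own, chosen to match the talk's example (a 10-euro endowment and a multiplier of 4).

```python
def trust_game_payoffs(sent, returned, endowment=10, multiplier=4):
    """Final payoffs (trustor, trustee) in a single round of the Trust Game.

    The trustor sends `sent` out of `endowment`; the amount is multiplied
    by `multiplier` before reaching the trustee (the player), who then
    voluntarily returns `returned` of it.
    """
    pot = sent * multiplier                      # what the player receives
    trustor_final = endowment - sent + returned  # what the trustor keeps plus any return
    trustee_final = pot - returned               # what the player keeps
    return trustor_final, trustee_final

# The example from the talk: 4 euros sent is quadrupled to 16,
# leaving the trustor with 6 before anything is returned.
print(trust_game_payoffs(sent=4, returned=0))  # → (6, 16)
```

If the player then returns half the pot (8 euros), both parties end up better off than the trustor's original 10-euro endowment split, which is what makes trusting worthwhile in the first place.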
However, with this version of the game, the two theories of prosociality still yield similar actions (namely, money returned to the trustor by the player). As a solution to this problem, Alan introduced the concept of a 'hidden multiplier', an idea developed in his lab that allows researchers to track whether people return money based on what they think is fair (generally half of what they have), or based on what they know the trustor will expect from them (which, with the standard multiplier, is also half of what they have). How? The hidden multiplier is usually the standard factor of 4, but sometimes 2 and sometimes 6 (a paper explaining this mechanic is currently being submitted to Nature Neuroscience!). With it, Alan's lab has been able to show that there are certain subtypes of people: those who generally follow the inequity aversion theory (i.e. return half of what they have, regardless of whether the trustor expects this amount); those who generally follow the guilt aversion theory (i.e. return exactly the amount they know the trustor expects to get, regardless of whether that is half of what they have); those who tend to be greedy and keep everything for themselves (always a minority in these experiments); and, interestingly, a last subgroup of so-called 'moral opportunists', as Alan lovingly dubbed them: people who try to meet the other person's expectations when they have plenty of money to go around, but only give a little when they have less themselves, thus mixing the inequity aversion and guilt aversion motivations.
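The subtypes above can be made concrete with a small sketch. Everything here is my own reading of the description, not code from Alan's lab: I assume the trustor always believes the default multiplier of 4 (so they expect back half of 4 × sent), while the player sees the true multiplier, and I model the 'moral opportunist' as returning whichever of the two norms lets them keep more.

```python
def amount_returned(subtype, sent, true_multiplier, believed_multiplier=4):
    """Money returned by the player under each hypothesised motivation.

    `true_multiplier` is the hidden factor (2, 4, or 6) the player sees;
    the trustor believes it is `believed_multiplier` and so expects back
    half of sent * believed_multiplier.
    """
    pot = sent * true_multiplier               # what the player actually has
    expected = sent * believed_multiplier / 2  # what the trustor expects back
    if subtype == "inequity_averse":
        return pot / 2       # half of the actual pot, regardless of expectations
    if subtype == "guilt_averse":
        return expected      # meet expectations, regardless of fairness
    if subtype == "greedy":
        return 0             # keep everything
    if subtype == "moral_opportunist":
        # switch between the two norms depending on which leaves them more
        return min(expected, pot / 2)
    raise ValueError(f"unknown subtype: {subtype}")
```

With the default multiplier of 4 the two norms coincide (half the pot equals the expected amount), which is exactly why the hidden multipliers of 2 and 6 are needed to pull them apart.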
Lastly, Alan explained that these differences in motivation are actually visible in the brains of people making these seemingly similar decisions: a new method developed by his lab in collaboration with a U.S. research group has allowed them to find regions of the brain that respond in the same way in people who show the same motivational behaviour. Those who make their decisions based on guilt aversion tend to show significant activity in the insula, the area mentioned earlier as being associated with negative emotional processing. Those who base their decisions on inequity aversion all seem to show significant activity in the medial prefrontal cortex, an area previously related to thorough evaluation and calculation of options. The moral opportunists, finally, show both of these patterns, with their brains behaving guilt-averse when their decisions aim to meet others' expectations, and behaving inequity-averse when their decisions are based on fairness norms.
To sum up, Alan Sanfey has shown us that prosociality can be motivated in several ways depending on the type of person you are, and that these motivations can be associated with actual differences in brain activity. But most importantly, his work has once again shown us that people do care about social motivations, and that social decision-making methods have great potential for furthering our knowledge about ourselves.
Bottom line: whatever your motivation is - be kind!